WorldWideScience

Sample records for saphire tool set

  1. SAPHIRE 8 Software Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis L. Smith; Ted S. Wood

    2010-03-01

    This project is being conducted at the request of the DOE and the NRC. The NRC has asked the INL to improve and maintain the Systems Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) tool set concurrent with the changing needs of the user community and with new technologies. Successful completion will be marked by the timely, NRC-approved release of all software and accompanying documentation. This project will enhance the SAPHIRE tool set for the user community (the NRC, nuclear power plant operations, and Probabilistic Risk Analysis (PRA) model developers) by providing improved Common Cause Failure (CCF), External Events, Level 2, and Significance Determination Process (SDP) analysis capabilities. The SAPHIRE development team at the Idaho National Laboratory is responsible for successful completion of this project, under the supervision of Curtis L. Smith, PhD, Technical Lead for the SAPHIRE application. All current capabilities of SAPHIRE version 7 will be maintained in SAPHIRE 8. The following additional capabilities will be incorporated:
    • Incorporation of SPAR models for the SDP interface.
    • Improved quality assurance activities for PRA calculations in SAPHIRE Version 8.
    • Continued code maintenance, documentation, and user support.

  2. Relative humidity distribution from SAPHIR experiment on board Megha-Tropiques satellite mission: Comparison with global radiosonde and other satellite and reanalysis data sets

    Science.gov (United States)

    Venkat Ratnam, M.; Basha, Ghouse; Krishna Murthy, B. V.; Jayaraman, A.

    2013-09-01

    To better understand the life cycle of convective systems and their interactions with the environment, the joint Indo-French satellite mission Megha-Tropiques was launched in October 2011 into a low-inclination (20°) orbit. In the present study, we show the first results of a comparison of relative humidity (RH) profiles, covering the surface to 100 hPa, obtained from one of the payloads, SAPHIR (Sounder for Atmospheric Profiling of Humidity in the Inter-tropical Regions), a six-channel microwave sounder. The RH observations from SAPHIR illustrate the numerous scales of variability in the atmosphere, both vertical and horizontal. As part of its validation, we compare SAPHIR RH with simultaneous observations from a network of radiosondes distributed across the world (±30° latitude), other satellites (Atmospheric Infrared Sounder, Infrared Atmospheric Sounding Interferometer, Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC)), and various reanalysis products (National Centers for Environmental Prediction (NCEP), European Centre for Medium-Range Weather Forecasts reanalysis (ERA)-Interim, Modern-Era Retrospective Analysis for Research and Applications (MERRA)). Because of its low-inclination orbit, SAPHIR provides better coverage of the tropical region, where important weather processes take place, than any other existing satellite. A very good correlation is observed with RH from the global radiosonde network, particularly in the 850-250 hPa range, providing a valuable data set for investigating convective processes. Among the satellite data sets, SAPHIR RH compares well with COSMIC RH. Among the reanalysis products, NCEP shows the smallest differences from SAPHIR, followed by ERA-Interim, while MERRA shows large differences in the middle and upper troposphere.

  3. SAPHIRE models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1995-04-01

    Over the past year, the Idaho National Engineering Laboratory (INEL) has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new SAPHIRE module tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cut sets; (4) full flexibility in modifying logic, regenerating cut sets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events.

  4. Saphire models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1997-02-01

    Over the past three years, the Idaho National Engineering Laboratory (INEL) has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new SAPHIRE module tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cut sets; (4) full flexibility in modifying logic, regenerating cut sets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  5. SAPHIR, how it ended

    Energy Technology Data Exchange (ETDEWEB)

    Brogli, R.; Hammer, J.; Wiezel, L.; Christen, R.; Heyck, H.; Lehmann, E. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]

    1995-10-01

    On May 16th, 1994, PSI decided to discontinue its efforts to retrofit the SAPHIR reactor for operation at 10 MW. This decision was made because the effort and time for the retrofit work in progress had proven to be more complex than was anticipated. In view of the start-up of the new spallation-neutron source SINQ in 1996, the useful operating time between the eventual restart of SAPHIR and the start-up of SINQ became less than two years, which was regarded by PSI as too short a period to warrant the large retrofit effort. Following the decision of PSI not to re-use SAPHIR as a neutron source, several options for the further utilization of the facility were open. However, none of them appeared promising in comparison with other possibilities; it was therefore decided that SAPHIR should be decommissioned. A concerted effort was initiated to consolidate the nuclear and conventional safety for the post-operational period. (author) 3 figs., 3 tab.

  6. SAPHIRE 8 Software Configuration Management Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-01-01

    The INL software developers use version control both for formally released SAPHIRE versions and for source code. For each formal release, the developers perform an acceptance test: the software must pass a suite of automated tests prior to official release. Each official release of SAPHIRE is assigned a unique version identifier and is bundled into a standard installation package for easy and consistent set-up by individual users. Included in the release is a list of bug fixes and new features for the current release, as well as a history of those items for past releases. In addition to the assignment of a unique version identifier for an official software release, each source code file is kept in a controlled library. Source code is the collection of all the computer instructions written by developers to create the finished product. The library is kept on a server, where back-ups are made regularly. This document describes the configuration management approach used as part of SAPHIRE development.

  7. SAPHIRE 8 Volume 2 - Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood; W. J. Galyean; J. A. Schroeder; M. B. Sattison

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs developed to create and analyze probabilistic risk assessments (PRAs). Herein, information is provided on the principles used in the construction and operation of Version 8.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply under various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, Workspace algorithms, cut set "recovery," end state manipulation, and use of "compound events."
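
    The path from minimal cut sets to a top-event probability can be illustrated with a small sketch. This is not SAPHIRE's implementation; the basic events, probabilities, and two-cut-set example below are hypothetical, and only two standard approximations (the rare-event sum and the min-cut upper bound) are shown.

```python
def cutset_prob(cut, p):
    """Probability of one minimal cut set: product of its basic-event probabilities."""
    prob = 1.0
    for event in cut:
        prob *= p[event]
    return prob

def rare_event_sum(cutsets, p):
    """Rare-event approximation: top-event probability ~ sum of cut-set probabilities."""
    return sum(cutset_prob(c, p) for c in cutsets)

def min_cut_upper_bound(cutsets, p):
    """Min-cut upper bound: 1 - prod_i (1 - P(C_i)); never exceeds the rare-event sum."""
    q = 1.0
    for c in cutsets:
        q *= 1.0 - cutset_prob(c, p)
    return 1.0 - q

# Hypothetical basic events and minimal cut sets, for illustration only.
p = {"A": 1e-3, "B": 2e-3, "C": 5e-4}
cutsets = [{"A", "B"}, {"C"}]
```

    For small basic-event probabilities like these, the two approximations agree closely; they diverge as cut-set probabilities grow, which is why the upper bound is generally preferred.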

  8. New developments in the Saphire computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1996-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. Many recent enhancements have been made to this suite of codes. This presentation provides an overview of these features and capabilities, including a discussion of the new GEM module, which greatly reduces and simplifies the work necessary to use the SAPHIRE code in event assessment applications. An overview of the features provided in the new Windows version is also given. This version is a full 32-bit Windows implementation and offers many new features. [A separate computer demonstration was held to allow interested participants to preview these features.] The new capabilities added since version 5.0 are covered. Major new features include the ability to store an unlimited number of basic events, gates, systems, sequences, etc.; improved reporting capabilities that allow the user to generate and "scroll" through custom reports; multi-variable importance measures; and a simplified user interface. Although originally designed as a Level 1 PRA suite of codes, capabilities have recently been added to SAPHIRE that allow the user to apply the code in Level 2 analyses. These features are discussed in detail in the presentation. The modifications and capabilities added in this version significantly extend the code in many important areas, and together these extensions represent a major step forward in PC-based risk analysis tools. This presentation provides an up-to-date status of these important PRA analysis tools.

  9. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Beck; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons guides the user through the basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering basic and advanced features. The basic section contains lessons that lead the reader through the development of a hypothetical probabilistic problem involving a vehicle accident, highlighting the program's most fundamental features. The advanced section contains additional lessons that expand on the fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview of the operation and capabilities of the SAPHIRE software.

  10. Rain detection and measurement from Megha-Tropiques microwave sounder—SAPHIR

    Science.gov (United States)

    Varma, Atul Kumar; Piyush, D. N.; Gohil, B. S.; Pal, P. K.; Srinivasan, J.

    2016-08-01

    The Megha-Tropiques, an Indo-French satellite, carries on board a microwave sounder, Sondeur Atmosphérique du Profil d'Humidité Intertropical par Radiométrie (SAPHIR), and a microwave radiometer, Microwave Analysis and Detection of Rain and Atmospheric Structures (MADRAS), along with two other instruments. As a Global Precipitation Measurement constellation satellite, MT-MADRAS was an important sensor for studying convective clouds and rainfall. Because MADRAS is not functioning, the possibility of detecting and estimating rain from SAPHIR is explored. Using near-concurrent observations from SAPHIR and the precipitation radar (PR) onboard the Tropical Rainfall Measuring Mission (TRMM), the effect of rain on the SAPHIR channels is examined. All six SAPHIR channels are used to calculate an average rain probability for each SAPHIR pixel. An exponential rain retrieval algorithm is then developed; it yields a correlation of 0.72, an RMS error of 0.75 mm/h, and a bias of 0.04 mm/h. When the rain identification and retrieval algorithms are applied together, they yield a correlation of 0.69, an RMS error of 0.47 mm/h, and a bias of 0.01 mm/h. When the algorithm is applied to an independent SAPHIR data set and compared with TRMM-3B42 rain on a monthly scale, it yields a correlation of 0.85 and an RMS error of 0.09 mm/h. The distribution of the rain difference between SAPHIR and other rain products is presented on a global scale as well as for the climatic zones. To examine the capability of SAPHIR to measure intense rain, instantaneous rain over cyclone Phailin from SAPHIR is compared with other standard satellite-based rain products such as 3B42, Global Satellite Mapping of Precipitation, and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks.
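
    The detect-then-retrieve scheme described above can be sketched as follows. The coefficients, the probability threshold, and the single-channel exponential form are hypothetical placeholders; the paper's actual algorithm combines all six SAPHIR channels and fits its coefficients to SAPHIR/TRMM-PR matchups.

```python
import math

# Hypothetical coefficients, for illustration only; the real values
# would be fit to near-concurrent SAPHIR and TRMM-PR observations.
A, B = 20.0, -0.08

def rain_rate(tb_k, rain_prob, threshold=0.5):
    """Sketch of an exponential rain retrieval, rate = exp(A + B*Tb) in mm/h,
    applied only to pixels that the detection step flags as raining."""
    if rain_prob < threshold:   # detection gate from the rain-probability step
        return 0.0
    return math.exp(A + B * tb_k)
```

    With a negative B, colder (more scattering-depressed) brightness temperatures map to higher rain rates, and pixels below the detection threshold are set to zero rather than retrieved.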

  11. Systems analysis programs for Hands-on integrated reliability evaluations (SAPHIRE) Version 5.0: Verification and validation (V&V) manual. Volume 9

    Energy Technology Data Exchange (ETDEWEB)

    Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)]

    1995-03-01

    A verification and validation (V&V) process has been performed for the Systems Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that the NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D), and the Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE; the previous effort was the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 plan, with revisions to incorporate lessons learned from the previous effort. Likewise, the SAPHIRE 5.0 vital and nonvital test procedures are based on the SAPHIRE 4.0 test procedures, revised to include the new SAPHIRE 5.0 features and to incorporate lessons learned. Most results from the testing were acceptable; however, some discrepancies between expected and actual code operation were identified. Modifications made to SAPHIRE are identified.

  12. Screening and Evaluation Tool (SET) Users Guide

    Energy Technology Data Exchange (ETDEWEB)

    Pincock, Layne [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-10-01

    This document is the user's guide for the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics, using standard multi-attribute utility decision analysis methods.
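
    The multi-attribute utility method that SET applies can be sketched as a weighted-sum score. The metrics, weights, and utility ranges below are hypothetical, chosen only to illustrate the mechanics, not taken from the SET guide.

```python
def linear_utility(worst, best):
    """Linear single-attribute utility, scaled so worst -> 0 and best -> 1."""
    def u(x):
        return (x - worst) / (best - worst)
    return u

def mau_score(option, weights, utilities):
    """Weighted-sum multi-attribute utility score: sum_i w_i * u_i(x_i)."""
    return sum(weights[m] * utilities[m](option[m]) for m in weights)

# Hypothetical metrics and weights, for illustration only.
weights = {"cost": 0.4, "waste": 0.6}
utilities = {
    "cost": linear_utility(worst=10.0, best=2.0),    # lower cost is better
    "waste": linear_utility(worst=100.0, best=10.0), # lower waste is better
}
option_a = {"cost": 4.0, "waste": 30.0}
option_b = {"cost": 8.0, "waste": 80.0}
```

    Each option gets a score in [0, 1]; ranking the scores ranks the options under the chosen weights, and varying the weights is the usual way to test how sensitive the ranking is.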

  13. SAPHIRE 8 New Features and Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software performs probabilistic risk assessment (PRA) calculations. SAPHIRE is used in support of the NRC's risk-informed programs, such as the Accident Sequence Precursor (ASP) program, Management Directive 8.3, "NRC Incident Investigation Program," and the Significance Determination Process (SDP). It is also used to develop and run the Standardized Plant Analysis Risk (SPAR) models. SAPHIRE Version 8 is a new version of the software with an improved interface and capabilities to support risk-informed programs, designed to easily handle larger and more complex models. Applications of previous SAPHIRE versions indicated the need to build and solve models with a large number of sequences. Risk assessments that include end-state evaluations for core damage frequency and large early release frequency have greatly increased the number of sequences required. In addition, model complexity has increased because risk assessments evaluate both internal and external events, as well as different plant operational states. Special features of SAPHIRE 8 help create and run integrated models that may be composed of different model types. SAPHIRE 8 includes features and capabilities that are new or improved over the current Version 7 to address the new requirements of risk-informed programs and SPAR models. These include:
    • Improved user interfaces
    • Model development
    • Methods
    • General support features

  14. SAPHIRE 8 Volume 1 - Overview and Summary

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; S. T. Wood

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system's response to initiating events and to quantify the associated consequential outcome frequencies. Specifically, for nuclear power plant applications, SAPHIRE 8 can identify important contributors to core damage (Level 1 PRA) and to containment failure during a severe accident, which leads to releases (Level 2 PRA). It can be used for a PRA in which the reactor is at full power, low power, or shutdown conditions. Furthermore, it can be used to analyze both internal and external initiating events and has special features for managing models such as flooding and fire. It can also be used in a limited manner to quantify risk in terms of release consequences to the public and the environment (Level 3 PRA). In SAPHIRE 8, the act of creating a model has been separated from the analysis of that model in order to improve the quality of both the model (e.g., by avoiding inadvertent changes) and the analysis. Consequently, in SAPHIRE 8, the analysis of models is performed using what are called Workspaces. Currently, there are Workspaces for three types of analyses: (1) the NRC's Accident Sequence Precursor program, where the workspace is called "Events and Condition Assessment (ECA)"; (2) the NRC's Significance Determination Process (SDP).

  15. SAPHIRE 8 Volume 7 - Data Loading

    Energy Technology Data Exchange (ETDEWEB)

    K. J. Kvarfordt; S. T. Wood; C. L. Smith; S. R. Prescott

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission and developed by the Idaho National Laboratory. This report is intended to assist the user in entering PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Toward this end, a small sample database is constructed and used for demonstration. Where applicable, the discussion relates the processes for loading the sample database to the processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 8. The guidance in this document gives the user sufficient knowledge both to understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  16. SAPHIRE 8 Volume 6 - Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 8 is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows™ operating system. SAPHIRE 8 is funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 8, what constitutes its parts, and limitations of those processes. In addition, this document describes the Independent Verification and Validation that was conducted for Version 8 as part of an overall QA process.

  17. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Summary Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith

    2008-08-01

    This summary manual describes the options available in SAPHIRE and presents general instructions for using the software. Section 1 presents SAPHIRE's historical evolution and summarizes its capabilities. Section 2 presents instructions for installing and using the code. Section 3 explains the database structure used in SAPHIRE and discusses database concepts. Section 4 explains how PRA data (event frequencies, human error probabilities, etc.) can be generated and manipulated using "change sets." Section 5 deals with fault tree operations, including constructing, editing, solving, and displaying results. Section 6 presents operations associated with event trees, including rule application for event tree linking, partitioning, and editing sequences. Section 7 presents how accident sequences are generated, solved, quantified, and analyzed. Section 8 discusses the functions available for performing end state analysis. Section 9 explains how to modify data stored in a SAPHIRE database. Section 10 illustrates how to generate and customize reports. Section 11 covers SAPHIRE utility options to perform routine functions such as defining constant values, recovering databases, and loading data from external sources. Section 12 provides an overview of GEM's features and capabilities. Finally, Section 13 summarizes SAPHIRE's quality assurance process.

  18. Assimilation of SAPHIR radiance: impact on hyperspectral radiances in 4D-VAR

    Science.gov (United States)

    Indira Rani, S.; Doherty, Amy; Atkinson, Nigel; Bell, William; Newman, Stuart; Renshaw, Richard; George, John P.; Rajagopal, E. N.

    2016-04-01

    Assimilation of a new observation data set in an NWP system may affect the quality of the fit of an existing observation data set to the model background (short forecast), which in turn influences the use of the existing observations in the NWP system. The effect of one data set on the use of another can be characterized as positive, negative, or neutral. The impact of a new data set is defined as positive if the number of assimilated observations of an existing observation type increases and the bias and standard deviation decrease compared to a control experiment without the new data set. Recently a new data set, Megha-Tropiques SAPHIR radiances, which provides atmospheric humidity information, was added to the Unified Model 4D-VAR assimilation system. In this paper we discuss the impact of SAPHIR on the assimilation of hyperspectral radiances from AIRS, IASI, and CrIS. Though SAPHIR is a microwave instrument, its impact can clearly be seen in the use of hyperspectral radiances in the 4D-VAR data assimilation system, in addition to other microwave and infrared observations. SAPHIR assimilation decreased the standard deviation for spectral channels with wave numbers from 650-1600 cm-1 in all three hyperspectral sounders. A similar impact on the hyperspectral radiances can be seen from the assimilation of other microwave radiances, such as those from AMSR2 and the SSMIS imager.

  19. STILTS -- Starlink Tables Infrastructure Library Tool Set

    Science.gov (United States)

    Taylor, Mark

    STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.

  20. Intercalibrating and Validating SAPHIR and ATMS Observations

    Science.gov (United States)

    Moradi, I.; Ferraro, R. R.

    2014-12-01

    We present the results of evaluating observations from microwave instruments aboard the Suomi National Polar-orbiting Partnership (S-NPP; ATMS instrument) and Megha-Tropiques (SAPHIR instrument) satellites. ATMS is a cross-track microwave sounder currently flying on the S-NPP satellite, launched in October 2011 into a Sun-synchronous orbit with an ascending equatorial crossing time of 01:30 a.m. Megha-Tropiques, launched in November 2011, is a low-inclination satellite, meaning that it only visits the tropical band between 30°S and 30°N. SAPHIR is a microwave humidity sounder with six channels operating at frequencies close to the water vapor absorption line at 183 GHz. Megha-Tropiques revisits the tropical regions several times a day and provides a great capability for inter-calibrating observations with the polar-orbiting satellites. The study includes inter-comparison and inter-calibration of observations of similar channels from the two instruments, evaluation of the satellite data using high-quality radiosonde data from the Atmospheric Radiation Measurement Program and GPS radio occultation observations from the COSMIC mission, as well as geolocation error correction. The results of this study are valuable for generating climate data records from these instruments and for extending current climate data records from similar instruments, such as AMSU-B and MHS, to the ATMS and SAPHIR instruments.

  1. Systems Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) Technical Reference

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; W. J. Galyean; S. T. Beck

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows™ operating system. Herein, information is provided on the principles used in the construction and operation of Versions 6.0 and 7.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply under various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, cut set "recovery," end state manipulation, and use of "compound events."
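
    The difference between the two uncertainty-analysis options mentioned above, simple Monte Carlo versus Latin hypercube sampling, lies only in how the uniform draws are generated, as this one-variable sketch shows. This is not SAPHIRE's sampler, and the lognormal parameters are hypothetical.

```python
import math
import random
from statistics import NormalDist

def simple_mc(n, inv_cdf):
    """Simple Monte Carlo: n independent uniform draws pushed through an inverse CDF."""
    return [inv_cdf(random.random()) for _ in range(n)]

def latin_hypercube(n, inv_cdf):
    """Latin hypercube sampling: one draw from each of n equal-probability
    strata of (0, 1), in shuffled order, so every stratum is covered."""
    strata = [(i + random.random()) / n for i in range(n)]
    random.shuffle(strata)
    return [inv_cdf(u) for u in strata]

# Example: a lognormal basic-event probability (hypothetical parameters),
# sampled by inverse transform through the normal quantile function.
def lognormal_inv(u, mu=-7.0, sigma=1.4):
    return math.exp(NormalDist(mu, sigma).inv_cdf(u))
```

    For the same sample size, the stratification in `latin_hypercube` spreads the draws evenly over the distribution's quantiles, which typically reduces the variance of estimated means relative to simple Monte Carlo.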

  2. Systems Analysis Programs for Hands-On Integrated Reliability Evaluations (SAPHIRE) Technical Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; W. J. Galyean; S. T. Beck

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows™ operating system. Herein, information is provided on the principles used in the construction and operation of Versions 6.0 and 7.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply under various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, cut set "recovery," end state manipulation, and use of "compound events."

  3. Strangeness Photoproduction with the Saphir Detector

    CERN Document Server

    Menze, D W

    1997-01-01

    Statistically improved data on total cross sections and on angular distributions for differential cross sections and hyperon recoil polarizations of the reactions γp → K⁺Λ and γp → K⁺Σ⁰ have been collected with the SAPHIR detector at photon energies between threshold and 2.0 GeV. Here, total cross section data up to 1.5 GeV are presented. The opposite sign of the Λ and Σ polarizations and the change of sign between the forward and backward directions could be confirmed with higher statistics. A steep threshold behaviour of the K⁺Λ total cross section is observed.

  4. Retrieval and Validation of Upper Tropospheric Humidity from SAPHIR aboard Megha-Tropiques

    Science.gov (United States)

    Mathew, Nizy; Krishna Moorthy, K.; Raju C, Suresh; Pillai Renju, Ramachandran; Oommen John, Viju

    Upper tropospheric humidity (UTH) has been derived from brightness temperatures of the SAPHIR payload aboard the Megha-Tropiques (MT) mission. The channels of SAPHIR are very close to the water vapor absorption peak at 183.31 GHz. The first three channels, at 183.31±0.2 GHz, 183.31±1.1 GHz and 183.31±2.8 GHz, are used for upper tropospheric humidity (UTH) studies. The channel at 183.31±0.2 GHz enables retrieval of humidity up to the highest altitude possible with present nadir-looking microwave humidity sounders. Transformation coefficients for the first three channels for all incidence angles have been derived using simulated brightness temperatures and Jacobians, with the Chevallier data set as input to the radiative transfer model ARTS. These coefficients are used to convert brightness temperatures from the different channels to upper tropospheric humidity. A stringent deep convective cloud screening has been done using the brightness temperatures of SAPHIR itself. The retrieved UTH has been validated against the Jacobian-weighted UTH derived from collocated radiosonde observations and also against humidity profiles derived from ground-based microwave radiometer data. UTH variation over the inter-tropical region on a global basis has been studied for one year, taking advantage of the first humidity product with high spatial and temporal resolution over the tropical belt, unbiased by specific local times of the satellite pass. This data set has been used to address the seasonal and spatial variability of humidity in the tropical upper troposphere and the humidity variability during the Indian monsoon. The details of the MT-SAPHIR characteristics, methodology and results will be presented.
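
    The brightness-temperature-to-UTH transformation mentioned above is typically a Soden-and-Bretherton-type relation, ln(UTH/cos θ) = a + b·Tb. A minimal sketch follows; the coefficient values and the function name are hypothetical stand-ins for the ARTS-derived transformation coefficients described in the abstract:

```python
import math

# Hypothetical (a, b) transformation coefficients per SAPHIR channel; the real
# values are obtained by regressing simulated brightness temperatures and
# Jacobians against a training data set, per channel and incidence angle.
COEFFS = {1: (31.5, -0.12), 2: (30.8, -0.115), 3: (29.9, -0.11)}

def uth_from_tb(channel, tb_kelvin, incidence_angle_deg=0.0):
    """Soden-and-Bretherton-type transformation:
    ln(UTH / cos(theta)) = a + b * Tb  =>  UTH = cos(theta) * exp(a + b * Tb).
    Returns upper tropospheric humidity in % RH."""
    a, b = COEFFS[channel]
    return math.cos(math.radians(incidence_angle_deg)) * math.exp(a + b * tb_kelvin)
```

    The negative slope b encodes the physics: a warmer 183 GHz brightness temperature means emission from lower, drier layers, hence lower UTH.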

  5. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  6. Can SAPHIR Instrument Onboard MEGHATROPIQUES Retrieve Hydrometeors and Rainfall Characteristics ?

    Science.gov (United States)

    Goyal, J. M.; Srinivasan, J.; Satheesh, S. K.

    2014-12-01

    MEGHATROPIQUES (MT) is an Indo-French satellite launched in 2011 with the main intention of understanding the water cycle in the tropical region, and is a part of the GPM constellation. MADRAS was the primary instrument on board MT to estimate rainfall characteristics, but unfortunately its scanning mechanism failed, obscuring the primary goal of the mission. So an attempt has been made to retrieve rainfall and different hydrometeors using the other instrument, SAPHIR, onboard MT. The most important advantage of using MT is its orbitography, which is specifically designed for tropical regions and can reach up to 6 passes per day, more than any other satellite currently in orbit. Although SAPHIR is a humidity sounder with six channels centred around 183 GHz, it still operates in the microwave region, which directly interacts with rainfall, especially in the wing channels, and thus can pick up rainfall signatures. Initial analyses using radiative transfer models also establish this fact. To get more conclusive results using observations, SAPHIR level 1 brightness temperature (BT) data were compared with different rainfall products, utilizing the benefits of each product. A comparison of SAPHIR BT with TRMM 3B42 for one pass clearly showed that channels 5 and 6 have considerable sensitivity to rainfall. Following this, a database of more than 300,000 raining pixels of spatially and temporally collocated 3B42 rainfall and corresponding SAPHIR BT for an entire month was created to include all kinds of rainfall events; to attain higher temporal resolution, a collocated database was also created for SAPHIR BT and rainfall from the infrared sensor on the geostationary satellite Kalpana-1. These databases were used to understand the response of various channels of SAPHIR to different rainfall regimes. The TRMM 2A12 rainfall product was also used to assess the capability of SAPHIR to retrieve cloud and ice water path, which also gave significant correlation. Conclusively, we have shown that SAPHIR has

  7. SAPHIRE 8 Volume 3 - Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. Vedros; K. J. Kvarfordt

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing with and supporting SAPHIRE users, who comprise a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). This reference guide will introduce the SAPHIRE Version 8.0 software. A brief discussion of the purpose and history of the software is included along with general information such as installation instructions, starting and stopping the program, and some pointers on how to get around inside the program. Next, database concepts and structure are discussed. Following that discussion are nine sections, one for each of the menu options on the SAPHIRE main menu, wherein the purpose and general capabilities for each option are

  8. 75 FR 33162 - Airworthiness Directives; Microturbo Saphir 20 Model 095 Auxiliary Power Units (APUs)

    Science.gov (United States)

    2010-06-11

    ...-21-AD; Amendment 39-16332; AD 2010-13-01] RIN 2120-AA64 Airworthiness Directives; Microturbo Saphir..., of the SAPHIR 20 Model 095 APU is a life-limited part. Microturbo had determined through ``fleet...-015-03, of the SAPHIR 20 Model 095 APU is a life-limited part. Microturbo had determined...

  9. A new plant chamber facility PLUS coupled to the atmospheric simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2015-11-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOC) can be studied in detail. In PLUS all important environmental parameters (e.g. temperature, PAR, soil RH, etc.) are well controlled. The gas exchange volume of 9.32 m³, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed only to FEP Teflon film and other Teflon surfaces, to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 LED panels which have an emission strength of up to 800 μmol m⁻² s⁻¹. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and transfer rates of volatile organic compounds (VOC) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment on BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  10. SAPHIRE 8 Software Independent Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rae J. Nims; Kent M. Norris

    2010-02-01

    SAPHIRE 8 is being developed with a phased or cyclic iterative rapid application development methodology. Accordingly, a similarly cyclic, iterative approach is being taken for the IV&V activities on each vital software object. The IV&V plan is structured around NUREG/BR-0167, “Software Quality Assurance Program and Guidelines,” February 1993. The Nuclear Regulatory Research Office Instruction No. PRM-12, “Software Quality Assurance for RES Sponsored Codes,” March 26, 2007, specifies that RES-sponsored software is to be evaluated against NUREG/BR-0167. Per the guidance in NUREG/BR-0167, SAPHIRE is classified as “Level 1.” Level 1 software corresponds to technical application software used in a safety decision.

  11. Quality assessment and assimilation of Megha-Tropiques SAPHIR radiances into WRF assimilation system

    Science.gov (United States)

    Singh, Randhir; Ojha, Satya P.; Kishtawal, C. M.; Pal, P. K.

    2013-07-01

    This study presents an initial assessment of the quality of radiances measured from SAPHIR (Sounder for Probing Vertical Profiles of Humidity) on board Megha-Tropiques (an Indo-French joint satellite), launched by the Indian Space Research Organisation on 12 October 2011. The radiances measured from SAPHIR are compared with those simulated by a radiative transfer model (RTM) using radiosonde measurements, Atmospheric Infrared Sounder retrievals, and National Centers for Environmental Prediction (NCEP) analyzed fields over the Indian subcontinent, during January to November 2012. The radiances from SAPHIR are also compared with similar measurements available from the Microwave Humidity Sounder (MHS) on board the MetOp-A and NOAA-18/19 satellites, during January to November 2012. A limited comparison is also carried out between SAPHIR-measured and RTM-computed radiances using European Centre for Medium-Range Weather Forecasts analyzed fields, during May and November 2012. The comparison of SAPHIR-measured radiances with RTM-simulated and MHS-observed radiances reveals that the SAPHIR observations are of good quality. After the initial assessment of the quality of the SAPHIR radiances, these radiances have been assimilated within the Weather Research and Forecasting (WRF) three-dimensional variational data assimilation system. Analysis/forecast cycling experiments with and without SAPHIR radiances are performed over the Indian region during the entire month of May 2012. The assimilation of SAPHIR radiances shows considerable improvements (with moisture analysis error reduction of up to 30%) in the tropospheric analyses and forecasts of moisture, temperature, and winds when compared to NCEP analyses and radiance measurements obtained from MHS, Advanced Microwave Sounding Unit-A, and the High Resolution Infrared Sounder. Assimilation of SAPHIR radiances also resulted in substantial improvement in precipitation forecast skill when compared with satellite-derived rain. Overall

  12. Independent Verification and Validation Of SAPHIRE 8 System Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE System Test Plan is to assess the approach to be taken for the intended testing activities associated with the SAPHIRE software product. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already in progress.

  13. Mathematical tools for data mining set theory, partial orders, combinatorics

    CERN Document Server

    Simovici, Dan A

    2014-01-01

    Data mining essentially relies on several mathematical disciplines, many of which are presented in this second edition of the book. Topics include partially ordered sets, combinatorics, general topology, metric spaces, linear spaces, and graph theory. To motivate the reader, a significant number of applications of these mathematical tools are included, ranging from association rules and clustering algorithms to classification, data constraints, and logical data analysis. The book is intended as a reference for researchers and graduate students. The current edition is a significant expansion of the firs

  14. Representative sets and irrelevant vertices: New tools for kernelization

    CERN Document Server

    Kratsch, Stefan

    2011-01-01

    Recent work of the present authors provided a polynomial kernel for Odd Cycle Transversal by introducing matroid-based tools into kernelization. In the current work we further establish the usefulness of matroid theory to kernelization by showing applications of a result on representative sets due to Lovász (Combinatorial Surveys 1977) and Marx (TCS 2009). We give two types of applications: 1. Direct applications of the representative objects idea. In this direction, we give a polynomial kernel for Almost 2-SAT by reducing the problem to a cut problem with pairs of vertices as sinks, and subsequently reducing the set of pairs to a representative subset of bounded size. This implies polynomial kernels for several other problems, including Vertex Cover parameterized by the size of the LP gap, and the RHorn-Backdoor Deletion Set problem from practical SAT solving. We also get a polynomial kernel for Multiway Cut with deletable terminals, by producing a representative set of vertices, of bounded size, which is ...

  15. Workplace wellness using online learning tools in a healthcare setting.

    Science.gov (United States)

    Blake, Holly; Gartshore, Emily

    2016-09-01

    The aim was to develop and evaluate an online learning tool for use with UK healthcare employees, healthcare educators and healthcare students, to increase knowledge of workplace wellness as an important public health issue. A 'Workplace Wellness' e-learning tool was developed and peer-reviewed by 14 topic experts. This focused on six key areas relating to workplace wellness: work-related stress, musculoskeletal disorders, diet and nutrition, physical activity, smoking and alcohol consumption. Each key area provided current evidence-based information on causes and consequences, access to UK government reports and national statistics, and guidance on actions that could be taken to improve health within a workplace setting. In total, 188 users (93.1% female, age 18-60) completed online knowledge questionnaires before (n = 188) and after (n = 88) exposure to the online learning tool. Baseline knowledge of workplace wellness was poor (n = 188; mean accuracy 47.6%, s.d. 11.94). Knowledge significantly improved from baseline to post-intervention (mean accuracy 77.5%, s.d. 13.71; t(75) = -14.801, p < 0.001) … health parameters. Copyright © 2016 Elsevier Ltd. All rights reserved.
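
    The reported statistic t(75) = -14.801 is a paired t-test on the 76 matched pre/post questionnaires. A minimal sketch of that computation follows; the simulated scores are hypothetical stand-ins for the real questionnaire data:

```python
import math
import random
import statistics

def paired_t(before, after):
    """Paired t-test statistic on matched before/after scores:
    t = mean(d) / (stdev(d) / sqrt(n)), with n - 1 degrees of freedom."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n)), n - 1

# Simulated data: 76 matched pairs (as implied by 75 degrees of freedom),
# roughly matching the reported baseline and post-intervention accuracies.
random.seed(0)
before = [random.gauss(47.6, 11.9) for _ in range(76)]
after = [b + random.gauss(29.9, 17.0) for b in before]
t_stat, df = paired_t(before, after)  # large t in magnitude => significant change
```

    The sign depends only on the order of subtraction; the abstract's negative value corresponds to testing "before minus after".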

  16. Genome display tool: visualizing features in complex data sets

    Directory of Open Access Journals (Sweden)

    Lu Yue

    2007-02-01

    Full Text Available Abstract Background The enormity of the information contained in large data sets makes it difficult to develop intuitive understanding. It would be useful to have software that allows visualization of possible correlations between properties that can be associated with a core data set. In the case of bacterial genomes, existing visualization tools focus on either global properties such as variations in composition or detailed local displays of the features that comprise the annotation. It is not easy to visualize other information in the context of this core information. Results A Java-based software package known as the Genome Display Tool (GDT) allows the user to simultaneously view the distribution of multiple attributes pertaining to genes and intragenic regions in a single bacterial genome using different colours and shapes on a single screen. The display represents each gene by a small box that correlates with its physical position in the genome. The size of the boxes is dynamically allocated based on the number of genes, and a zoom feature allows close-up inspection of regions of interest. The display is interfaced with an MS-Access relational database and can display any feature in the database that can be represented by discrete values. Data are readily added to the database from an MS-Excel spreadsheet. The functionality of GDT is demonstrated by comparing the results of two predictions of recent horizontal transfer events in the genome of Synechocystis PCC-6803. The resulting display allows the user to immediately see how much agreement exists between the two methods and also to visualize how genes in various categories (e.g., predicted by both methods, by one method, etc.) are distributed in the genome. Conclusion The GDT software provides the user with a powerful tool that allows development of an intuitive understanding of the relative distribution of features in a large data set. As additional features are added to the data set, the number of possible

  17. Transcriptome profiling of Set5 and Set1 methyltransferases: Tools for visualization of gene expression

    Directory of Open Access Journals (Sweden)

    Glòria Mas Martín

    2014-12-01

    Full Text Available Cells regulate transcription by coordinating the activities of multiple histone modifying complexes. We recently identified the yeast histone H4 methyltransferase Set5 and discovered functional overlap with the histone H3 methyltransferase Set1 in gene expression. Specifically, using next-generation RNA sequencing (RNA-Seq), we found that Set5 and Set1 function synergistically to regulate specific transcriptional programs at subtelomeres and transposable elements. Here we provide a comprehensive description of the methodology and analysis tools corresponding to the data deposited in NCBI's Gene Expression Omnibus (GEO) under the accession number GSE52086. This data set complements the experimental methods described in Mas Martín G et al. (2014) and provides the means to explore the cooperative functions of histone H3 and H4 methyltransferases in the regulation of transcription. Furthermore, fully annotated R code is included to enable researchers to use the following computational tools: comparison of significant differential expression (SDE) profiles; gene ontology enrichment of SDE; and enrichment of SDE relative to chromosomal features, such as centromeres, telomeres, and transposable elements. Overall, we present a bioinformatics platform that can be generally implemented for similar analyses with different datasets and in different organisms.

  18. Stacks: an analysis tool set for population genomics.

    Science.gov (United States)

    Catchen, Julian; Hohenlohe, Paul A; Bassham, Susan; Amores, Angel; Cresko, William A

    2013-06-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics.
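
    The smoothed sliding-window analysis mentioned above can be sketched as follows; the function, window parameters, and per-SNP values are illustrative, not Stacks code:

```python
def sliding_window_mean(positions, values, window_size, step):
    """Smooth a per-SNP statistic (e.g. F_ST or nucleotide diversity) along a
    chromosome: average all SNPs falling in each window of width window_size,
    advancing the window by step. Returns (window_center, mean) pairs."""
    out = []
    start, last = 0, max(positions)
    while start <= last:
        in_win = [v for p, v in zip(positions, values)
                  if start <= p < start + window_size]
        if in_win:
            out.append((start + window_size // 2, sum(in_win) / len(in_win)))
        start += step
    return out

# Illustrative per-SNP F_ST values along a ~10 kb region (positions in bp).
pos = [100, 1500, 2600, 4000, 5200, 8800]
fst = [0.02, 0.10, 0.40, 0.35, 0.05, 0.01]
smoothed = sliding_window_mean(pos, fst, window_size=5000, step=2500)
```

    Overlapping windows (step smaller than window size) give the smoothing; production tools typically also weight by coverage or use a kernel rather than a flat mean.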

  19. AORN Ergonomic Tool 5: Tissue Retraction in the Perioperative Setting.

    Science.gov (United States)

    Spera, Patrice; Lloyd, John D; Hernandez, Edward; Hughes, Nancy; Petersen, Carol; Nelson, Audrey; Spratt, Deborah G

    2011-07-01

    Manual retraction, a task performed to expose the surgical site, poses a high risk for musculoskeletal disorders that affect the hands, arms, shoulders, neck, and back. In recent years, minimally invasive and laparoscopic procedures have led to the development of multifunctional instruments and retractors capable of performing these functions, which, in many cases, has eliminated the need for manual retraction. During surgical procedures that are not performed endoscopically, the use of self-retaining retractors enables the assistant to handle tissue and use exposure techniques that do not require prolonged manual retraction. Ergonomic Tool #5: Tissue Retraction in the Perioperative Setting provides an algorithm for perioperative care providers to determine when and under what circumstances manual retraction of tissue is safe and when the use of a self-retaining retractor should be considered.

  20. Investigation of Monoterpene Degradation in the Atmospheric Simulation Chamber SAPHIR

    Science.gov (United States)

    Kaminski, Martin; Acir, Ismail-Hakki; Bohn, Birger; Brauers, Theo; Dorn, Hans-Peter; Fuchs, Hendrik; Haeseler, Rolf; Hofzumahaus, Andreas; Li, Xin; Lutz, Anna; Nehr, Sascha; Rohrer, Franz; Tillmann, Ralf; Wegener, Robert; Wahner, Andreas

    2013-04-01

    Monoterpenes are the volatile organic compound (VOC) species with the highest emission rates on a global scale besides isoprene. In the atmosphere these compounds are rapidly oxidized. Due to their high reactivity towards hydroxyl radicals (OH), they determine the radical chemistry under biogenic conditions if the monoterpene concentration is higher than the isoprene concentration. Recent field campaigns showed large discrepancies between measured and modeled OH concentrations at low-NOx conditions together with high VOC reactivity towards OH (Hofzumahaus et al. 2009), especially in tropical forest areas (Lelieveld et al. 2008). These discrepancies were partly explained by new reaction pathways in the isoprene degradation mechanism (Whalley et al. 2011). However, even an additional recycling rate of 2.7 was insufficient to explain the measured OH concentration, so other VOC species could be involved in nonclassical OH recycling. Since the discrepancies in OH also occurred in the morning hours, when the OH chemistry was dominated mainly by monoterpenes, it was assumed that the degradation of monoterpenes may also lead to OH recycling in the absence of NO (Whalley et al. 2011). The photochemical degradation of four monoterpene species was studied under high VOC reactivity and low-NOx conditions in a dedicated series of experiments in the atmospheric simulation chamber SAPHIR from August to September 2012, to overcome the lack of mechanistic information for monoterpene degradation schemes. α-Pinene, β-pinene and limonene were chosen as the most prominent representatives of this substance class. Moreover, the degradation of myrcene was investigated due to its structural analogy to isoprene.
The SAPHIR chamber was equipped with instrumentation to measure all important OH precursors (O3, HONO, HCHO), the parent VOC and their main oxidation products, radicals (OH, HO2, RO2), the total OH reactivity, and photolysis frequencies to investigate the degradation mechanism of monoterpenes in

  1. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, which was conducted during the summer and fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make “non-biased” information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on any other vendor's IT equipment. However, two companies were found with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple 3-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with the objectives of proving the concept, ascertaining the extent of energy and computational assessment, ease of installation, and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors – JouleX (expected to be completed in 2012) and Sentilla.

  2. Megha-Tropiques/SAPHIR measurements of humidity profiles: validation with AIRS and global radiosonde network

    Science.gov (United States)

    Subrahmanyam, K. V.; Kumar, K. K.

    2013-12-01

    The vertical profiles of humidity measured by SAPHIR (Sondeur Atmosphérique du Profil d'Humidité Intertropicale par Radiométrie) on board the Megha-Tropiques satellite are validated using Atmospheric Infrared Sounder (AIRS) and ground-based radiosonde observations during July-September 2012. SAPHIR provides humidity profiles at six pressure layers, viz., 1000-850 (level 1), 850-700 (level 2), 700-550 (level 3), 550-400 (level 4), 400-250 (level 5) and 250-100 (level 6) hPa. Segregated AIRS observations over land and oceanic regions are used to assess the performance of SAPHIR quantitatively. The regression analysis over the oceanic region (125° W-180° W; 30° S-30° N) reveals that the SAPHIR measurements agree very well with the AIRS measurements at levels 3, 4, 5 and 6, with correlation coefficients of 0.79, 0.88, 0.87 and 0.78 respectively. However, at level 6 SAPHIR seems to be systematically underestimating the AIRS measurements. At level 2 the agreement is reasonably good, with a correlation coefficient of 0.52, and at level 1 the agreement is very poor, with a correlation coefficient of 0.17. The regression analysis over the land region (10° W-30° E; 8° N-30° N) revealed an excellent correlation between AIRS and SAPHIR at all six levels, with correlation coefficients of 0.80, 0.78, 0.84, 0.84, 0.86 and 0.65 respectively. However, again at levels 5 and 6, SAPHIR seems to be underestimating the AIRS measurements. After carrying out the quantitative comparison between SAPHIR and AIRS separately over land and ocean, ground-based global radiosonde network observations of humidity profiles over three distinct geographical locations (East Asia, the tropical belt of South and North America, and the South Pacific) are then used to further validate the SAPHIR observations, as AIRS has its own limitations. The SAPHIR observations within a radius of 50 km around the radiosonde stations are averaged and then the regression analysis is carried out at the first five levels of SAPHIR. The comparison is not carried out at the sixth
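
    The layer-by-layer correlation coefficients quoted above come from regressing collocated SAPHIR and AIRS humidities within each pressure layer. A minimal sketch of that per-layer computation; the collocated values below are hypothetical:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two collocated humidity series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical collocated layer-mean relative humidities (%) for one SAPHIR
# pressure layer versus AIRS; the study repeats this per layer, separately
# over land and ocean.
saphir_rh = [62.0, 55.5, 71.2, 40.3, 58.8, 49.1]
airs_rh = [60.1, 57.0, 69.8, 43.0, 57.5, 51.2]
r = pearson_r(saphir_rh, airs_rh)  # close to 1 for well-correlated layers
```

    Note that a high r is compatible with a systematic bias (as reported at level 6): correlation measures covariation, not agreement in absolute value.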

  3. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Version 6 and 7, what constitutes its parts, and limitations of those processes.

  4. SAPHIRE: A New Flat-Panel Digital Mammography Detector With Avalanche Photoconductor and High-Resolution Field Emitter Readout

    Science.gov (United States)

    2006-06-01

    Award Number: W81XWH-04-1-0554. TITLE: SAPHIRE: A New Flat-Panel Digital Mammography Detector with Avalanche Photoconductor and High-Resolution Field Emitter Readout. … (CsI), and form a charge image that is read out by a high-resolution field emitter array (FEA). We call the proposed detector SAPHIRE (Scintillator

  5. Setting of angles on machine tools speeded by magnetic protractor

    Science.gov (United States)

    Vale, L. B.

    1964-01-01

    An adjustable protractor facilitates transference of angles to remote machine tools. It has a magnetic base incorporating a beam which can be adjusted until its shadow coincides with an image on the screen of a projector.

  6. Methods improvements incorporated into the SAPHIRE ASP models

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others]

    1995-04-01

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  8. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with

  9. A Theoretically Based, Easy-to-Use Tool for Promoting Goal-Setting Behaviors in Youths

    Science.gov (United States)

    James, Anthony G.

    2017-01-01

    Extension youth development professionals benefit from having theoretically based, easy-to-use tools for promoting goal-setting behaviors in youths. The Youth Goal-Setting Map provides practitioners with a mechanism for helping youth develop attributes that place them on a pathway to thriving. This article provides the Youth Goal-Setting Map tool,…

  10. HEMODOSE: A Set of Multi-parameter Biodosimetry Tools

    Science.gov (United States)

    Hu, Shaowen; Blakely, William F.; Cucinotta, Francis A.

    2014-01-01

    After the events of September 11, 2001, and the recent events at the Fukushima reactors in Japan, there is increasing concern that nuclear and radiological terrorism or accidents may result in large numbers of casualties in densely populated areas. To guide medical personnel in their clinical decisions for effective medical management and treatment of exposed individuals, biological markers are usually applied to examine radiation-induced changes at different biological levels. Among these, peripheral blood cell counts are widely used to assess the extent of radiation-induced injury, because the hematopoietic system is the part of the human body most vulnerable to radiation damage. In particular, the lymphocyte, granulocyte, and platelet cells are the most radiosensitive of the blood elements, and monitoring their changes after exposure is regarded as the most practical and best laboratory test for estimating radiation dose. The HEMODOSE web tools are built upon a solid physiological and pathophysiological understanding of mammalian hematopoietic systems and upon rigorous coarse-grained biomathematical modeling and validation. Using single or serial granulocyte, lymphocyte, leukocyte, or platelet counts after exposure, these tools can estimate the absorbed doses of adult victims very rapidly and accurately. Patient data from historical accidents are utilized as examples to demonstrate the capabilities of these tools as a rapid point-of-care diagnostic or a centralized high-throughput assay system in a large-scale radiological disaster scenario. Unlike previous dose-prediction algorithms, the HEMODOSE web tools establish robust correlations between absorbed doses and a victim's various types of blood cell counts not only in the early time window (1 or 2 days), but also in the very late phase (up to 4 weeks) after exposure
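To illustrate the general idea of dose estimation from blood counts, consider inverting a classic first-order lymphocyte-depletion model of the form L(t) = L0·exp(−k·D·t). This is a textbook simplification, not the HEMODOSE model itself, and the rate constant below is an assumed illustrative value:

```python
import math

def dose_from_lymphocytes(l0, lt, t_days, k_per_gy_day=0.45):
    """Invert L(t) = L0 * exp(-k * D * t) to estimate absorbed dose
    D (Gy) from a baseline lymphocyte count l0 and a count lt
    measured t_days after exposure. The rate constant k_per_gy_day
    is illustrative, not a calibrated HEMODOSE parameter."""
    if lt <= 0 or lt > l0:
        raise ValueError("count must be positive and below baseline")
    # Solve lt = l0 * exp(-k * D * t) for D
    return math.log(l0 / lt) / (k_per_gy_day * t_days)

# Example: lymphocyte counts fall from 2.5e9/L to 1.0e9/L in 2 days
dose = dose_from_lymphocytes(2.5, 1.0, 2.0)
```

A serial-count tool like HEMODOSE fits the whole time course rather than a single pair of counts, which is what extends its validity into the late phase.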

  11. Electronic Mail in Academic Settings: A Multipurpose Communications Tool.

    Science.gov (United States)

    D'Souza, Patricia Veasey

    1992-01-01

    Explores possible uses of electronic mail in three areas of the academic setting: instruction, research, and administration. Electronic mail is defined, the components needed to get started with electronic mail are discussed, and uses and benefits of electronic mail in diverse educational environments are suggested. (12 references) (DB)

  12. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Data Loading Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory. This report is intended to assist the user in entering PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 6.0 and Version 7.0. In general, the data transfer procedures for versions 6 and 7 are the same, but where deviations exist, the differences are noted. The guidance specified in this document will give a user sufficient knowledge both to understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  13. Technical Note: Intercomparison of formaldehyde measurements at the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    A. Wisthaler

    2007-11-01

    Full Text Available The atmosphere simulation chamber SAPHIR at the Research Centre Jülich was used to test the suitability of state-of-the-art analytical instruments for the measurement of gas-phase formaldehyde (HCHO) in air. Five analyzers based on four different sensing principles were deployed: a differential optical absorption spectrometer (DOAS), cartridges for 2,4-dinitro-phenyl-hydrazine (DNPH) derivatization followed by off-line high pressure liquid chromatography (HPLC) analysis, two different types of commercially available wet chemical sensors based on Hantzsch fluorimetry, and a proton-transfer-reaction mass spectrometer (PTR-MS). A new optimized mode of operation was used for the PTR-MS instrument which significantly enhanced its performance for on-line HCHO detection at low absolute humidities.

    The instruments were challenged with typical ambient levels of HCHO ranging from zero to several ppb. Synthetic air of high purity and particulate-filtered ambient air were used as sample matrices in the atmosphere simulation chamber onto which HCHO was spiked under varying levels of humidity and ozone. Measurements were compared to mixing ratios calculated from the chamber volume and the known amount of HCHO injected into the chamber; measurements were also compared between the different instruments. The formal and blind intercomparison exercise was conducted under the control of an independent referee. Although a number of analytical problems associated with the experimental set-up and with individual instruments were identified, the overall agreement between the methods was good.
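The reference mixing ratios mentioned above follow from the injected amount and the chamber volume via the ideal gas law. A hedged sketch of that calculation (the chamber volume and injected amount below are assumed example values, not the documented SAPHIR figures):

```python
def spiked_mixing_ratio_ppb(n_injected_mol, chamber_volume_m3,
                            pressure_pa=101325.0, temp_k=298.0):
    """Expected trace-gas mixing ratio (ppb) after injecting a known
    molar amount into a well-mixed chamber: moles injected divided
    by moles of air, from the ideal gas law n = pV/(RT)."""
    R = 8.314  # gas constant, J mol^-1 K^-1
    n_air = pressure_pa * chamber_volume_m3 / (R * temp_k)
    return n_injected_mol / n_air * 1e9

# e.g. 33 micromol of HCHO spiked into an assumed 270 m^3 chamber
ppb = spiked_mixing_ratio_ppb(33e-6, 270.0)
```

Each analyzer's readings can then be compared against this calculated value and against the other instruments.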

  14. Technical Note: Intercomparison of formaldehyde measurements at the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Wisthaler, A.; Apel, E. C.; Bossmeyer, J.; Hansel, A.; Junkermann, W.; Koppmann, R.; Meier, R.; Müller, K.; Solomon, S. J.; Steinbrecher, R.; Tillmann, R.; Brauers, T.

    2008-04-01

    The atmosphere simulation chamber SAPHIR at the Research Centre Jülich was used to test the suitability of state-of-the-art analytical instruments for the measurement of gas-phase formaldehyde (HCHO) in air. Five analyzers based on four different sensing principles were deployed: a differential optical absorption spectrometer (DOAS), cartridges for 2,4-dinitrophenylhydrazine (DNPH) derivatization followed by off-line high pressure liquid chromatography (HPLC) analysis, two different types of commercially available wet chemical sensors based on Hantzsch fluorimetry, and a proton-transfer-reaction mass spectrometer (PTR-MS). A new optimized mode of operation was used for the PTR-MS instrument which significantly enhanced its performance for online HCHO detection at low absolute humidities. The instruments were challenged with typical ambient levels of HCHO ranging from zero to several ppb. Synthetic air of high purity and particulate-filtered ambient air were used as sample matrices in the atmosphere simulation chamber onto which HCHO was spiked under varying levels of humidity and ozone. Measurements were compared to mixing ratios calculated from the chamber volume and the known amount of HCHO injected into the chamber; measurements were also compared between the different instruments. The formal and blind intercomparison exercise was conducted under the control of an independent referee. Although a number of analytical problems associated with the experimental set-up and with individual instruments were identified, the overall agreement between the methods was fair.

  15. Technical Note: Intercomparison of formaldehyde measurements at the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    A. Wisthaler

    2008-04-01

    Full Text Available The atmosphere simulation chamber SAPHIR at the Research Centre Jülich was used to test the suitability of state-of-the-art analytical instruments for the measurement of gas-phase formaldehyde (HCHO) in air. Five analyzers based on four different sensing principles were deployed: a differential optical absorption spectrometer (DOAS), cartridges for 2,4-dinitro-phenyl-hydrazine (DNPH) derivatization followed by off-line high pressure liquid chromatography (HPLC) analysis, two different types of commercially available wet chemical sensors based on Hantzsch fluorimetry, and a proton-transfer-reaction mass spectrometer (PTR-MS). A new optimized mode of operation was used for the PTR-MS instrument which significantly enhanced its performance for online HCHO detection at low absolute humidities.

    The instruments were challenged with typical ambient levels of HCHO ranging from zero to several ppb. Synthetic air of high purity and particulate-filtered ambient air were used as sample matrices in the atmosphere simulation chamber onto which HCHO was spiked under varying levels of humidity and ozone. Measurements were compared to mixing ratios calculated from the chamber volume and the known amount of HCHO injected into the chamber; measurements were also compared between the different instruments. The formal and blind intercomparison exercise was conducted under the control of an independent referee. Although a number of analytical problems associated with the experimental set-up and with individual instruments were identified, the overall agreement between the methods was fair.

  16. Gerasimov-Drell-Hearn Sum Rule and the Discrepancy between the New CLAS and SAPHIR Data

    CERN Document Server

    Mart, T

    2008-01-01

    The contribution of the K^+\Lambda channel to the Gerasimov-Drell-Hearn (GDH) sum rule has been calculated using the models that fit the recent SAPHIR or CLAS differential cross section data. It is shown that the two data sets yield quite different contributions. The contribution of this channel to the forward spin polarizability of the proton has also been calculated. It is further shown that the inclusion of the recent CLAS C_x and C_z data in the fitting database does not significantly change the result of the present calculation. Results of the fit, however, reveal the role of the S_{11}(1650), P_{11}(1710), P_{13}(1720), and P_{13}(1900) resonances in the description of the C_x and C_z data. A brief discussion of the importance of these resonances is given. Measurements of the polarized total cross section \sigma_{TT'} by the CLAS, LEPS, and MAMI collaborations are expected to verify this finding.

  17. Investigation of MACR oxidation by OH in the atmosphere simulation chamber SAPHIR at low NO concentrations.

    Science.gov (United States)

    Fuchs, Hendrik; Acir, Ismail-Hakki; Bohn, Birger; Brauers, Theo; Dorn, Hans-Peter; Häseler, Rolf; Hofzumahaus, Andreas; Holland, Frank; Li, Xin; Lu, Keding; Lutz, Anna; Kaminski, Martin; Nehr, Sascha; Rohrer, Franz; Tillmann, Ralf; Wegener, Robert; Wahner, Andreas

    2013-04-01

    During recent field campaigns, hydroxyl radical (OH) concentrations were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low nitrogen monoxide (NO) concentrations. These discrepancies were observed in forests, where isoprene oxidation turnover rates were large. Methacrolein (MACR) is one of the major first-generation products of isoprene oxidation, so MACR was also an important reactant for OH. Here, we present a detailed investigation of the MACR oxidation mechanism, including a full set of accurate and precise radical measurements, in the atmosphere simulation chamber SAPHIR in Juelich, Germany. The conditions during the chamber experiments were comparable to those during field campaigns with respect to radical and trace gas concentrations. In particular, OH reactivity was as high as 15 per second and NO mixing ratios were as low as 200 pptv. Results of the experiments were compared to model predictions using the Master Chemical Mechanism, in order to identify as-yet-unknown reaction pathways which potentially recycle OH radicals without reactions with NO.

  18. A simulator tool set for evaluating HEVC/SHVC streaming

    Science.gov (United States)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever-increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high-quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy-to-use platform for evaluating video transmission and adaptation proposals on large-scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure that currently adopted proposals for scalable and range extensions to

  19. Parents' and Service Providers' Perceptions of the Family Goal Setting Tool: A Pilot Study

    Science.gov (United States)

    Rodger, Sylvia; O'Keefe, Amy; Cook, Madonna; Jones, Judy

    2012-01-01

    Background: This qualitative study describes parents' and service providers' experiences in using the Family Goal Setting Tool (FGST). This article looks specifically at the tool's perceived clinical utility during annual, collaborative goal setting. Methods: Participants included eight parents and ten service providers involved in a Family and…

  20. Inter-calibration and validation of observations from SAPHIR and ATMS instruments

    Science.gov (United States)

    Moradi, I.; Ferraro, R. R.

    2015-12-01

    We present the results of evaluating observations from microwave instruments aboard the Suomi National Polar-orbiting Partnership (NPP; ATMS instrument) and Megha-Tropiques (SAPHIR instrument) satellites. The study includes inter-comparison and inter-calibration of observations of similar channels from the two instruments, evaluation of the satellite data using high-quality radiosonde data from the Atmospheric Radiation Measurement Program and GPS radio occultation observations from the COSMIC mission, as well as geolocation error correction. The results of this study are valuable for generating climate data records from these instruments, as well as for extending current climate data records from similar instruments, such as AMSU-B and MHS, to the ATMS and SAPHIR instruments. Reference: Moradi et al., Intercalibration and Validation of Observations From ATMS and SAPHIR Microwave Sounders. IEEE Transactions on Geoscience and Remote Sensing. 01/2015; DOI: 10.1109/TGRS.2015.2427165

  1. Independent Verification and Validation SAPHIRE Version 8 Final Report Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-04-01

    This report provides an evaluation of the SAPHIRE version 8 software product. SAPHIRE version 8 is being developed with a phased, cyclic, iterative rapid application development methodology. Accordingly, a similar iterative approach has been taken for the IV&V activities on each vital software object. IV&V and Software Quality Assurance (SQA) activities occur throughout the entire development life cycle and will therefore be required through the full development of SAPHIRE version 8. The later phases of the software life cycle, the operation and maintenance phases, are not applicable in this effort, since the IV&V is being done prior to releasing Version 8.

  2. Mobile Dichotomous Key Application as a Scaffolding Tool in the Museum Setting

    Science.gov (United States)

    Knight, Kathryn

    2012-01-01

    This study explored the use of a dichotomous key as a scaffolding tool in the museum setting. The dichotomous key was designed as a scaffolding tool to help students make more detailed observations as they identified various species of birds on display. The dichotomous key was delivered to groups of fifth and seventh graders in two ways: on a…

  3. JAG: A Computational Tool to Evaluate the Role of Gene-Sets in Complex Traits.

    Science.gov (United States)

    Lips, Esther S; Kooyman, Maarten; de Leeuw, Christiaan; Posthuma, Danielle

    2015-05-14

    Gene-set analysis has been proposed as a powerful tool to deal with the highly polygenic architecture of complex traits, as well as with the small effect sizes typically found in GWAS studies for complex traits. We developed a tool, Joint Association of Genetic variants (JAG), which can be applied to Genome Wide Association (GWA) data and tests for the joint effect of all single nucleotide polymorphisms (SNPs) located in a user-specified set of genes or biological pathway. JAG assigns SNPs to genes and incorporates self-contained and/or competitive tests for gene-set analysis. JAG uses permutation to evaluate gene-set significance, which implicitly controls for linkage disequilibrium, sample size, gene size, the number of SNPs per gene and the number of genes in the gene-set. We conducted a power analysis using the Wellcome Trust Case Control Consortium (WTCCC) Crohn's disease data set and show that JAG correctly identifies validated gene-sets for Crohn's disease and has more power than currently available tools for gene-set analysis. JAG is a powerful, novel tool for gene-set analysis, and can be freely downloaded from the CTG Lab website.
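The permutation idea behind a self-contained gene-set test can be sketched as follows. This is an illustrative re-implementation of the general approach described above, not JAG's actual code, and the function name is invented:

```python
import random

def self_contained_pvalue(set_stats, genome_stats, n_perm=10000, seed=1):
    """Permutation sketch of a self-contained gene-set test in the
    spirit of JAG: compare the mean association statistic of the
    genes in the set against means of randomly drawn gene sets of
    the same size, which implicitly controls for set size."""
    rng = random.Random(seed)
    k = len(set_stats)
    observed = sum(set_stats) / k
    hits = 0
    for _ in range(n_perm):
        draw = rng.sample(genome_stats, k)  # random set of k genes
        if sum(draw) / k >= observed:
            hits += 1
    # add-one correction so the p-value is never exactly zero
    return (hits + 1) / (n_perm + 1)

# Toy genome-wide statistics; the test set is strongly enriched
genome = [i / 1000 for i in range(1000)]
p = self_contained_pvalue([0.99] * 10, genome)
```

In JAG proper the permutation is done at the SNP/phenotype level, which additionally controls for linkage disequilibrium and gene size; the sketch above only captures the set-size control.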

  4. JAG: A Computational Tool to Evaluate the Role of Gene-Sets in Complex Traits

    Directory of Open Access Journals (Sweden)

    Esther S. Lips

    2015-05-01

    Full Text Available Gene-set analysis has been proposed as a powerful tool to deal with the highly polygenic architecture of complex traits, as well as with the small effect sizes typically found in GWAS studies for complex traits. We developed a tool, Joint Association of Genetic variants (JAG), which can be applied to Genome Wide Association (GWA) data and tests for the joint effect of all single nucleotide polymorphisms (SNPs) located in a user-specified set of genes or biological pathway. JAG assigns SNPs to genes and incorporates self-contained and/or competitive tests for gene-set analysis. JAG uses permutation to evaluate gene-set significance, which implicitly controls for linkage disequilibrium, sample size, gene size, the number of SNPs per gene and the number of genes in the gene-set. We conducted a power analysis using the Wellcome Trust Case Control Consortium (WTCCC) Crohn’s disease data set and show that JAG correctly identifies validated gene-sets for Crohn’s disease and has more power than currently available tools for gene-set analysis. JAG is a powerful, novel tool for gene-set analysis, and can be freely downloaded from the CTG Lab website.

  5. Use of tool sets by chimpanzees for multiple purposes in Moukalaba-Doudou National Park, Gabon.

    Science.gov (United States)

    Wilfried, Ebang Ella Ghislain; Yamagiwa, Juichi

    2014-10-01

    We report our recent findings on the use of tool sets by chimpanzees in Moukalaba-Doudou National Park, Gabon. Direct observations and evidence left by chimpanzees showed that chimpanzees used sticks as pounders, enlargers, and collectors to extract honey from beehives of stingless bees (Meliponula sp.), which may correspond to those previously found at the same site for fishing termites and to those found in Loango National Park, Gabon. However, we also observed chimpanzees using a similar set of tools for hunting a medium-sized mammal (possibly a mongoose) that hid inside a log. This is the first report of hunting with tools by a chimpanzee population in Central Africa. Chimpanzees may recognize the multiple functions and applicability of tools (extracting honey and driving prey), although this is still preliminary speculation. Our findings may provide new insight into chimpanzees' flexibility of tool use and their cognitive abilities in complex food gathering.

  6. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists is comparing lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene , with the web application available at https://asntech.shinyapps.io/intervene .
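The matrix behind a pairwise-intersection heat map, like the one Intervene's pairwise module renders, is just the overlap count for every pair of sets. A toy sketch (set names and contents are invented; this computes only the counts, not the plot):

```python
def pairwise_intersections(named_sets):
    """Pairwise intersection counts for a dict mapping set names to
    Python sets, i.e. the matrix a clustered heat map would show."""
    names = sorted(named_sets)
    return {
        (a, b): len(named_sets[a] & named_sets[b])
        for a in names for b in names
    }

# Hypothetical target-gene sets for three transcription factors
regions = {
    "tf_a": {"geneA", "geneB", "geneC"},
    "tf_b": {"geneB", "geneC", "geneD"},
    "tf_c": {"geneD"},
}
matrix = pairwise_intersections(regions)
# matrix[("tf_a", "tf_b")] == 2
```

For genomic intervals rather than gene lists, the same idea applies but the overlap operator must compare coordinate ranges instead of set membership.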

  7. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in that set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few to no enriched terms were found in the random gene sets. Next, we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression

  8. Retrieval of cloud ice water path using SAPHIR on board Megha-Tropiques over the tropical ocean

    Science.gov (United States)

    Piyush, Durgesh Nandan; Goyal, Jayesh; Srinivasan, J.

    2017-04-01

    The SAPHIR sensor onboard Megha-Tropiques (MT) measures the earth-emitted radiation at frequencies near the water vapor absorption band. SAPHIR operates at six frequencies ranging from 183 ± 0.1 to 183 ± 11 GHz. These frequencies have been used to retrieve cloud ice water path (IWP) at a very high resolution. A method to retrieve IWP over the Indian Ocean region is attempted in this study. The study has two parts: in the first part, a radiative-transfer-based simulation is carried out to give insight into the use of SAPHIR frequency channels for IWP retrieval; in the second part, real observations from SAPHIR and TRMM-TMI are used for IWP retrieval. The concurrent observations of SAPHIR brightness temperatures (Tbs) and TRMM TMI IWP were used in the development of the retrieval algorithm. An eigenvector analysis was done to identify the weight of each channel in retrieving IWP; following this, a two-channel regression-based algorithm was developed. The SAPHIR channels that are away from the water vapor absorption band were used to avoid possible water vapor contamination. When the retrieval is compared with an independent test dataset, it gives a correlation of 0.80 and an RMSE of 3.5%. SAPHIR-derived IWP has been compared with other available global IWP products, such as TMI, MSPPS, CloudSat and GPM-GMI, both qualitatively and quantitatively. A PDF comparison of SAPHIR-derived IWP shows good agreement with CloudSat. A zonal mean comparison with the recently launched GMI shows the strength of this algorithm.
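A two-channel regression retrieval of the kind described above can be sketched as a linear least-squares fit from two brightness temperatures to IWP. The channels, coefficients and synthetic training values below are illustrative, not the paper's:

```python
import numpy as np

def fit_two_channel_iwp(tb1, tb2, iwp):
    """Fit IWP = c0 + c1*Tb1 + c2*Tb2 by least squares: a sketch of
    a two-channel regression retrieval trained on collocated
    brightness temperatures and reference IWP."""
    A = np.column_stack([np.ones_like(tb1), tb1, tb2])
    coeffs, *_ = np.linalg.lstsq(A, iwp, rcond=None)
    return coeffs

def retrieve_iwp(coeffs, tb1, tb2):
    """Apply the fitted coefficients to new brightness temperatures."""
    return coeffs[0] + coeffs[1] * tb1 + coeffs[2] * tb2

# Synthetic training data: colder brightness temperatures
# (stronger ice scattering) correspond to larger IWP
tb1 = np.array([260.0, 250.0, 240.0, 230.0, 220.0])  # K
tb2 = np.array([255.0, 248.0, 236.0, 228.0, 214.0])  # K
iwp = 2000.0 - 5.0 * tb1 - 2.5 * tb2  # g m^-2, exact linear truth
c = fit_two_channel_iwp(tb1, tb2, iwp)
```

In practice the regression would be trained on the concurrent SAPHIR Tbs and TMI IWP, and validated against the held-out test dataset mentioned above.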

  9. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    Science.gov (United States)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers will rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether provided tools are adequately assisting consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of how well tool-assisted selections match consumers' preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation did provide interesting insights on the utility of both support tools. Although it was shown that the cogito tool obtained slightly higher decision accuracy, both tools could be improved by additional enhancements. Details of the procedure developed and the results obtained from the evaluation are provided. Opportunities for future work are also discussed.

  10. A Rough Set-Based Effective State Identification Method of Multisensor Tool Condition Monitoring System

    Directory of Open Access Journals (Sweden)

    Nan Xie

    2014-06-01

    Multisensor monitoring improves the accuracy of a machine tool condition monitoring system, which provides critical feedback information to the manufacturing process controller. A multisensor monitoring system needs to collect abundant data and employ attribute extraction, selection, reduction, and classification to form the decision knowledge. A machine tool condition monitoring system has been built, and a method for tool condition decision knowledge discovery is also presented. The multiple sensors include vibration, force, acoustic emission, and main spindle current. The novel approach engages rough set theory as a knowledge extraction tool to work on the data obtained from both the multisensor system and the machining parameters, and then extracts a set of minimal state identification rules encoding the preference pattern of decision making by domain experts. By means of the knowledge acquired, the tool conditions are identified. A case study is presented to illustrate that the approach produces effective and minimal rules and provides satisfactory accuracy.
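
    The rule-extraction idea can be sketched with the basic rough set notion of indiscernibility classes: objects sharing the same condition-attribute values form a class, and every class whose members agree on the decision yields a certain rule. The decision table below is a hypothetical stand-in for discretized sensor data, not the paper's dataset:

```python
from collections import defaultdict

def certain_rules(table, condition_attrs, decision_attr):
    """Group objects by condition-attribute values (indiscernibility
    classes) and emit a certain rule for every consistent class."""
    classes = defaultdict(list)
    for obj in table:
        key = tuple(obj[a] for a in condition_attrs)
        classes[key].append(obj[decision_attr])
    rules = []
    for key, decisions in classes.items():
        if len(set(decisions)) == 1:  # all members share one decision
            cond = " AND ".join(f"{a}={v}"
                                for a, v in zip(condition_attrs, key))
            rules.append(f"IF {cond} THEN {decision_attr}={decisions[0]}")
    return rules

# Hypothetical monitoring table (sensor readings discretized by an expert)
table = [
    {"vibration": "high", "current": "high", "tool": "worn"},
    {"vibration": "high", "current": "high", "tool": "worn"},
    {"vibration": "low",  "current": "high", "tool": "ok"},
    {"vibration": "low",  "current": "low",  "tool": "ok"},
    {"vibration": "high", "current": "low",  "tool": "worn"},
    {"vibration": "high", "current": "low",  "tool": "ok"},  # inconsistent
]
rules = certain_rules(table, ["vibration", "current"], "tool")
for rule in rules:
    print(rule)
```

    The inconsistent class (high vibration, low current) produces no certain rule; a fuller rough set treatment would also compute attribute reducts before rule induction.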

  11. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Directory of Open Access Journals (Sweden)

    A. W. Rollins

    2012-07-01

    N2O5 detection in the atmosphere has been accomplished using techniques which have been developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected by either cavity-based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios were simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternatively. Experiments focussed on the investigation of potential interferences from, e.g., water vapor or aerosol and on the investigation of the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (3 cavity ring-down (CRDS) instruments, 2 laser-induced fluorescence (LIF) instruments). Data sets are highly correlated as indicated by the square of the linear correlation coefficients, R2, whose values are larger than 0.96 for the entire data sets. N2O5 mixing ratios agree well within the combined accuracy of measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changing is necessary to quantitatively sample N2O5 in some environments. The analysis of data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides the effect of a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interference

  12. Le disque à saphir dans l’édition phonographique – Première partie

    OpenAIRE

    Sébald, Bruno

    2010-01-01

    The expression "disque à saphir" (sapphire disc) used in this article is a generic term for flat discs that were originally played with a spherical sapphire stylus. This technique relies on a vertical-cut recording process and is thus distinct from the needle-cut disc, whose playback stylus differs and whose grooves are cut laterally. It is also physically distinguished by the pearled texture covering the surface of the disc. Historically, the …

  13. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    Science.gov (United States)

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  14. Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.

    Science.gov (United States)

    Gray, Peter J.

    This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…

  15. Determination of real machine-tool settings and minimization of real surface deviation by computerized inspection

    Science.gov (United States)

    Litvin, Faydor L.; Kuan, Chihping; Zhang, YI

    1991-01-01

    A numerical method is developed for the minimization of deviations of real tooth surfaces from the theoretical ones. The deviations are caused by errors of manufacturing, errors in the installation of machine-tool settings, and distortion of surfaces by heat treatment. The deviations are determined by coordinate measurements of gear tooth surfaces. The minimization of deviations is based on the proper correction of the initially applied machine-tool settings. The accomplished research project covers the following topics: (1) description of the principle of coordinate measurements of gear tooth surfaces; (2) derivation of theoretical tooth surfaces (with examples of surfaces of hypoid gears and references for spiral bevel gears); (3) determination of the reference point and the grid; (4) determination of the deviations of real tooth surfaces at the points of the grid; and (5) determination of the required corrections of machine-tool settings for minimization of deviations. The procedure for minimization of deviations is based on the numerical solution of an overdetermined system of n linear equations in m unknowns (m << n), where n is the number of points of measurements and m is the number of parameters of applied machine-tool settings to be corrected. The developed approach is illustrated with numerical examples.
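
    The final step can be sketched as a least-squares solve of the overdetermined system: given a sensitivity matrix J (n measured deviations by m settings), the setting corrections delta minimize the residual of J·delta = -deviations. The sensitivity matrix and true corrections below are synthetic, standing in for values derived from the gear-geometry model:

```python
import numpy as np

rng = np.random.default_rng(1)

n, m = 45, 4      # n measured grid points, m correctable machine-tool settings
# Sensitivity of the deviation at each grid point to each setting (synthetic).
J = rng.normal(size=(n, m))
true_correction = np.array([0.02, -0.01, 0.005, 0.03])
deviations = J @ (-true_correction) + rng.normal(scale=1e-4, size=n)

# Overdetermined system J @ delta = -deviations, solved in the
# least-squares sense (m << n).
delta, *_ = np.linalg.lstsq(J, -deviations, rcond=None)
print("estimated corrections:", delta)

# Applying the corrections should shrink the surface deviations.
remaining = deviations + J @ delta
print("rms deviation before:", np.sqrt(np.mean(deviations**2)),
      "after:", np.sqrt(np.mean(remaining**2)))
```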

  16. Nurse Practitioner Perceptions of a Diabetes Risk Assessment Tool in the Retail Clinic Setting.

    Science.gov (United States)

    Marjama, Kristen L; Oliver, JoAnn S; Hayes, Jennifer

    2016-10-01

    IN BRIEF This article describes a study to gain insight into the utility and perceived feasibility of the American Diabetes Association's Diabetes Risk Test (DRT) implemented by nurse practitioners (NPs) in the retail clinic setting. The DRT is intended for those without a known risk for diabetes. Researchers invited 1,097 NPs working in the retail clinics of a nationwide company to participate voluntarily in an online questionnaire. Of the 248 NPs who sent in complete responses, 114 (46%) indicated that they used the DRT in the clinic. Overall mean responses from these NPs indicated that they perceive the DRT as a feasible tool in the retail clinic setting. Use of the DRT or similar risk assessment tools in the retail clinic setting can aid in the identification of people at risk for type 2 diabetes.

  17. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) GEM Manual

    Energy Technology Data Exchange (ETDEWEB)

    C. L. Smith; J. Schroeder; S. T. Beck

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer and tester. A complementary program called GEM uses the SAPHIRE analysis engine and relational database. GEM has been designed to simplify the use of existing PRA analyses for activities such as the NRC’s Accident Sequence Precursor program. In this report, the theoretical framework behind GEM-type calculations is discussed, in addition to guidance and examples for performing evaluations when using the GEM software. As part of this analysis framework, the two types of GEM analysis are outlined: initiating event assessments (where an initiator occurs) and condition assessments (where a component is failed for some length of time).

  18. User’s guide for the Delaware River Basin Streamflow Estimator Tool (DRB-SET)

    Science.gov (United States)

    Stuckey, Marla H.; Ulrich, James E.

    2016-06-09

    Introduction: The Delaware River Basin Streamflow Estimator Tool (DRB-SET) is a tool for the simulation of streamflow at a daily time step for an ungaged stream location in the Delaware River Basin. DRB-SET was developed by the U.S. Geological Survey (USGS) and funded through WaterSMART as part of the National Water Census, a USGS research program on national water availability and use that develops new water accounting tools and assesses water availability at the regional and national scales. DRB-SET relates exceedance probabilities at a gaged location to those at an ungaged stream location. Once the ungaged stream location has been identified by the user, an appropriate streamgage is automatically selected in DRB-SET using streamflow correlation (the map correlation method). Alternatively, the user can manually select a different streamgage or use the closest streamgage. A report file is generated documenting the reference streamgage and ungaged stream location information, basin characteristics, any warnings, baseline (minimally altered) and altered (affected by regulation, diversion, mining, or other anthropogenic activities) daily mean streamflow, and the mean and median streamflow. The estimated daily flows for the ungaged stream location can be easily exported as a text file that can be used as input to a statistical software package to determine additional streamflow statistics, such as flow duration exceedance or streamflow frequency statistics.
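
    The exceedance-probability transfer can be sketched as follows: map each daily flow at the gage to its exceedance probability on the gaged flow-duration curve, then read the flow with the same exceedance probability off the ungaged curve. The flow data are synthetic and the ungaged curve is an invented stand-in for the tool's regional estimate:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily mean flows at the reference streamgage (log-normal).
gaged_flows = rng.lognormal(mean=3.0, sigma=1.0, size=365)

# Flow-duration curves: flow exceeded with probability p, at the gage and
# (stand-in for a regional-regression estimate) at the ungaged location.
probs = np.linspace(0.01, 0.99, 99)
gaged_fdc = np.quantile(gaged_flows, 1.0 - probs)
ungaged_fdc = 0.4 * gaged_fdc

def transfer(daily_flows, gaged_fdc, ungaged_fdc, probs):
    """Flow -> exceedance probability at the gage -> flow at the ungaged
    site with the same exceedance probability."""
    # np.interp needs ascending x; the FDCs are descending in probs.
    p = np.interp(daily_flows, gaged_fdc[::-1], probs[::-1])
    return np.interp(p, probs, ungaged_fdc)

est = transfer(gaged_flows, gaged_fdc, ungaged_fdc, probs)
print("gaged mean flow:", round(gaged_flows.mean(), 1),
      "estimated ungaged mean flow:", round(est.mean(), 1))
```

    Flows beyond the 1% and 99% exceedance points are clamped to the ends of the curve; a production tool would extend the curve tails rather than clamp.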

  19. Comparison of N2O5 mixing ratios during NO3Comp 2007 in SAPHIR

    Science.gov (United States)

    Fuchs, H.; Simpson, W. R.; Apodaca, R. L.; Brauers, T.; Cohen, R. C.; Crowley, J. N.; Dorn, H.-P.; Dubé, W. P.; Fry, J. L.; Häseler, R.; Kajii, Y.; Kiendler-Scharr, A.; Labazan, I.; Matsumoto, J.; Mentel, T. F.; Nakashima, Y.; Rohrer, F.; Rollins, A. W.; Schuster, G.; Tillmann, R.; Wahner, A.; Wooldridge, P. J.; Brown, S. S.

    2012-11-01

    N2O5 detection in the atmosphere has been accomplished using techniques which have been developed during the last decade. Most techniques use a heated inlet to thermally decompose N2O5 to NO3, which can be detected by either cavity-based absorption at 662 nm or by laser-induced fluorescence. In summer 2007, a large set of instruments capable of measuring NO3 mixing ratios were simultaneously deployed in the atmosphere simulation chamber SAPHIR in Jülich, Germany. Some of these instruments measured N2O5 mixing ratios either simultaneously or alternatively. Experiments focused on the investigation of potential interferences from, e.g., water vapour or aerosol and on the investigation of the oxidation of biogenic volatile organic compounds by NO3. The comparison of N2O5 mixing ratios shows an excellent agreement between measurements of instruments applying different techniques (3 cavity ring-down (CRDS) instruments, 2 laser-induced fluorescence (LIF) instruments). Datasets are highly correlated as indicated by the square of the linear correlation coefficients, R2, whose values were larger than 0.96 for the entire datasets. N2O5 mixing ratios agree well within the combined accuracy of measurements. Slopes of the linear regression range between 0.87 and 1.26 and intercepts are negligible. The most critical aspect of N2O5 measurements by cavity ring-down instruments is the determination of the inlet and filter transmission efficiency. Measurements here show that the N2O5 inlet transmission efficiency can decrease in the presence of high aerosol loads, and that frequent filter/inlet changing is necessary to quantitatively sample N2O5 in some environments. The analysis of data also demonstrates that a general correction for degrading filter transmission is not applicable for all conditions encountered during this campaign. Besides the effect of a gradual degradation of the inlet transmission efficiency with aerosol exposure, no other interference for N2O5

  20. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2010-01-01

    NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytical methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.

  1. Intercomparison of measurements of NO2 concentrations in the atmosphere simulation chamber SAPHIR during the NO3Comp campaign

    Directory of Open Access Journals (Sweden)

    R. M. Varma

    2009-10-01

    NO2 concentrations were measured by various instruments during the NO3Comp campaign at the atmosphere simulation chamber SAPHIR at Forschungszentrum Jülich, Germany, in June 2007. Analytical methods included photolytic conversion with chemiluminescence (PC-CLD), broadband cavity ring-down spectroscopy (BBCRDS), pulsed cavity ring-down spectroscopy (CRDS), incoherent broadband cavity-enhanced absorption spectroscopy (IBBCEAS), and laser-induced fluorescence (LIF). All broadband absorption spectrometers were optimized for the detection of the main target species of the campaign, NO3, but were also capable of detecting NO2 simultaneously with reduced sensitivity. NO2 mixing ratios in the chamber were within a range characteristic of polluted, urban conditions, with a maximum mixing ratio of approximately 75 ppbv. The overall agreement between measurements of all instruments was excellent. Linear fits of the combined data sets resulted in slopes that differ from unity only within the stated uncertainty of each instrument. Possible interferences from species such as water vapor and ozone were negligible under the experimental conditions.

  2. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    Science.gov (United States)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the "MiniWall" to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP Server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.

  4. Effectiveness of an Adapted SBAR Communication Tool for a Rehabilitation Setting.

    Science.gov (United States)

    Velji, Karima; Baker, G Ross; Fancott, Carol; Andreoli, Angie; Boaro, Nancy; Tardif, Gaétan; Aimone, Elaine; Sinclair, Lynne

    2008-01-01

    Effective communication and teamwork have been identified in the literature as key enablers of patient safety. The SBAR (Situation-Background-Assessment-Recommendation) process has proven to be an effective communication tool in acute care settings to structure high-urgency communications, particularly between physicians and nurses; however, little is known of its effectiveness in other settings. This study evaluated the effectiveness of an adapted SBAR tool for both urgent and non-urgent situations within a rehabilitation setting. In phase 1 of this study, clinical staff, patient and family input was gathered in a focus-group format to help guide, validate and refine adaptations to the SBAR tool. In phase 2, the adapted SBAR was implemented in one interprofessional team; clinical and support staff participated in educational workshops with experiential learning to enhance their proficiency in using the SBAR process. Key champions reinforced its use within the team. In phase 3, evaluation of the effectiveness of the adapted SBAR tool focused on three main areas: staff perceptions of team communication and patient safety culture (as measured by the Agency for Healthcare Research and Quality Hospital Survey on Patient Safety Culture), patient satisfaction (as determined using the Client Perspectives on Rehabilitation Services questionnaire) and safety reporting (including incident and near-miss reporting). Findings from this study suggest that staff found the use of the adapted SBAR tool helpful in both individual and team communications, which ultimately affected perceived changes in the safety culture of the study team. There was a positive but not significant impact on patient satisfaction, likely due to a ceiling effect. Improvements were also seen in safety reporting of incidents and near misses across the organization and within the study team.

  5. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    LENUS (Irish Health Repository)

    Hennerby, Cathy

    2012-02-01

    AIM: This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. BACKGROUND: The increased number of registered general agency nurses working in an acute children's hospital raised concerns about their competence in working with children. These concerns were initially raised via informal complaints about 'near misses'…

  6. International Distribution as Communication Tool. What Builds Experience and Value Creation in Luxury Retail Setting?

    OpenAIRE

    Tisovski, Marija

    2009-01-01

    The thesis argues that distribution formats can be significant strategic communication and differentiation tools for a luxury brand, and that the intangible determinants within the retail space can provide a balancing link between a company trying to manage its brand expression and consumers searching for meaningful experiences. The dissertation uses a luxury retail setting, the highest level in the distribution hierarchy, to analyze these relations. This ensures a level of diversification from mass retail ap...

  7. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: With the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  8. A Python tool to set up relative free energy calculations in GROMACS.

    Science.gov (United States)

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
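
    alchemical-setup.py generates the actual topology, coordinate, and parameter files; as a rough illustration of the per-window input such a calculation needs, the sketch below writes a minimal GROMACS .mdp file for each state of an assumed lambda schedule. The option names are standard GROMACS free-energy parameters, but the schedule and settings are illustrative, not what the tool itself emits:

```python
from pathlib import Path

# Illustrative lambda schedule for a relative transformation.
fep_lambdas = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]

TEMPLATE = """\
integrator        = sd
nsteps            = 500000
dt                = 0.002
; free-energy section
free-energy       = yes
init-lambda-state = {state}
fep-lambdas       = {lambdas}
sc-alpha          = 0.5
nstdhdl           = 100
"""

outdir = Path("fe_windows")
outdir.mkdir(exist_ok=True)
for state in range(len(fep_lambdas)):
    mdp = TEMPLATE.format(state=state,
                          lambdas=" ".join(f"{l:.2f}" for l in fep_lambdas))
    (outdir / f"lambda_{state}.mdp").write_text(mdp)
print(f"wrote {len(fep_lambdas)} .mdp files to {outdir}/")
```

    Each window is then run separately (grompp/mdrun), and the dH/dlambda output collected across windows feeds the free energy estimator.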

  9. ErmineJ: Tool for functional analysis of gene expression data sets

    Directory of Open Access Journals (Sweden)

    Braynen William

    2005-11-01

    Background: It is common for the results of a microarray study to be analyzed in the context of biologically motivated groups of genes such as pathways or Gene Ontology categories. The most common method for such analysis uses the hypergeometric distribution (or a related technique) to look for "over-representation" of groups among genes selected as being differentially expressed or otherwise of interest based on a gene-by-gene analysis. However, this method suffers from some limitations, and biologist-friendly tools that implement alternatives have not been reported. Results: We introduce ErmineJ, a multiplatform, user-friendly, stand-alone software tool for the analysis of functionally relevant sets of genes in the context of microarray gene expression data. ErmineJ implements multiple algorithms for gene set analysis, including over-representation and resampling-based methods that focus on gene scores or correlation of gene expression profiles. In addition to a graphical user interface, ErmineJ has a command line interface and an application programming interface that can be used to automate analyses. The graphical user interface includes tools for creating and modifying gene sets, visualizing the Gene Ontology as a table or tree, and visualizing gene expression data. ErmineJ comes with a complete user manual, and is open-source software licensed under the GNU Public License. Conclusion: The availability of multiple analysis algorithms, together with a rich feature set and a simple graphical interface, should make ErmineJ a useful addition to the biologist's informatics toolbox. ErmineJ is available from http://microarray.cu.genome.org.
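
    One of the resampling-based methods mentioned above can be sketched as an empirical permutation test on gene scores: compare the mean score of a gene set against the means of many random sets of the same size. The scores and gene set below are synthetic, not ErmineJ's data or exact algorithm:

```python
import random

random.seed(0)

# Hypothetical per-gene scores (e.g. -log10 p-values from a microarray study)
gene_scores = {f"g{i}": random.expovariate(1.0) for i in range(200)}
# Boost a hypothetical gene set so it scores above background
gene_set = [f"g{i}" for i in range(10)]
for g in gene_set:
    gene_scores[g] += 2.0

def resampling_p(gene_scores, gene_set, iterations=10000):
    """Empirical p-value: how often does a random set of the same size
    score at least as high (by mean gene score) as the set of interest?"""
    genes = list(gene_scores)
    k = len(gene_set)
    observed = sum(gene_scores[g] for g in gene_set) / k
    hits = 0
    for _ in range(iterations):
        sample = random.sample(genes, k)
        if sum(gene_scores[g] for g in sample) / k >= observed:
            hits += 1
    return (hits + 1) / (iterations + 1)   # add-one to avoid p = 0

p = resampling_p(gene_scores, gene_set)
print(f"resampling p-value: {p:.4f}")
```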

  10. Prediction of falls using a risk assessment tool in the acute care setting

    Directory of Open Access Journals (Sweden)

    Ferko Nicole

    2004-01-01

    Background: The British STRATIFY tool was previously developed to predict falls in hospital. Although the tool has several strengths, certain limitations exist which may not allow generalizability to a Canadian setting. Thus, we tested the STRATIFY tool, with some modification and re-weighting of items, in Canadian hospitals. Methods: This was a prospective validation cohort study in four acute care medical units of two teaching hospitals in Hamilton, Ontario. In total, 620 patients over the age of 65 years were admitted during a 6-month period. Five patient characteristics found to be risk factors for falls in the British STRATIFY study were tested for predictive validity. The characteristics included history of falls, mental impairment, visual impairment, toileting, and dependency in transfers and mobility. Multivariate logistic regression was used to obtain optimal weights for the construction of a risk score. A receiver operating characteristic curve was generated to show sensitivities and specificities for predicting falls based on different threshold scores for considering patients at high risk. Results: Inter-rater reliability for the weighted risk score indicated very good agreement (inter-class correlation coefficient = 0.78). History of falls, mental impairment, toileting difficulties, and dependency in transfer/mobility significantly predicted fallers. In the multivariate model, mental status was a significant predictor (P …). Conclusion: Good predictive validity for identifying fallers was achieved in a Canadian setting using a simple-to-obtain risk score that can easily be incorporated into practice.
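
    The construction of a weighted risk score and its threshold-dependent sensitivity and specificity can be sketched as follows. The weights and patients below are invented for illustration; they are not the STRATIFY or re-weighted Canadian values:

```python
# Hypothetical item weights (stand-ins for logistic-regression-derived values)
weights = {"history_of_falls": 2, "mental_impairment": 3,
           "toileting_difficulty": 1, "transfer_mobility_dependency": 2}

# (patient risk-factor flags, fell during stay?)
patients = [
    ({"history_of_falls": 1, "mental_impairment": 1,
      "toileting_difficulty": 0, "transfer_mobility_dependency": 1}, True),
    ({"history_of_falls": 0, "mental_impairment": 0,
      "toileting_difficulty": 0, "transfer_mobility_dependency": 0}, False),
    ({"history_of_falls": 1, "mental_impairment": 0,
      "toileting_difficulty": 1, "transfer_mobility_dependency": 0}, False),
    ({"history_of_falls": 0, "mental_impairment": 1,
      "toileting_difficulty": 1, "transfer_mobility_dependency": 1}, True),
]

def score(flags):
    return sum(weights[k] * v for k, v in flags.items())

def sens_spec(threshold):
    """Sensitivity/specificity when score >= threshold flags high risk."""
    tp = sum(score(f) >= threshold and fell for f, fell in patients)
    fn = sum(score(f) < threshold and fell for f, fell in patients)
    tn = sum(score(f) < threshold and not fell for f, fell in patients)
    fp = sum(score(f) >= threshold and not fell for f, fell in patients)
    return tp / (tp + fn), tn / (tn + fp)

# Sweeping the threshold traces out the ROC curve.
for t in range(0, 9):
    se, sp = sens_spec(t)
    print(f"threshold {t}: sensitivity {se:.2f}, specificity {sp:.2f}")
```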

  11. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE were already underway. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  12. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes, and external interfaces. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP).

  13. Instructor's Perceptions towards the Use of an Online Instructional Tool in an Academic English Setting in Kuwait

    Science.gov (United States)

    Erguvan, Deniz

    2014-01-01

    This study sets out to explore the faculty members' perceptions of a specific web-based instruction tool (Achieve3000) in a private higher education institute in Kuwait. The online tool provides highly differentiated instruction, which is initiated with a level set at the beginning of the term. The program is used in two consecutive courses as…

  14. Tools for measuring patient safety in primary care settings using the RAND/UCLA appropriateness method.

    Science.gov (United States)

    Bell, Brian G; Spencer, Rachel; Avery, Anthony J; Campbell, Stephen M

    2014-06-05

    The majority of patient contacts occur in general practice but general practice patient safety has been poorly described and under-researched to date compared to hospital settings. Our objective was to produce a set of patient safety tools and indicators that can be used in general practices in any healthcare setting and develop a 'toolkit' of feasible patient safety measures for general practices in England. A RAND/UCLA Appropriateness Method exercise was conducted with a panel of international experts in general practice patient safety. Statements were developed from an extensive systematic literature review of patient safety in general practice. We used standard RAND/UCLA Appropriateness Method rating methods to identify necessary items for assessing patient safety in general practice, framed in terms of the Structure-Process-Outcome taxonomy. Items were included in the toolkit if they received an overall panel median score of ≥ 7 with agreement (no more than two panel members rating the statement outside a 3-point distribution around the median). Of 205 identified statements, the panel rated 101 as necessary for assessing the safety of general practices. Of these 101 statements, 73 covered structures or organisational issues, 22 addressed processes and 6 focused on outcomes. We developed and tested tools that can lead to interventions to improve safety outcomes in general practice. This paper reports the first attempt to systematically develop a patient safety toolkit for general practice, which has the potential to improve safety, cost effectiveness and patient experience, in any healthcare system.
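
    The RAND/UCLA inclusion rule stated above (overall panel median ≥ 7, with agreement defined as no more than two panel members rating outside a 3-point distribution around the median) can be sketched in a few lines. Reading the 3-point distribution as the band [median − 1, median + 1] is this sketch's assumption.

    ```python
    from statistics import median

    def panel_includes(ratings, cutoff=7, max_outside=2, halfwidth=1):
        # Include the statement if the median rating reaches the cutoff AND
        # no more than `max_outside` panellists rate outside the 3-point
        # band (median - halfwidth .. median + halfwidth).
        m = median(ratings)
        outside = sum(1 for r in ratings if abs(r - m) > halfwidth)
        return m >= cutoff and outside <= max_outside
    ```

    Applying such a function to each of the 205 candidate statements would reproduce the kind of toolkit selection the panel performed.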

  15. A new plant chamber facility, PLUS, coupled to the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Hohaus, T.; Kuhn, U.; Andres, S.; Kaminski, M.; Rohrer, F.; Tillmann, R.; Wahner, A.; Wegener, R.; Yu, Z.; Kiendler-Scharr, A.

    2016-03-01

    A new PLant chamber Unit for Simulation (PLUS) for use with the atmosphere simulation chamber SAPHIR (Simulation of Atmospheric PHotochemistry In a large Reaction Chamber) has been built and characterized at the Forschungszentrum Jülich GmbH, Germany. The PLUS chamber is an environmentally controlled flow-through plant chamber. Inside PLUS the natural blend of biogenic emissions of trees is mixed with synthetic air and transferred to the SAPHIR chamber, where the atmospheric chemistry and the impact of biogenic volatile organic compounds (BVOCs) can be studied in detail. In PLUS all important environmental parameters (e.g., temperature, photosynthetically active radiation (PAR), soil relative humidity (RH)) are well controlled. The gas exchange volume of 9.32 m3, which encloses the stem and the leaves of the plants, is constructed such that gases are exposed only to fluorinated ethylene propylene (FEP) Teflon film and other Teflon surfaces to minimize any potential losses of BVOCs in the chamber. Solar radiation is simulated using 15 light-emitting diode (LED) panels, which have an emission strength up to 800 µmol m-2 s-1. Results of the initial characterization experiments are presented in detail. Background concentrations, mixing inside the gas exchange volume, and transfer rate of volatile organic compounds (VOCs) through PLUS under different humidity conditions are explored. Typical plant characteristics such as light- and temperature-dependent BVOC emissions are studied using six Quercus ilex trees and compared to previous studies. Results of an initial ozonolysis experiment of BVOC emissions from Quercus ilex at typical atmospheric concentrations inside SAPHIR are presented to demonstrate a typical experimental setup and the utility of the newly added plant chamber.

  16. Priority-setting tools for rheumatology disease referrals: a review of the literature.

    Science.gov (United States)

    De Coster, Carolyn; Fitzgerald, Avril; Cepoiu, Monica

    2008-11-01

    As part of a larger body of work to develop a rheumatology priority referral score, a literature review was conducted. The objective of the literature review was to identify preexisting priority-setting, triage, and referral tools/scales developed to guide referrals from primary care to specialist care/consultation usually provided by a rheumatologist. Using a combination of database, citation, Internet, and hand-searching, 20 papers were identified that related to referral prioritization in three areas: rheumatoid arthritis (RA; 5), musculoskeletal (MSK) diseases other than RA (3), and MSK diseases in general (12). No single set of priority-setting criteria was identified for rheumatologic disorders across the spectrum of patients who may be referred from primary care providers (PCPs) to rheumatologists. There appears to be more congruence on conditions at either end of the urgency spectrum with conditions such as suspected cranial arteritis or systemic vasculitis deemed to be emergency referrals and fibromyalgia and other soft-tissue syndromes deemed to be more routine referrals. Between these two extremes, there is a divergence of opinion about urgency and few papers on the issue. The exception to this is referral for early RA for which several criteria have been established. Despite the inherent complexities in developing a tool to prioritize patients referred by PCPs to rheumatologists, there are compelling reasons to proceed. With the aging of the population, the number of patients being referred to rheumatologists is expected to increase. With pharmaceutical advances, there are demonstrable benefits in early referral for some conditions. These trends have led to increased pressure on scarce rheumatological human resources. A tool to prioritize referrals is a critical component of improving access and the referral process.

  17. Towards sets of hazardous waste indicators. Essential tools for modern industrial management.

    Science.gov (United States)

    Peterson, Peter J; Granados, Asa

    2002-01-01

    Decision-makers require useful tools, such as indicators, to help them make environmentally sound decisions leading to effective management of hazardous wastes. Four hazardous waste indicators are being tested for such a purpose by several countries within the Sustainable Development Indicator Programme of the United Nations Commission for Sustainable Development. However, these indicators only address the 'down-stream' end-of-pipe industrial situation. More creative thinking is clearly needed to develop a wider range of indicators that not only reflects all aspects of industrial production that generates hazardous waste but considers socio-economic implications of the waste as well. Sets of useful and innovative indicators are proposed that could be applied to the emerging paradigm shift away from conventional end-of-pipe management actions and towards preventive strategies that are being increasingly adopted by industry often in association with local and national governments. A methodological and conceptual framework for the development of a core-set of hazardous waste indicators has been developed. Some of the indicator sets outlined quantify preventive waste management strategies (including indicators for cleaner production, hazardous waste reduction/minimization and life cycle analysis), whilst other sets address proactive strategies (including changes in production and consumption patterns, eco-efficiency, eco-intensity and resource productivity). Indicators for quantifying transport of hazardous wastes are also described. It was concluded that a number of the indicators proposed could now be usefully implemented as management tools using existing industrial and economic data. 
As cleaner production technologies and waste minimization approaches are more widely deployed, and industry integrates environmental concerns at all levels of decision-making, it is expected that the necessary data for construction of the remaining indicators will soon become available.

  18. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    Science.gov (United States)

    Hennerby, Cathy; Joyce, Pauline

    2011-03-01

    This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. The increased number of registered general agency nurses working in an acute children's hospital raised concerns about their competency in working with children. These concerns were initially raised via informal complaints about 'near misses', parental dissatisfaction, perceived competency weaknesses, and the rising cost associated with their use. Young's (2009) nine-stage change framework (Journal of Organisational Change, 22, 524-548) was used to guide the implementation of the competency assessment tool within a paediatric acute care setting. The ongoing success of the initiative, from a nurse manager's perspective, relies on structured communication with the agency provider before employing competent agency nurses. Sustainability of the change will depend on nurse managers' persistence in attending to the concerns of those resisting the change while simultaneously supporting those championing it. These key communication and supporting roles highlight the pivotal role held by nurse managers, as gatekeepers, in safeguarding children while in hospital. Leadership qualities of nurse managers will also be challenged in continuing to manage and drive the change where resistance might prevail. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  19. COLLABORATIVE RESEARCH: Parallel Analysis Tools and New Visualization Techniques for Ultra-Large Climate Data Set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [Co-PI]; Haley, Mary

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through December 1st, 2014. Two previous reports covered the period from summer 2010 through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  20. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through December 1st, 2014. Two previous reports covered the period from summer 2010 through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  1. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of SAPHIRE configuration management is to assess the activities that result in the process of identifying and defining the baselines associated with the SAPHIRE software product; controlling the changes to baselines and the release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  2. Independent Verification and Validation Of SAPHIRE 8 Software Configuration Management Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-02-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of SAPHIRE configuration management is to assess the activities that result in the process of identifying and defining the baselines associated with the SAPHIRE software product; controlling the changes to baselines and the release of baselines throughout the life cycle; recording and reporting the status of baselines and the proposed and actual changes to the baselines; and verifying the correctness and completeness of baselines. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  3. Independent Verification and Validation Of SAPHIRE 8 Risk Management Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-11-01

    This report provides an evaluation of risk management for the SAPHIRE project. Risk management is intended to establish a methodology for conducting risk management planning, identification, analysis, response, and monitoring and control activities associated with the SAPHIRE project work, and to meet the contractual commitments prepared by the sponsor, the Nuclear Regulatory Commission.

  4. Isotope effect in the formation of H2 from H2CO studied at the atmospheric simulation chamber SAPHIR

    NARCIS (Netherlands)

    Röckmann, T.; Walter, S.; Bohn, B.; Wegener, R.; Spahn, H.; Brauers, T.; Tillmann, R.; Schlosser, E.; Koppmann, R.; Rohrer, F.

    2010-01-01

    Formaldehyde of known, near-natural isotopic composition was photolyzed in the SAPHIR atmosphere simulation chamber under ambient conditions. The isotopic composition of the product H2 was used to determine the isotope effects in formaldehyde photolysis. The experiments are sensitive to the molecula

  5. Secondary organic aerosols - formation and ageing studies in the SAPHIR chamber

    Science.gov (United States)

    Spindler, Christian; Müller, Lars; Trimborn, Achim; Mentel, Thomas; Hoffmann, Thorsten

    2010-05-01

    Secondary organic aerosol (SOA) formation from oxidation products of biogenic volatile organic compounds (BVOC) constitutes an important coupling between vegetation, atmospheric chemistry, and climate change. Such secondary organic aerosol components play an important role in particle formation in Boreal regions (Laaksonen et al., 2008), where biogenic secondary organic aerosols contribute to an overall negative radiative forcing, thus a negative feedback between vegetation and climate warming (Spracklen et al., 2008). Within the EUCAARI project we investigated SOA formation from mixtures of monoterpenes (and sesquiterpenes) as typically emitted from Boreal tree species in Southern Finland. The experiments were performed in the large photochemical reactor SAPHIR in Jülich at natural light and oxidant levels. Oxidation of the BVOC mixtures and SOA formation was induced by OH radicals and O3. The SOA was formed on the first day and then aged for another day. The resulting SOA was characterized by HR-ToF-AMS, APCI-MS, and filter samples with subsequent H-NMR, GC-MS and HPLC-MS analysis. The chemical evolution of the SOA is characterized by a fast increase of the O/C ratio during the formation process on the first day, a stable O/C ratio during night, and a distinctive increase of the O/C ratio on the second day. The increase of the O/C ratio on the second day is highly correlated to the OH dose and is accompanied by condensational growth of the particles. We will present simultaneous factor analysis of AMS time series (PMF; Ulbrich et al., 2009) and direct measurements of individual chemical species. We found that four factors were needed to represent the time evolution of the SOA composition (in the mass spectra) if oxidation by OH plays a major role. Corresponding to these factors we observed individual, representative molecules with very similar time behaviour. The correlation between tracers and AMS factors is astonishingly good as the molecular tracers

  6. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    J. Thieser

    2013-01-01

    Full Text Available The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, newly sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (25th/75th percentiles: 0.949/0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (1.2 ± 5.3 pptv) and the average slope of the regression lines was close to unity (1.02; min: 0.72, max: 1.36). The deviation of individual regression slopes from unity was always within the combined accuracies of each instrument pair. The very good correspondence between the NO3 measurements

  7. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    H.-P. Dorn

    2013-05-01

    Full Text Available The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, newly sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: −1.1/2.6 pptv; min/max: −14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined

  8. Intercomparison of NO3 radical detection instruments in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Dorn, H.-P.; Apodaca, R. L.; Ball, S. M.; Brauers, T.; Brown, S. S.; Crowley, J. N.; Dubé, W. P.; Fuchs, H.; Häseler, R.; Heitmann, U.; Jones, R. L.; Kiendler-Scharr, A.; Labazan, I.; Langridge, J. M.; Meinen, J.; Mentel, T. F.; Platt, U.; Pöhler, D.; Rohrer, F.; Ruth, A. A.; Schlosser, E.; Schuster, G.; Shillings, A. J. L.; Simpson, W. R.; Thieser, J.; Tillmann, R.; Varma, R.; Venables, D. S.; Wahner, A.

    2013-05-01

    The detection of atmospheric NO3 radicals is still challenging owing to its low mixing ratios (≈ 1 to 300 pptv) in the troposphere. While long-path differential optical absorption spectroscopy (DOAS) has been a well-established NO3 detection approach for over 25 yr, newly sensitive techniques have been developed in the past decade. This publication outlines the results of the first comprehensive intercomparison of seven instruments developed for the spectroscopic detection of tropospheric NO3. Four instruments were based on cavity ring-down spectroscopy (CRDS), two utilised open-path cavity-enhanced absorption spectroscopy (CEAS), and one applied "classical" long-path DOAS. The intercomparison campaign "NO3Comp" was held at the atmosphere simulation chamber SAPHIR in Jülich (Germany) in June 2007. Twelve experiments were performed in the well-mixed chamber for variable concentrations of NO3, N2O5, NO2, hydrocarbons, and water vapour, in the absence and in the presence of inorganic or organic aerosol. The overall precision of the cavity instruments varied between 0.5 and 5 pptv for integration times of 1 s to 5 min; that of the DOAS instrument was 9 pptv for an acquisition time of 1 min. The NO3 data of all instruments correlated excellently with the NOAA-CRDS instrument, which was selected as the common reference because of its superb sensitivity, high time resolution, and most comprehensive data coverage. The median of the coefficient of determination (r2) over all experiments of the campaign (60 correlations) is r2 = 0.981 (quartile 1 (Q1): 0.949; quartile 3 (Q3): 0.994; min/max: 0.540/0.999). The linear regression analysis of the campaign data set yielded very small intercepts (median: 1.1 pptv; Q1/Q3: -1.1/2.6 pptv; min/max: -14.1/28.0 pptv), and the slopes of the regression lines were close to unity (median: 1.01; Q1/Q3: 0.92/1.10; min/max: 0.72/1.36). The deviation of individual regression slopes from unity was always within the combined accuracies of each
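
    The per-experiment statistics quoted in these records (regression slope, intercept, and coefficient of determination r2 for each instrument against the NOAA-CRDS reference) come from ordinary least-squares fits. A minimal sketch of that computation, not the campaign's actual analysis code:

    ```python
    from statistics import mean

    def linfit(x, y):
        # Ordinary least-squares fit y = slope * x + intercept, plus the
        # coefficient of determination r^2 for the fitted line.
        mx, my = mean(x), mean(y)
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        intercept = my - slope * mx
        ss_res = sum((yi - slope * xi - intercept) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - my) ** 2 for yi in y)
        return slope, intercept, 1.0 - ss_res / ss_tot
    ```

    Repeating the fit for every instrument pair in every experiment and taking the median, quartiles, and extremes of the resulting slopes and intercepts yields summary statistics of the form reported above.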

  9. ESPbase: a Microsoft Access tool for selecting symbol and icon sets for usability.

    Science.gov (United States)

    de Bruijn, O; McDougall, S; Curry, M B

    1999-08-01

    The ESPbase provides a tool for storing symbols and icons along with information about their characteristics. Information about a wide range of symbol characteristics is included on the database to facilitate the selection of symbol sets for research and design. The database includes information about the graphical characteristics and functions of symbols. It also includes ratings of symbol concreteness, complexity, familiarity, and meaningfulness. Symbols and icons can be accessed on the basis of each of these characteristics or any combination of characteristics. This makes it easier to select symbols on the basis of usability and design requirements. It also means that symbols can be easily selected for research while controlling their characteristics on a number of dimensions.

  10. Google Sets, Google Suggest, and Google Search History: Three More Tools for the Reference Librarian's Bag of Tricks

    OpenAIRE

    Cirasella, Jill

    2008-01-01

    This article examines the features, quirks, and uses of Google Sets, Google Suggest, and Google Search History and argues that these three lesser-known Google tools warrant inclusion in the resourceful reference librarian’s bag of tricks.

  11. Linked Scatter Plots, A Powerful Exploration Tool For Very Large Sets of Spectra

    Science.gov (United States)

    Carbon, Duane Francis; Henze, Christopher

    2015-08-01

    We present a new tool, based on linked scatter plots, that is designed to efficiently explore very large spectrum data sets such as the SDSS, APOGEE, LAMOST, GAIA, and RAVE data sets. The tool works in two stages: the first uses batch processing and the second runs interactively. In the batch stage, spectra are processed through our data pipeline which computes the depths relative to the local continuum at preselected feature wavelengths. These depths, and any additional available variables such as local S/N level, magnitudes, colors, positions, and radial velocities, are the basic measured quantities used in the interactive stage. The interactive stage employs the NASA hyperwall, a configuration of 128 workstation displays (8x16 array) controlled by a parallelized software suite running on NASA's Pleiades supercomputer. Each hyperwall panel is used to display a fully linked 2-D scatter plot showing the depth of feature A vs the depth of feature B for all of the spectra. A and B change from panel to panel. The relationships between the various (A,B) strengths and any distinctive clustering, as well as unique outlier groupings, are visually apparent when examining and inter-comparing the different panels on the hyperwall. In addition, the data links between the scatter plots allow the user to apply a logical algebra to the measurements. By graphically selecting the objects in any interesting region of any 2-D plot on the hyperwall, the tool immediately and clearly shows how the selected objects are distributed in all the other 2-D plots. The selection process may be repeated multiple times and, at each step, the selections can represent a sequence of logical constraints on the measurements, revealing those objects which satisfy all the constraints thus far. The spectra of the selected objects may be examined at any time on a connected workstation display. Using over 945,000,000 depth measurements from 569,738 SDSS DR10 stellar spectra, we illustrate how to quickly
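
    The selection algebra described above — each graphical selection adds a constraint that is ANDed with the previous ones, and the surviving objects light up in every linked panel — can be sketched with boolean masks. The feature-depth values and predicates below are illustrative assumptions, not data from the tool:

    ```python
    # Hypothetical (feature A, feature B, feature C) depths per spectrum.
    depths = [
        (0.10, 0.80, 0.30),   # spectrum 0
        (0.70, 0.20, 0.90),   # spectrum 1
        (0.60, 0.70, 0.10),   # spectrum 2
        (0.20, 0.10, 0.40),   # spectrum 3
    ]

    def refine(mask, predicate):
        # AND a new graphical selection onto the running constraint set.
        return [m and predicate(d) for m, d in zip(mask, depths)]

    mask = [True] * len(depths)                 # start with every spectrum shown
    mask = refine(mask, lambda d: d[0] > 0.5)   # box-select strong feature A
    mask = refine(mask, lambda d: d[1] < 0.5)   # then weak feature B elsewhere
    selected = [i for i, m in enumerate(mask) if m]
    ```

    In the hyperwall setting the same mask would be broadcast to all 128 panels, so each refinement is immediately visible everywhere.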

  12. The clinical conscientiousness index: a valid tool for exploring professionalism in the clinical undergraduate setting.

    Science.gov (United States)

    Kelly, Martina; O'Flynn, Siun; McLachlan, John; Sawdon, Marina A

    2012-09-01

    The need to develop effective tools to measure professionalism continues to challenge medical educators; thus, as a follow-up to a recent examination of the "Conscientiousness Index" (CI, a novel measure of one facet of professionalism) in one setting with preclinical medical students, the authors aimed to investigate the validity of the CI as a proxy measure of professionalism in a different context and in the clinical phase of undergraduate medical education. In academic year 2009-2010, the authors collected data similar to those collected for the original preclinical study. In an effort to create a Clinical Conscientiousness Index (CCI) score, they collected the following information on 124 third-year medical students completing their clinical rotations: attendance, timeliness of assessment submissions, and completion of rotation evaluations. Then, they compared the resultant CCI scores with faculty views on professionalism and with formal assessments of students' professionalism (i.e., their portfolios and objective structured clinical examinations [OSCEs]). The authors demonstrate significant correlations between CCI scores and faculty views on professionalism (rS = 0.3; P = .001), and between CCI scores and OSCE score (rS = 0.237; P = .008), but not between CCI scores and portfolio assessment (rS = 0.084; P = .354). The authors also present relationships between CCI scores and demographics. The CCI is a practical, valid proxy measure of professionalism, achieving good correlation with faculty views on professionalism and clinical competency examinations, but not portfolio assessment, in one clinical undergraduate setting.
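
    The rS values reported here are Spearman rank correlations. A minimal sketch of how such a coefficient is computed (Pearson correlation applied to the ranks, ignoring tie correction for simplicity):

    ```python
    def spearman(x, y):
        # Spearman rank correlation: Pearson correlation of the ranks.
        # Sketch without tie handling (tied values get arbitrary order).
        def ranks(values):
            order = sorted(range(len(values)), key=lambda i: values[i])
            r = [0] * len(values)
            for rank, i in enumerate(order, start=1):
                r[i] = rank
            return r
        rx, ry = ranks(x), ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        den = (sum((a - mx) ** 2 for a in rx) *
               sum((b - my) ** 2 for b in ry)) ** 0.5
        return num / den
    ```

    A production analysis would use a library routine with proper tie correction (e.g. an average-rank scheme), since assessment scores commonly contain ties.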

  13. Sustainability appraisal tools for soil and groundwater remediation: how is the choice of remediation alternative influenced by different sets of sustainability indicators and tool structures?

    Science.gov (United States)

    Beames, Alistair; Broekx, Steven; Lookman, Richard; Touchant, Kaat; Seuntjens, Piet

    2014-02-01

    The state-of-the-science in sustainability assessment of soil and groundwater remediation is evaluated with the application of four decision support systems (DSSs) to a large-scale brownfield revitalization case study. The DSSs were used to perform sustainability appraisals of four technically feasible remediation alternatives proposed for the site. The first stage of the review compares the scope of each tool's sustainability indicators, how these indicators are measured, and how the tools differ in terms of standardization and weighting procedures. The second stage compares the outputs from the tools and identifies the key factors behind the differences between them. The evaluation of indicator sets and tool structures explains why the tools generate differing results. Not all crucial impact areas, as identified by sustainable remediation forums, are thoroughly considered by the tools, particularly with regard to the social and economic aspects of sustainability. Variations in boundary conditions defined between technologies produce distorted environmental impact results, especially when in-situ and ex-situ technologies are compared. The review draws attention to the need for end users to be aware of which aspects of sustainability are considered, how the aspects are measured and how all aspects are ultimately balanced in the evaluation of potential remediation strategies. Existing tools can be improved by considering different technologies within the same boundary conditions and by expanding indicator sets to include indicators deemed to be relevant by remediation forums. © 2013.

  14. Independent Verification and Validation Of SAPHIRE 8 Volume 3 Users' Guide Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Volume 3 Users’ Guide is to assess the user documentation for its completeness, correctness, and consistency with respect to requirements for user interface and for any functionality that can be invoked by the user. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  15. WhichGenes: a web-based tool for gathering, building, storing and exporting gene sets with application in gene set enrichment analysis.

    Science.gov (United States)

    Glez-Peña, Daniel; Gómez-López, Gonzalo; Pisano, David G; Fdez-Riverola, Florentino

    2009-07-01

    WhichGenes is a web-based interactive gene set building tool offering a very simple interface to extract always-updated gene lists from multiple databases and unstructured biological data sources. While the user can specify new gene sets of interest by following a simple four-step wizard, the tool is able to run several queries in parallel. Every time a new set is generated, it is automatically added to the private gene-set cart and the user is notified by an e-mail containing a direct link to the new set stored on the server. WhichGenes provides functionalities to edit, delete and rename existing sets, as well as the capability of generating new ones by combining existing sets (intersection, union and difference operators). Users can export their sets, configuring the output format and selecting among multiple gene identifiers. In addition to the user-friendly environment, WhichGenes allows programmers to access its functionalities programmatically through a Representational State Transfer web service. The WhichGenes front-end is freely available at http://www.whichgenes.org/; the WhichGenes API is accessible at http://www.whichgenes.org/api/.
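    The set combination operators the abstract names (intersection, union, difference) behave like standard set algebra on gene lists. A minimal illustration (gene symbols chosen for the example, not drawn from the service):

    ```python
    # Hedged sketch of the set operators WhichGenes describes; the gene
    # symbols here are illustrative, not output from the actual service.
    apoptosis = {"TP53", "BAX", "CASP3", "BCL2"}
    cell_cycle = {"TP53", "CDK1", "CCNB1", "RB1"}

    union = apoptosis | cell_cycle         # genes in either set
    intersection = apoptosis & cell_cycle  # genes shared by both
    difference = apoptosis - cell_cycle    # apoptosis-only genes
    ```
    
    
    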

  16. Observation of the positive-strangeness pentaquark $\Theta^+$ in photoproduction with the SAPHIR detector at ELSA

    CERN Document Server

    Barth, J; Ernst, J; Glander, K H; Hannappel, J; Jöpen, N; Kalinowsky, H; Klein, F; Klempt, E; Lawall, R; Link, J; Menze, D W; Neuerburg, W; Ostrick, M; Paul, E; Van Pee, H; Schulday, I; Schwille, W J; Wiegers, B; Wieland, F W; Wisskirchen, J; Wu, C

    2003-01-01

    The positive-strangeness baryon resonance $\Theta^+$ is observed in photoproduction of the $\rm nK^+K^0_s$ final state with the SAPHIR detector at the Bonn ELectron Stretcher Accelerator ELSA. It is seen as a peak in the $\rm nK^+$ invariant mass distribution with a $4.8\sigma$ confidence level. We find a mass $\rm M_{\Theta^+} = 1540\pm 4\pm 2$ MeV and an upper limit on the width $\rm \Gamma_{\Theta^+} < 25$ MeV at 90% c.l. The photoproduction cross section for $\rm\bar K^0\Theta^+$ is on the order of 300 nb. From the absence of a signal in the $\rm pK^+$ invariant mass distribution in $\rm\gamma p\to pK^+K^-$ at the expected strength we conclude that the $\Theta^+$ must be isoscalar.

  17. SAPHIR: a physiome core model of body fluid homeostasis and blood pressure regulation.

    Science.gov (United States)

    Thomas, S Randall; Baconnier, Pierre; Fontecave, Julie; Françoise, Jean-Pierre; Guillaud, François; Hannaert, Patrick; Hernández, Alfredo; Le Rolle, Virginie; Mazière, Pierre; Tahi, Fariza; White, Ronald J

    2008-09-13

    We present the current state of the development of the SAPHIR project (a Systems Approach for PHysiological Integration of Renal, cardiac and respiratory function). The aim is to provide an open-source multi-resolution modelling environment that will permit, at a practical level, a plug-and-play construction of integrated systems models using lumped-parameter components at the organ/tissue level while also allowing focus on cellular- or molecular-level detailed sub-models embedded in the larger core model. Thus, an in silico exploration of gene-to-organ-to-organism scenarios will be possible, while keeping computation time manageable. As a first prototype implementation in this environment, we describe a core model of human physiology targeting the short- and long-term regulation of blood pressure, body fluids and homeostasis of the major solutes. In tandem with the development of the core models, the project involves database implementation and ontology development.

  18. Geo-Scape, a Granularity Depended Spatialization Tool for Visualizing Multidimensional Data Sets

    Institute of Scientific and Technical Information of China (English)

    Kontaxaki Sofia; Kokla Margarita; Kavouras Marinos

    2010-01-01

    Recently, the expertise accumulated in the field of geovisualization has found application in the visualization of abstract multidimensional data, through so-called spatialization methods. Spatialization methods aim at visualizing multidimensional data in low-dimensional representational spaces by making use of spatial metaphors and applying dimension-reduction techniques. Spatial metaphors are able to provide a metaphoric framework for the visualization of information at different levels of granularity. The present paper investigates how the issue of granularity is handled in representative examples of spatialization methods. Furthermore, it introduces the prototyping tool Geo-Scape, which provides an interactive spatialization environment for representing and exploring multidimensional data at different levels of granularity, by making use of a kernel density estimation technique and of the landscape "smoothness" metaphor. A demonstration scenario is presented to show how Geo-Scape helps to discover knowledge in a large set of data, by grouping the data into meaningful clusters on the basis of a similarity measure and organizing them at different levels of granularity.
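    A rough sketch of the kernel density estimation step that produces such a landscape surface, with the bandwidth standing in for the level of granularity (hypothetical function and parameter names, not Geo-Scape's code):

    ```python
    import numpy as np

    # Hedged sketch: a Gaussian KDE turns 2-D point positions into a smooth
    # "landscape" surface; the bandwidth h plays the role of granularity.
    def kde_grid(points, grid_x, grid_y, h):
        """Evaluate a Gaussian KDE on a regular grid; larger h = coarser landscape."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        density = np.zeros_like(gx)
        for px, py in points:
            density += np.exp(-((gx - px)**2 + (gy - py)**2) / (2 * h**2))
        return density / (len(points) * 2 * np.pi * h**2)

    pts = np.array([[0.3, 0.3], [0.32, 0.28], [0.8, 0.7]])  # toy data points
    axis = np.linspace(0, 1, 50)
    coarse = kde_grid(pts, axis, axis, h=0.3)   # one broad hill
    fine = kde_grid(pts, axis, axis, h=0.05)    # sharper, separated peaks
    ```

    Sweeping `h` from large to small corresponds to moving from a coarse overview to fine-grained clusters, which is the granularity behaviour the paper discusses.
    
    
    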

  19. A TOOL FOR STRATEGIC TARGET SETTING ON DEVELOPMENT AND IMPROVEMENT OF REMEDIATION TECHNOLOGIES

    Science.gov (United States)

    Inoue, Yasushi; Katayama, Arata

    A tool for the strategic development and improvement of remediation technologies was proposed to set target specifications, applying RNSOIL, an evaluation index of remediation technologies for contaminated soil. Under a scenario of agricultural site contamination with dieldrin and its remediation, the items to improve and the target values for bioremediation using charcoal material (charcoal bioremediation), as a developing technology, were determined. The development target was that the RNSOIL value of charcoal bioremediation fall below that of high-temperature thermal desorption, a competing technology. Sensitivity assessments of RNSOIL selected the remediation period and the incubation volume for bacterial growth and settlement in the charcoal as the properties to improve. Risk assessment and life-cycle inventory analysis were introduced to determine, as evaluating factors of RNSOIL, the human health risk due to the contaminant, the total cost of remediation, and the CO2 emissions accompanying remediation. Assessment based on RNSOIL was able to show clearly which items to improve to achieve the target, and which items have the greatest effect when improved.

  20. Hand injuries from tools in domestic and leisure settings: relative incidence and patterns of initial management.

    Science.gov (United States)

    Williams, S T B; Power, D

    2011-06-01

    A search of the UK Department of Trade and Industry's Home and Leisure Accident database found 16,003 emergency hospital attendances in 2000-2002 following accidents with tools. The hand was the site of injury in 9535 cases (60%). The tool most commonly involved was a Stanley knife, causing as many hand injuries (21%) as all power tools combined. The power tools most frequently causing hand injury were circular saws (28% of power tool injuries), hedge trimmers (21%) and electric drills (17%). Compared to injuries from manual tools, power tool hand injuries were more than twice as likely to be referred to specialists and three times more likely to be admitted to hospital. Specialist referral/admission most commonly occurred following hand injury from mowers (51% admitted/referred), routers (50%) and circular saws (48%). The rate for manual blade injuries was 14%. Missed diagnoses following manual blade injuries may stem from comparatively low rates of specialist assessment.

  1. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    Micromachining technologies are now being employed in various industries for generation of precise features on engineering components. Among these processes, micro electrical discharge machining is a 'non-contact' machining technology suitable for material removal from electrically conductive … materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of settings for generation of discharges in stable conditions of micro-EDM on the phenomenon of tool electrode wear. A stable sparking … condition during the process is achieved with varying voltage (V), capacitance (C), threshold (T), and discharge frequency (f). The tool electrode wear model has revealed that the energy of the sparks interacting with the tool surfaces controls the phenomenon through the settings of capacitance followed
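    For an RC-type pulse generator of the kind commonly used in micro-EDM, the nominal single-spark energy is E = ½CV², which is one way to see why the capacitance setting dominates the energy delivered to the tool. A hedged numeric sketch (illustrative settings and function names, not the study's experimental values):

    ```python
    # Hedged sketch of the physical quantity the abstract points to: nominal
    # discharge energy and average power for an RC-type micro-EDM generator.
    def discharge_energy(capacitance_f, voltage_v):
        """Nominal single-spark energy in joules: E = 1/2 * C * V^2."""
        return 0.5 * capacitance_f * voltage_v**2

    def mean_power(capacitance_f, voltage_v, frequency_hz):
        """Average power, assuming every discharge releases the nominal energy."""
        return discharge_energy(capacitance_f, voltage_v) * frequency_hz

    # Example settings: 100 V on a 10 nF capacitor, discharging at 50 kHz.
    e = discharge_energy(10e-9, 100.0)   # joules per spark
    p = mean_power(10e-9, 100.0, 50e3)   # watts
    ```

    Doubling the voltage quadruples the per-spark energy, while doubling the capacitance only doubles it; both settings trade off material removal rate against tool wear.
    
    
    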

  2. Automated innovative diagnostic, data management and communication tool, for improving malaria vector control in endemic settings.

    Science.gov (United States)

    Vontas, John; Mitsakakis, Konstantinos; Zengerle, Roland; Yewhalaw, Delenasaw; Sikaala, Chadwick Haadezu; Etang, Josiane; Fallani, Matteo; Carman, Bill; Müller, Pie; Chouaïbou, Mouhamadou; Coleman, Marlize; Coleman, Michael

    2016-01-01

    Malaria is a life-threatening disease that caused more than 400,000 deaths in sub-Saharan Africa in 2015. Mass prevention of the disease is best achieved by vector control, which relies heavily on the use of insecticides. Monitoring mosquito vector populations is an integral component of control programs and a prerequisite for effective interventions. Several individual methods are used for this task; however, there are obstacles to their uptake, as well as challenges in organizing, interpreting and communicating vector population data. The Horizon 2020 "DMC-MALVEC" consortium will develop a fully integrated and automated multiplex vector-diagnostic platform (LabDisk) for characterizing mosquito populations in terms of species composition, Plasmodium infections and biochemical insecticide resistance markers. The LabDisk will be interfaced with a Disease Data Management System (DDMS), custom-made data management software which will collate and manage data from routine entomological monitoring activities, providing information in a timely fashion, based on user needs and in a standardized way. A third key element will be ResistanceSim, a serious game: a modern ICT platform that uses interactive ways of communicating guidelines and exemplifying good practices of optimal use of interventions in the health sector. Use of the tool will teach operational end users the value of quality data (relevant, timely and accurate) for making informed decisions. The integrated system (LabDisk, DDMS & ResistanceSim) will be evaluated in four malaria-endemic countries (Cameroon, Ivory Coast, Ethiopia and Zambia), representative of settings with different levels of endemicity and different vector control challenges in sub-Saharan Africa, to support informed decision-making in vector control and disease management.

  3. Evaluation of a novel electronic genetic screening and clinical decision support tool in prenatal clinical settings.

    Science.gov (United States)

    Edelman, Emily A; Lin, Bruce K; Doksum, Teresa; Drohan, Brian; Edelson, Vaughn; Dolan, Siobhan M; Hughes, Kevin; O'Leary, James; Vasquez, Lisa; Copeland, Sara; Galvin, Shelley L; DeGroat, Nicole; Pardanani, Setul; Gregory Feero, W; Adams, Claire; Jones, Renee; Scott, Joan

    2014-07-01

    "The Pregnancy and Health Profile" (PHP) is a free prenatal genetic screening and clinical decision support (CDS) software tool for prenatal providers. PHP collects family health history (FHH) during intake and provides point-of-care risk assessment for providers and education for patients. This pilot study evaluated patient and provider responses to PHP and effects of using PHP in practice. PHP was implemented in four clinics. Surveys assessed provider confidence and knowledge and patient and provider satisfaction with PHP. Data on the implementation process were obtained through semi-structured interviews with administrators. Quantitative survey data were analyzed using Chi square test, Fisher's exact test, paired t tests, and multivariate logistic regression. Open-ended survey questions and interviews were analyzed using qualitative thematic analysis. Of the 83% (513/618) of patients that provided feedback, 97% felt PHP was easy to use and 98% easy to understand. Thirty percent (21/71) of participating physicians completed both pre- and post-implementation feedback surveys [13 obstetricians (OBs) and 8 family medicine physicians (FPs)]. Confidence in managing genetic risks significantly improved for OBs on 2/6 measures (p values ≤0.001) but not for FPs. Physician knowledge did not significantly change. Providers reported value in added patient engagement and reported mixed feedback about the CDS report. We identified key steps, resources, and staff support required to implement PHP in a clinical setting. To our knowledge, this study is the first to report on the integration of patient-completed, electronically captured and CDS-enabled FHH software into primary prenatal practice. PHP is acceptable to patients and providers. Key to successful implementation in the future will be customization options and interoperability with electronic health records.

  4. Is Google Trends a reliable tool for digital epidemiology? Insights from different clinical settings.

    Science.gov (United States)

    Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe

    2017-09-01

    Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, for both common diseases with lower media coverage and less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella pneumophila pneumonia", and "Ebola fever"), which recently received major focus from the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colics, epistaxis and mushroom poisoning. Only when searching for the term "mushroom" alone did the Google Trends search generate a seasonal pattern that almost overlaps with the epidemiological profile, but this was probably due mostly to searches about harvesting and cooking rather than poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella pneumophila pneumonia and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or relatively rare diseases with higher audience. Overall, Google Trends seems to be more influenced by media clamor than by true epidemiological burden. Copyright © 2017 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  5. A Short Screening Tool to Identify Victims of Child Sex Trafficking in the Health Care Setting.

    Science.gov (United States)

    Greenbaum, V Jordan; Dodd, Martha; McCracken, Courtney

    2015-11-23

    The aim of this study was to describe characteristics of commercial sexual exploitation of children/child sex trafficking (CSEC/CST) victims and to develop a screening tool to identify victims among a high-risk adolescent population. In this cross-sectional study, patients aged 12 to 18 years who presented to 1 of 3 metropolitan pediatric emergency departments or 1 child protection clinic and who were identified as victims of CSEC/CST were compared with similar-aged patients with allegations of acute sexual assault/sexual abuse (ASA) without evidence of CSEC/CST. The 2 groups were compared on variables related to medical and reproductive history, high-risk behavior, mental health symptoms, and injury history. After univariate analysis, a subset of candidate variables was subjected to multivariable logistic regression to identify an optimum set of 5 to 7 screening items. Of 108 study participants, 25 comprised the CSEC/CST group, and 83 comprised the ASA group. Average (SD) age was 15.4 (1.8) years for CSEC/CST patients and 14.8 (1.6) years for ASA patients; 100% of the CSEC/CST and 95% of the ASA patients were female. The 2 groups differed significantly on 16 variables involving reproductive history, high-risk behavior, sexually transmitted infections, and previous experience with violence. A 6-item screen was constructed, and a cutoff score of 2 positive answers had a sensitivity of 92%, specificity of 73%, positive predictive value of 51%, and negative predictive value of 97%. Adolescent CSEC/CST victims differ from ASA victims without evidence of CSEC/CST across several domains. A 6-item screen effectively identifies CSEC/CST victims in a high-risk adolescent population.
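    The cutoff logic and the reported performance measures can be sketched as follows (synthetic answer patterns, not patient data; the actual six screening items are not reproduced here):

    ```python
    # Hedged sketch: scoring a 6-item yes/no screen with a cutoff of >= 2
    # positive answers, and the sensitivity/specificity computation reported.
    def screen_positive(answers, cutoff=2):
        """True if at least `cutoff` of the six items are answered 'yes' (1)."""
        return sum(answers) >= cutoff

    def sensitivity_specificity(results):
        """results: list of (is_true_case, screened_positive) pairs."""
        tp = sum(1 for case, pos in results if case and pos)
        fn = sum(1 for case, pos in results if case and not pos)
        tn = sum(1 for case, pos in results if not case and not pos)
        fp = sum(1 for case, pos in results if not case and pos)
        return tp / (tp + fn), tn / (tn + fp)

    # Tiny synthetic cohort: two cases, two controls.
    cohort = [
        (True,  screen_positive([1, 1, 0, 1, 0, 0])),  # case, 3 positive items
        (True,  screen_positive([0, 1, 0, 0, 0, 0])),  # case, missed (1 item)
        (False, screen_positive([0, 0, 0, 0, 0, 0])),  # control, negative
        (False, screen_positive([1, 1, 0, 0, 0, 0])),  # control, false positive
    ]
    sens, spec = sensitivity_specificity(cohort)
    ```
    
    
    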

  6. Impact of Megha-Tropiques SAPHIR radiance assimilation on the simulation of tropical cyclones over Bay of Bengal

    Science.gov (United States)

    Dhanya, M.; Gopalakrishnan, Deepak; Chandrasekar, Anantharaman; Singh, Sanjeev Kumar; Prasad, V. S.

    2016-05-01

    The impact of SAPHIR radiance assimilation on the simulation of tropical cyclones over the Indian region has been investigated using the Weather Research and Forecasting (WRF) model. Three cyclones that formed over the Bay of Bengal have been considered in the present study. The assimilation methodology used here is the three-dimensional variational (3DVar) scheme within the WRF model. With initial and boundary conditions from Global Forecast System (GFS) analyses from the National Centers for Environmental Prediction (NCEP), a control run (CTRL) without assimilation of any data and a 3DVar run with assimilation of SAPHIR radiance have been performed. Both model simulations have been compared with observations from the India Meteorological Department (IMD), the Tropical Rainfall Measuring Mission (TRMM), and analysis fields from GFS. Detailed analysis reveals that SAPHIR radiance assimilation has led to significant improvement in the simulation of all three cyclones in terms of cyclone track, intensity, and accumulated rainfall. The warm-core structure and relative vorticity profile of each cyclone simulated by the 3DVar run are found to be closer to the GFS analyses than those of the CTRL run.

  7. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Directory of Open Access Journals (Sweden)

    H. Fuchs

    2012-07-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl-vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s⁻¹ and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10⁶ cm⁻³ and a linear correlation coefficient of R² = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints for artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30–40% (median) larger than those by DOAS after MVK (20 ppbv) and

  8. Comparison of OH concentration measurements by DOAS and LIF during SAPHIR chamber experiments at high OH reactivity and low NO concentration

    Science.gov (United States)

    Fuchs, H.; Dorn, H.-P.; Bachner, M.; Bohn, B.; Brauers, T.; Gomm, S.; Hofzumahaus, A.; Holland, F.; Nehr, S.; Rohrer, F.; Tillmann, R.; Wahner, A.

    2012-07-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low NO concentration. These discrepancies, which were observed in forests and urban-influenced rural environments, are so far not entirely understood. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, in order to investigate the photochemical degradation of isoprene, methyl-vinyl ketone (MVK), methacrolein (MACR) and aromatic compounds by OH. Conditions were similar to those experienced during the PRIDE-PRD2006 campaign in the Pearl River Delta (PRD), China, in 2006, where a large difference between OH measurements and model predictions was found. During experiments in SAPHIR, OH was simultaneously detected by two independent instruments: LIF and differential optical absorption spectroscopy (DOAS). Because DOAS is an inherently calibration-free technique, DOAS measurements are regarded as a reference standard. The comparison of the two techniques was used to investigate potential artifacts in the LIF measurements for PRD-like conditions of OH reactivities of 10 to 30 s⁻¹ and NO mixing ratios of 0.1 to 0.3 ppbv. The analysis of twenty experiment days shows good agreement. The linear regression of the combined data set (averaged to the DOAS time resolution, 2495 data points) yields a slope of 1.02 ± 0.01 with an intercept of (0.10 ± 0.03) × 10⁶ cm⁻³ and a linear correlation coefficient of R² = 0.86. This indicates that the sensitivity of the LIF instrument is well-defined by its calibration procedure. No hints for artifacts are observed for isoprene, MACR, and different aromatic compounds. LIF measurements were approximately 30-40% (median) larger than those by DOAS after MVK (20 ppbv) and toluene (90 ppbv) had been added. However, this discrepancy has a
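    The slope and intercept quoted above come from an ordinary least-squares fit of LIF against the DOAS reference. A synthetic illustration of that comparison (simulated concentrations with assumed noise, not the campaign data):

    ```python
    import numpy as np

    # Hedged sketch: OLS regression of one instrument against a reference,
    # the form of comparison the abstract reports. Data are simulated.
    rng = np.random.default_rng(42)
    doas = rng.uniform(1e6, 8e6, 200)  # reference OH concentrations, cm^-3
    lif = 1.02 * doas + 0.10e6 + rng.normal(0.0, 2e5, 200)  # simulated LIF

    slope, intercept = np.polyfit(doas, lif, 1)        # least-squares line
    r2 = np.corrcoef(doas, lif)[0, 1] ** 2             # linear correlation R^2
    ```

    A slope near 1 with a small intercept relative to typical concentrations is what indicates a well-calibrated instrument.
    
    
    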

  9. An approach and a tool for setting sustainable energy retrofitting strategies referring to the 2010 EP

    Directory of Open Access Journals (Sweden)

    Charlot-Valdieu, C.

    2011-10-01

    The 2010 EPBD asks for an economic and social analysis in order to preserve social equity and to promote innovation and building productivity. This is possible with a life cycle energy cost (LCEC) analysis, such as with the SEC (Sustainable Energy Cost) model, whose bottom-up approach begins with a building typology that includes inhabitants. The analysis of some representative buildings then includes the identification of a technico-economical optimum and energy retrofitting scenarios for each retrofitting programme, followed by extrapolation to the whole building stock. This extrapolation makes it possible to set up the strategy and to identify the means needed for reaching the objectives. SEC is a decision-aid tool for optimising sustainable energy retrofitting strategies for buildings at territorial and patrimonial scales within a sustainable development approach towards factor 4. Various versions of the SEC model are now available for housing and for tertiary buildings.

    The 2010 European directive on the energy performance of buildings requires an economic and social analysis, with the aim of preserving social equity, promoting innovation and strengthening productivity in construction. This is possible through analysis of the extended whole-life cost, and in particular with the SEC model. The bottom-up analysis performed with SEC is based on a building/user typology and on the analysis of representative buildings: identification of the technico-economic optimum and development of scenarios before extrapolating to the whole stock. SEC is a decision-support tool for developing territorial or patrimonial energy retrofitting strategies. Several versions of the model exist: for residential buildings (single-family and multi-family, public and private) and for tertiary buildings.
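    The life cycle energy cost comparison at the heart of such a model can be caricatured as a net-present-cost calculation per retrofitting scenario. A deliberately simplified sketch (illustrative figures and function names; the SEC model itself is far richer):

    ```python
    # Hedged sketch: net present cost of a retrofitting scenario = initial
    # investment + discounted energy bills over the study period. Figures
    # below are invented for illustration only.
    def life_cycle_cost(investment, annual_energy_cost, years, discount_rate):
        """Net present cost of one retrofitting scenario, in the same currency."""
        discounted = sum(annual_energy_cost / (1 + discount_rate)**t
                         for t in range(1, years + 1))
        return investment + discounted

    baseline = life_cycle_cost(0, 2000.0, 30, 0.04)       # do nothing
    retrofit = life_cycle_cost(15000.0, 800.0, 30, 0.04)  # insulate + new boiler
    preferred = min(baseline, retrofit)  # scenario with the lower life-cycle cost
    ```

    The technico-economic optimum mentioned in the abstract corresponds to the scenario minimizing this kind of whole-life cost, evaluated per representative building before extrapolation to the stock.
    
    
    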

  10. Tools for Analysis and Visualization of Large Time-Varying CFD Data Sets

    Science.gov (United States)

    Wilhelms, Jane; VanGelder, Allen

    1997-01-01

    In the second year, we continued to build upon and improve the scanline-based direct volume renderer that we developed in the first year of this grant. This extremely general rendering approach can handle regular or irregular grids, including overlapping multiple grids, and polygon mesh surfaces, and it runs in parallel on multi-processors. It can also be used in conjunction with a k-d tree hierarchy, where approximate models and error terms are stored in the nodes of the tree, so that fast approximate renderings can be created. We have extended our software to handle time-varying data where the data change but the grid does not, and we are now working on extending it to handle more general time-varying data. We have also developed a new extension of our direct volume renderer that uses automatic decimation of the 3D grid, as opposed to an explicit hierarchy; we explored this alternative approach as more appropriate for very large data sets, where the extra expense of a tree may be unacceptable. We also describe a new approach to direct volume rendering that uses hardware 3D textures and incorporates lighting effects. Volume rendering using hardware 3D textures is extremely fast, and machines capable of using this technique are becoming more moderately priced. While this technique is, at present, limited to regular grids, we are pursuing possible algorithms extending the approach to more general grid types. We have also begun to explore a new method for determining the accuracy of approximate models, based on the light field method described at ACM SIGGRAPH '96. In our initial implementation, we automatically image the volume from 32 equidistant positions on the surface of an enclosing tessellated sphere. We then calculate differences between these images under different conditions of volume approximation or decimation. We are studying whether this will give a quantitative measure of the effects of approximation. We have created new tools for exploring the
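    The image-difference measure described for the 32 enclosing-sphere viewpoints can be sketched as an RMS pixel difference taken over all views (hypothetical function names and toy images, not the authors' implementation):

    ```python
    import numpy as np

    # Hedged sketch: compare renderings of the full and approximated volume
    # from the same viewpoints, summarizing by the worst-case RMS difference.
    def rms_difference(image_a, image_b):
        """Root-mean-square difference between two equally sized images."""
        diff = image_a.astype(float) - image_b.astype(float)
        return float(np.sqrt(np.mean(diff**2)))

    def max_rms_over_views(full_views, approx_views):
        """Worst-case RMS error over all viewpoints (e.g. 32 sphere positions)."""
        return max(rms_difference(a, b) for a, b in zip(full_views, approx_views))

    # Toy data: four 64x64 renderings with a uniform 2-unit intensity error.
    full = [np.full((64, 64), 100.0) for _ in range(4)]
    approx = [img + 2.0 for img in full]
    worst = max_rms_over_views(full, approx)
    ```
    
    
    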

  11. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, and also enables experimental measurements after compiling to configurable systems, all in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  12. Evaluation of SAPHIR / Megha-Tropiques observations - CINDY/DYNAMO Campaign

    Science.gov (United States)

    Clain, Gaelle; Brogniez, Hélène; John, Viju; Payne, Vivienne; Luo, Ming

    2014-05-01

    The SAPHIR sounder (Sondeur Atmosphérique du Profil d'Humidité Intertropicale par Radiométrie) onboard the Megha-Tropiques (MT) platform observes the microwave radiation emitted by the Earth system in the strong water vapor absorption line at 183.31 GHz. It is a multi-channel microwave humidity sounder with 6 channels in the 183.31 GHz water vapor absorption band, a maximum scan angle of 42.96° around nadir, a 1700 km wide swath and a footprint resolution of 10 km at nadir. A comparison between the sensor L1A2 observations and radiative transfer calculations using in situ measurements from radiosondes as input is performed in order to validate the satellite observations at the brightness temperature (BT) level. The radiosonde humidity observations chosen as reference were performed during the CINDY/DYNAMO campaign (September 2011 to March 2012) with Vaisala RS92-SGPD probes and were matched spatio-temporally with MT satellite overpasses. Although several sonde systems were used during the campaign, all of the sites selected for this study used the Vaisala RS92-SGPD system, chosen in order to avoid discrepancies in data quality and biases. This work investigates the difference, or bias, between the BTs observed by the sensor and BT simulations from a radiative transfer model, RTTOV-10. The bias amplitude is characterized by a temperature-dependent pattern, increasing from nearly 0 K for the 183.31 ± 0.2 channel to a range of 2 K for the 183.31 ± 11 channel. However, the comparison between the sensor data and the radiative transfer simulations is not straightforward, and the uncertainties associated with the data processing must be propagated throughout the evaluation. This work therefore documents an evaluation of the uncertainties and errors that can impact the BT bias. These can be linked to the radiative transfer model input and design, the radiosonde observations, the methodology chosen for the comparison, and the SAPHIR instrument itself.

  13. The Malawi Developmental Assessment Tool (MDAT): the creation, validation, and reliability of a tool to assess child development in rural African settings.

    Directory of Open Access Journals (Sweden)

    Melissa Gladstone

    2010-05-01

    Although 80% of children with disabilities live in developing countries, there are few culturally appropriate developmental assessment tools available for these settings. Tools from the West often provide misleading findings in different cultural settings, where some items are unfamiliar and reference values differ from those of Western populations. Following preliminary and qualitative studies, we produced a draft developmental assessment tool with 162 items in four domains of development. After face and content validity testing and piloting, we expanded the draft tool to 185 items. We then assessed 1,426 normal children aged 0-6 y from rural Malawi and derived age-standardized norms for all items. We examined performance of items using logistic regression and reliability using kappa statistics. We then considered all items at a consensus meeting and removed those performing badly and those that were unnecessary or difficult to administer, leaving 136 items in the final Malawi Developmental Assessment Tool (MDAT). We validated the tool by comparing age-matched normal children with those with malnutrition (120) and neurodisabilities (80). Reliability was good for the remaining items, with 94%-100% of items scoring kappas >0.4 for interobserver immediate, delayed, and intra-observer testing. We demonstrated significant differences in overall mean scores (and individual domain scores) for children with neurodisabilities (35 versus 99 [p<0.001]) when compared to normal children. Using a pass/fail technique similar to the Denver II, 3% of children with neurodisabilities passed in comparison to 82% of normal children, demonstrating good sensitivity (97%) and specificity (82%). Overall mean scores of children with malnutrition (weight for height <80%) were also significantly different from scores of normal controls (62.5 versus 77.4 [p<0.001]); scores in the separate domains, excluding social development, also differed between malnourished children and
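    The sensitivity and specificity figures quoted above follow directly from the pass/fail counts. A minimal sketch (the function name and the per-100 counts are hypothetical, chosen only to reproduce the reported 97%/82%; a failed screen is treated as a positive result):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).

    Here a 'positive' is a failed screen: for children with
    neurodisabilities a fail is a true positive (TP) and a pass a
    false negative (FN); for normal children a pass is a true
    negative (TN) and a fail a false positive (FP).
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts per 100 children in each group, scaled from the
# reported rates: 97 of 100 with neurodisabilities failed, 82 of 100
# normal children passed.
sens, spec = sensitivity_specificity(tp=97, fn=3, tn=82, fp=18)
```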

  14. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    Science.gov (United States)

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry-driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets.
This approach is suitable for large quantitative data sets from stable isotope labeling
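    The rank product adaptation described above (ranking features within each replicate while skipping missing values, then combining by a geometric mean) can be sketched roughly as follows. This is an illustrative reimplementation under assumed conventions (rank 1 = largest change), not the authors' code:

```python
import numpy as np

def rank_product(data):
    """Geometric-mean rank per feature, tolerating NaNs.

    data: 2-D array, rows = features, columns = replicates. Within each
    replicate, features are ranked descending (rank 1 = largest value)
    over the non-missing entries only; ranks are then combined per
    feature by the geometric mean over the replicates where it was seen.
    """
    data = np.asarray(data, dtype=float)
    n_feat, n_rep = data.shape
    logsum = np.zeros(n_feat)
    counts = np.zeros(n_feat)
    for j in range(n_rep):
        col = data[:, j]
        ok = ~np.isnan(col)
        order = np.argsort(-col[ok])            # descending order
        ranks = np.empty(ok.sum())
        ranks[order] = np.arange(1, ok.sum() + 1)
        idx = np.flatnonzero(ok)
        logsum[idx] += np.log(ranks)
        counts[idx] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.exp(logsum / counts)          # NaN if never observed
```

    Lower rank products indicate features that are consistently up-ranked across replicates; significance would then be assessed by permutation, which is omitted here.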

  15. Ssecrett and NeuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki

    2010-05-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  16. Some Topological Properties of Rough Sets with Tools for Data Mining

    Directory of Open Access Journals (Sweden)

    A S Salama

    2011-05-01

    Rough set theory is of significant importance in many fields, for example in branches of artificial intelligence such as inductive reasoning, automatic classification, pattern recognition, learning algorithms, classification theory, cluster analysis, measurement theory, and taxonomy. Rough set theory has also demonstrated its usefulness in the domains of medicine, pharmacology, banking, market research, and engineering. The main aim of this paper is to describe some topological properties of rough sets and to open the door to more accurate topological measures for data mining.

  17. Developing a free and easy to use digital goal setting tool for busy mums

    Directory of Open Access Journals (Sweden)

    Babs Evans

    2015-09-01

    Using data, research and the expertise of commercial and charity partners was an effective way to design a digital product to support behavioural change. By understanding the target audience from the beginning and involving them in the planning stages, the organisations were able to develop a tool the users want with a strong focus on user experience.

  18. Investigation of isoprene oxidation in the atmosphere simulation chamber SAPHIR at low NO concentrations

    Science.gov (United States)

    Fuchs, H.; Rohrer, F.; Hofzumahaus, A.; Bohn, B.; Brauers, T.; Dorn, H.; Häseler, R.; Holland, F.; Li, X.; Lu, K.; Nehr, S.; Tillmann, R.; Wahner, A.

    2012-12-01

    During recent field campaigns, hydroxyl radical (OH) concentrations that were measured by laser-induced fluorescence spectroscopy (LIF) were up to a factor of ten larger than predicted by current chemical models for conditions of high OH reactivity and low nitrogen monoxide (NO) concentrations. These discrepancies were observed in the Pearl River Delta, China, which is an urban-influenced rural area, in rainforests, and in forested areas in North America and Europe. Isoprene contributed significantly to the total OH reactivity in these field studies, so potential explanations for the missing OH focused on new reaction pathways in the isoprene degradation mechanism. These pathways regenerate OH without oxidation of NO and thus without ozone production. In summer 2011, a series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Juelich, Germany, in order to investigate the photochemical degradation of isoprene at low NO concentrations. The isoprene degradation observed in SAPHIR was compared with predictions by established chemical models like the Master Chemical Mechanism (MCM). Moreover, OH concentration measurements of two independent instruments (LIF and DOAS) agreed during all chamber experiments. Here, we present the results of the experiments and compare measurements with model predictions using the MCM. Furthermore, the validity of newly proposed reaction pathways in the isoprene degradation is evaluated by comparison with observations.

  19. Total OH reactivity study from VOC photochemical oxidation in the SAPHIR chamber

    Science.gov (United States)

    Yu, Z.; Tillmann, R.; Hohaus, T.; Fuchs, H.; Novelli, A.; Wegener, R.; Kaminski, M.; Schmitt, S. H.; Wahner, A.; Kiendler-Scharr, A.

    2015-12-01

    It is well known that hydroxyl radicals (OH) act as a dominant reactive species in the degradation of VOCs in the atmosphere. In recent field studies, directly measured total OH reactivity often showed poor agreement with OH reactivity calculated from VOC measurements (e.g. Nölscher et al., 2013; Lu et al., 2012a). This "missing OH reactivity" is attributed to unaccounted biogenic VOC emissions and/or oxidation products. The comparison of total OH reactivity directly measured and calculated from single-component measurements of VOCs and their oxidation products gives us a further understanding of the sources of unmeasured reactive species in the atmosphere. It also allows determination of the magnitude of the contribution of primary VOC emissions and their oxidation products to the missing OH reactivity. A series of experiments was carried out in the atmosphere simulation chamber SAPHIR in Jülich, Germany, to explore in detail the photochemical degradation of VOCs (isoprene, β-pinene, limonene, and D6-benzene) by OH. The total OH reactivity was determined from the measurement of VOCs and their oxidation products by a Proton Transfer Reaction Time of Flight Mass Spectrometer (PTR-TOF-MS) with a GC/MS/FID system, and directly measured at the same time by laser-induced fluorescence (LIF). The comparison between these two total OH reactivity measurements showed an increase of missing OH reactivity in the presence of oxidation products of VOCs, indicating a strong contribution to missing OH reactivity from uncharacterized oxidation products.

  20. Impact of horizontal and vertical localization scales on microwave sounder SAPHIR radiance assimilation

    Science.gov (United States)

    Krishnamoorthy, C.; Balaji, C.

    2016-05-01

    In the present study, the effect of horizontal and vertical localization scales on the assimilation of direct SAPHIR radiances is studied. An Artificial Neural Network (ANN) has been used as a surrogate for the forward radiative calculations. The training input dataset for the ANN consists of vertical layers of atmospheric pressure, temperature, relative humidity, and other hydrometeor profiles, with 6-channel Brightness Temperatures (BTs) as output. The best neural network architecture has been arrived at by a neuron independence study. Since vertical localization of radiance data requires weighting functions, an ANN has also been trained for this purpose. The radiances were ingested into the NWP model using the Ensemble Kalman Filter (EnKF) technique. The horizontal localization is handled by using a Gaussian localization function centered around the observed coordinates. Similarly, the vertical localization is accomplished by assuming a function which depends on the weighting function of the channel to be assimilated. The effect of both horizontal and vertical localizations has been studied in terms of ensemble spread in the precipitation. Additionally, improvements in the 24-h forecast from assimilation are also reported.
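    The Gaussian horizontal localization described above amounts to a distance-dependent taper applied to the ensemble covariances before the Kalman gain is formed. The sketch below is an illustrative scalar-observation form; the function names, the length-scale parameter, and the Schur-product formulation are assumptions, not the authors' implementation:

```python
import numpy as np

def gaussian_localization(dist, length_scale):
    """Taper factor in (0, 1]: 1 at the observation location, decaying
    as a Gaussian with distance, so remote state elements are barely
    updated by this observation."""
    return np.exp(-0.5 * (np.asarray(dist, dtype=float) / length_scale) ** 2)

def localized_gain(cov_xy, var_y, obs_err_var, dist, length_scale):
    """EnKF gain for a single scalar observation, with the ensemble
    cross covariance cov_xy (state element vs. observed quantity)
    Schur-tapered by the localization factor."""
    rho = gaussian_localization(dist, length_scale)
    return rho * cov_xy / (var_y + obs_err_var)
```

    Vertical localization would follow the same pattern, with the taper built from the channel weighting function rather than geometric distance.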

  1. Comparison of OH reactivity instruments in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, Hendrik

    2016-04-01

    OH reactivity measurement has become an important means of constraining the total OH loss frequency in field experiments. Different techniques have been developed by various groups. They can be based on flow-tube or pump-and-probe techniques, which include direct OH detection by fluorescence, or on a comparative method, in which the OH loss of a reference species competes with the OH loss of trace gases in the sampled air. In order to ensure that these techniques deliver equivalent results, a comparison exercise was performed under controlled conditions. Nine OH reactivity instruments measured together in the atmosphere simulation chamber SAPHIR (volume 270 m3) during ten daylong experiments in October 2015 at ambient temperature (5-10 °C) and pressure (990-1010 hPa). The chemical complexity of air mixtures in these experiments varied from CO in pure synthetic air to emissions from real plants and VOC/NOx mixtures representative of urban atmospheres. Potential differences between measurements were systematically investigated by changing the amount of reactants (including isoprene, monoterpenes, and sesquiterpenes), water vapour, and nitrogen oxides. Some of the experiments also included the oxidation of reactants with ozone or hydroxyl radicals, in order to investigate whether the presence of oxidation products leads to systematic differences between the measurements of different instruments. Here we present first results of this comparison exercise.

  2. The LEMKEN Saphir 7 series seed drill

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The LEMKEN Saphir 7 series seed drill is a mechanically driven seed drill for use with large tractors. Mounted at the rear of the tractor, it sows while the seeding depth is adjusted via the three-point hitch and a front control device, and it is suitable for large-area shallow-tillage sowing of wheat-type crops. Main features: single- or double-disc coulters and hoe-type coulters can be fitted, giving strong adaptability to different soils; a seed metering wheel and an oil-bath gearbox provide stepless adjustment of the seeding rate, for accurate metering and seed savings; and it can be combined with a power harrow or cultivator into one unit for combined one-pass operations.

  3. Core Outcome Sets and Multidimensional Assessment Tools for Harmonizing Outcome Measure in Chronic Pain and Back Pain

    Directory of Open Access Journals (Sweden)

    Ulrike Kaiser

    2016-08-01

    Core Outcome Sets (COSs) are a set of domains and measurement instruments recommended for application in any clinical trial to ensure comparable outcome assessment (both domains and instruments). COSs are not exclusively recommended for clinical trials, but also for daily record keeping in routine care. There are several COS recommendations considering clinical trials as well as multidimensional assessment tools to support daily record keeping in low back pain. In this article, relevant initiatives will be described, and implications for research in COS development in chronic pain and back pain will be discussed.

  4. Core Outcome Sets and Multidimensional Assessment Tools for Harmonizing Outcome Measure in Chronic Pain and Back Pain

    Science.gov (United States)

    Kaiser, Ulrike; Neustadt, Katrin; Kopkow, Christian; Schmitt, Jochen; Sabatowski, Rainer

    2016-01-01

    Core Outcome Sets (COSs) are a set of domains and measurement instruments recommended for application in any clinical trial to ensure comparable outcome assessment (both domains and instruments). COSs are not exclusively recommended for clinical trials, but also for daily record keeping in routine care. There are several COS recommendations considering clinical trials as well as multidimensional assessment tools to support daily record keeping in low back pain. In this article, relevant initiatives will be described, and implications for research in COS development in chronic pain and back pain will be discussed. PMID:27589816

  5. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into, and research perspectives on, the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology, and business, in a single book that is, to date, unique. Here, readers find theoretical chapters dealing with intuitionistic fuzzy operators, membership functions, and algorithms, among other topics, as well as application-oriented chapters reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  6. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  7. Optical Method For Monitoring Tool Control For Green Burnishing With Using Of Algorithms With Adaptive Settings

    Science.gov (United States)

    Lukyanov, A. A.; Grigoriev, S. N.; Bobrovskij, I. N.; Melnikov, P. A.; Bobrovskij, N. M.

    2017-05-01

    As new technology grows more complex and its reliability requirements increase, the labor involved in control operations within industrial quality-control systems rises significantly. Quality management control is important because it promotes the correct use of production conditions and compliance with the relevant requirements. Digital image processing makes it possible to reach a new technological level of production. Automated interpretation of information, the most difficult step, is the basis for decision-making in the management of production processes. For surface analysis of tools used in processing with metalworking fluids (MWF), the task is even more complicated. The authors suggest a new algorithm for optical inspection of wear of the cylindrical burnishing tool used in surface plastic deformation without MWF. The main advantage of the proposed algorithm is automatic recognition of images of the burnisher tool, with subsequent extraction of its boundaries, location of the working surface, and automatic identification of defects and the wear area. Software implementing the algorithm was developed by the authors in the Matlab programming environment, but it can be implemented in other programming languages.

  8. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-10-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  9. Independent Verification and Validation Of SAPHIRE 8 Software Acceptance Test Plan Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE 8 Software Acceptance Test Plan is to assess the approach to be taken for intended testing activities. The plan typically identifies the items to be tested, the requirements being tested, the testing to be performed, test schedules, personnel requirements, reporting requirements, evaluation criteria, and any risks requiring contingency planning. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production.

  10. Independent Verification and Validation Of SAPHIRE 8 Software Design and Interface Design Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2010-03-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE software design and interface design is to assess the activities that result in the development, documentation, and review of a software design that meets the requirements defined in the software requirements documentation. The IV&V team began this endeavor after the software engineering and software development of SAPHIRE had already been in production. IV&V reviewed the requirements specified in the NRC Form 189s to verify these requirements were included in SAPHIRE’s Software Verification and Validation Plan (SVVP) design specification.

  11. Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis

    Science.gov (United States)

    Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  12. Generated spiral bevel gears - Optimal machine-tool settings and tooth contact analysis

    Science.gov (United States)

    Litvin, F. L.; Tsung, W.-J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  13. Design of Pinion Machine Tool-settings for Spiral Bevel Gears by Controlling Contact Path and Transmission Errors

    Institute of Scientific and Technical Information of China (English)

    Cao Xuemei; Fang Zongde; Xu Hao; Su Jinzhan

    2008-01-01

    This paper proposes a new approach to design pinion machine tool-settings for spiral bevel gears by controlling the contact path and transmission errors. It is based on the satisfaction of the contact condition at three given control points on the tooth surface. The three meshing points are controlled to be on a predesigned straight contact path that meets the predesigned parabolic function of transmission errors. Designed separately, the magnitude of transmission errors and the orientation of the contact path are subjected to precise control. In addition, in order to meet the manufacturing requirements, we suggest modifying the values of blank offset, one of the pinion machine tool-settings, and redesigning pinion machine tool-settings to ensure that the magnitude and the geometry of transmission errors are not influenced, apart from minor effects on the predesigned straight contact path. The proposed approach together with its ideas has been proven by a numerical example and the manufacturing practice of a pair of spiral bevel gears.

  14. User's manual for tooth contact analysis of face-milled spiral bevel gears with given machine-tool settings

    Science.gov (United States)

    Litvin, Faydor L.; Zhang, YI; Chen, Jui-Sheng

    1991-01-01

    Research was performed to develop a computer program that will: (1) simulate the meshing and bearing contact for face-milled spiral bevel gears with given machine tool settings; and (2) obtain as output some of the data required for hydrodynamic analysis. It is assumed that the machine tool settings and the blank data will be taken from the Gleason summaries. The theoretical aspects of the program are based on 'Local Synthesis and Tooth Contact Analysis of Face-Milled Spiral Bevel Gears'. The difference between the computer program developed herein and the other one is as follows: (1) the mean contact point of tooth surfaces for gears with given machine tool settings must be determined iteratively while parameters H and V are changed (H represents displacement along the pinion axis, V represents the gear displacement that is perpendicular to the plane drawn through the axes of the pinion and the gear in their initial positions); this means that when V differs from zero, the axes of the pinion and the gear are crossed but not intersected; (2) in addition to the regular output data (transmission errors and bearing contact), the new computer program provides information about the contacting force for each contact point and the sliding and the so-called rolling velocity. The following topics are covered: (1) instructions for the users on how to insert the input data; (2) explanations regarding the output data; (3) a numerical example; and (4) a listing of the program.

  15. Goal-Setting as a Teaching and Evaluation Tool in Student Teaching.

    Science.gov (United States)

    Schofer, Gillian

    1981-01-01

    Analyzes 399 stated goals of participants in four separate student-teaching seminars. Evaluation revealed a correspondence with Snyder and Anderson's seven subsystems of teacher proficiency. Goal-setting assignments before student teaching experience, a follow-up self-evaluation, and analysis appear to have value for student growth and department…

  16. Spiritual Assessment and Native Americans: Establishing the Social Validity of a Complementary Set of Assessment Tools

    Science.gov (United States)

    Hodge, David R.; Limb, Gordon E.

    2011-01-01

    Although social work practitioners are increasingly likely to administer spiritual assessments with Native American clients, few qualitative assessment instruments have been validated with this population. This mixed-method study validates a complementary set of spiritual assessment instruments. Drawing on the social validity literature, a sample…

  17. Development of a Physical Environmental Observational Tool for Dining Environments in Long-Term Care Settings.

    Science.gov (United States)

    Chaudhury, Habib; Keller, Heather; Pfisterer, Kaylen; Hung, Lillian

    2017-02-20

    This paper presents the first standardized physical environmental assessment tool, titled the Dining Environment Audit Protocol (DEAP), specifically designed for dining spaces in care homes, and reports the results of its psychometric evaluation. Items rated include: adequacy of lighting, glare, personal control, clutter, staff supervision support, restraint use, and seating arrangement options for social interaction. Two scales summarize the prior items and rate the overall homelikeness and functionality of the space. Ten dining rooms in three long-term care homes were selected for assessment. Data were collected over 11 days across 5 weeks. Two trained assessors completed DEAP independently on the same day. Interrater reliability was assessed for lighting, glare, space, homelike aspects, seating arrangements, and the two summary scales, homelikeness and functionality of the space. For categorical measures, responses were dichotomized at logical points and Cohen's Kappa and concordance on ratings were determined. The two overall rating scales on homelikeness and functionality of space were found to be reliable (intraclass correlation coefficient (ICC) ~0.7). The mean rating for homelikeness for Assessor 1 was 3.5 (SD 1.35) and for functionality of the room was 5.3 (SD 0.82; median 5.5). The findings indicate that the tool's interrater reliability scores are promising. The high concordance on the overall scores for homelikeness and functionality is indicative of the strength of the individual items in generating a reliable global assessment score on these two important aspects of the dining space.
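    Cohen's kappa, used above for the dichotomized categorical items, corrects raw interrater agreement for the agreement expected by chance. A minimal sketch (a hypothetical helper, not the DEAP software) is:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Kappa = (po - pe) / (1 - pe), where po is the observed proportion
    of agreement and pe the agreement expected if both raters assigned
    categories at random with their own marginal frequencies."""
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (po - pe) / (1 - pe)
```

    A kappa above 0.4 (the threshold quoted in the MDAT record earlier in this listing) is conventionally read as at least moderate agreement; 1.0 is perfect agreement.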

  18. Using workload measurement tools in diverse care contexts: the experience of staff in mental health and learning disability inpatient settings.

    Science.gov (United States)

    Fanneran, T; Brimblecombe, N; Bradley, E; Gregory, S

    2015-12-01

    What is known on the subject? Difficulties with the recruitment and retention of qualified nursing staff have resulted in nursing shortages worldwide, with a consequential impact on the quality of care. It is increasingly recommended that evidence-based staffing levels are central to the development of workforce plans. Due to a paucity of empirical research in mental health and learning disability services, the staffing needs and requirements for these settings are undefined and the availability of tools to aid staffing decisions is limited. What this paper adds to existing knowledge? This paper provides a valuable insight into the practical uses of these tools as perceived by staff members with day-to-day experience of the requirements of mental health and learning disability wards. It reveals that while workload measurement tools are considered a valuable aid for the development of workforce plans, they are limited in their ability to capture all aspects of care provision in these settings. It further emphasizes the inapplicability of a one-size-fits-all approach to determining nurse staffing levels and the need for individual and customized workforce plans. What are the implications for practice? This study demonstrates that the development of tools for use in mental health and learning disability services is in its infancy, and no tool has yet been validated as such. It highlights the potential for workload measurement tools to aid staffing decisions; however, a more holistic approach that considers additional factors is needed to ensure robust workforce planning models are developed for these services. The critical challenge of determining the correct level and skill mix of nursing staff required to deliver safe and effective health care has become an international concern. It is recommended that evidence-based staffing decisions are central to the development of future workforce plans. Workforce planning in mental health and learning disability nursing is

  19. Laparohysteroscopy in female infertility: A diagnostic cum therapeutic tool in Indian setting

    OpenAIRE

    Puri, Suman; Jain, Dinesh; Puri, Sandeep; Kaushal, Sandeep; Deol, Satjeet Kaur

    2015-01-01

Aims: To evaluate the role of laparohysteroscopy in female infertility and to study the effect of therapeutic procedures in achieving fertility. Settings and Design: Patients with female infertility presenting to the outpatient Department of Obstetrics and Gynecology were evaluated over a period of 18 months. Materials and Methods: Fifty consenting subjects, excluding male factor infertility, with normal hormonal profile and no contraindication to laparoscopy were subjected to diagnostic laparoscopy a...

  20. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical...... changes on the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets...... with varying numbers of missing values. We applied three tools, including standard t test, moderated t test, also known as limma, and rank products for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved...
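
As a hedged illustration of the comparison described above, the sketch below simulates a sparse quantitative data set with missing values and scores features both with a per-feature t-test and with rank products. All sizes, effect sizes, and the missing-value rate are invented for illustration; the moderated t-test (limma) used in the study is an R package and is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated log-intensities: 200 features x 3 replicates; the first 20
# features carry a true change. Roughly 20% of values are set missing.
data = rng.normal(0.0, 1.0, size=(200, 3))
data[:20] += 2.0
data[rng.random(data.shape) < 0.2] = np.nan

# Per-feature one-sample t-test against 0, ignoring missing values.
t_p = np.array([
    stats.ttest_1samp(row[~np.isnan(row)], 0.0).pvalue
    if np.count_nonzero(~np.isnan(row)) >= 2 else np.nan
    for row in data
])

# Rank products: geometric mean over replicates of within-replicate
# ranks (rank 1 = largest value); missing values are simply excluded.
ranks = np.full(data.shape, np.nan)
for j in range(data.shape[1]):
    ok = ~np.isnan(data[:, j])
    ranks[ok, j] = stats.rankdata(-data[ok, j])
rp = np.exp(np.nanmean(np.log(ranks), axis=1))
# Small rank products flag consistently up-regulated features; in the
# full method their significance is calibrated by permutation.
```

The point this sketch makes is the one the abstract draws: rank products stay usable with few replicates and missing values because each replicate is ranked independently, whereas per-feature t-tests degrade quickly as observations drop out.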

  1. Laparohysteroscopy in female infertility: A diagnostic cum therapeutic tool in Indian setting.

    Science.gov (United States)

    Puri, Suman; Jain, Dinesh; Puri, Sandeep; Kaushal, Sandeep; Deol, Satjeet Kaur

    2015-01-01

To evaluate the role of laparohysteroscopy in female infertility and to study the effect of therapeutic procedures in achieving fertility. Patients with female infertility presenting to the outpatient Department of Obstetrics and Gynecology were evaluated over a period of 18 months. Fifty consenting subjects, excluding male factor infertility, with normal hormonal profile and no contraindication to laparoscopy were subjected to diagnostic laparoscopy and hysteroscopy. Statistical analysis: t-test. We studied 50 patients, comprising 24 (48%) cases of primary infertility and 26 (52%) cases of secondary infertility. The average duration of active married life for the 50 patients was between 8 and 9 years. In our study, the most commonly found pathologies were PCOD, endometriosis and tubal blockage. 11 (28.2%) patients conceived after laparohysteroscopy followed by artificial reproductive techniques. This study demonstrates the benefit of laparohysteroscopy for diagnosis and as a therapeutic tool in patients with primary and secondary infertility. We were able to achieve a conception rate of 28.2%.

  2. Geostatistics as a validation tool for setting ozone standards for durum wheat.

    Science.gov (United States)

    De Marco, Alessandra; Screpanti, Augusto; Paoletti, Elena

    2010-02-01

Which is the best standard for protecting plants from ozone? To answer this question, we must validate the standards by testing biological responses vs. ambient data in the field. A validation is missing for European and USA standards, because the networks for ozone, meteorology and plant responses are spatially independent. We proposed geostatistics as a validation tool, and used durum wheat in central Italy as a test case. The standards summarized ozone impact on yield better than hourly averages. Although USA criteria explained ozone-induced yield losses better than European criteria, the USA legal level (75 ppb) protected only 39% of sites. European exposure-based standards protected ≥90%. Reducing the USA level to the Canadian 65 ppb or using W126 protected 91% and 97% of sites, respectively. For a no-threshold accumulated stomatal flux, 22 mmol m⁻² was suggested to protect 97% of sites. In a multiple regression, precipitation explained 22% and ozone explained <0.9% of yield variability.
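
Among the criteria compared, W126 is a sigmoidally weighted cumulative exposure index. A minimal sketch, assuming the standard weighting constants (4403 and 126, with hourly concentrations in ppm) and leaving the regulatory daylight-hour and 3-month windowing to the caller:

```python
import math

def w126(hourly_ppm):
    """Sigmoidally weighted sum of hourly ozone mixing ratios (ppm).

    Each hour contributes C * w(C) with w(C) = 1 / (1 + 4403*exp(-126*C)),
    so high concentrations are weighted far more heavily than low ones.
    """
    return sum(c / (1.0 + 4403.0 * math.exp(-126.0 * c))
               for c in hourly_ppm)

# A 60 ppb hour contributes only a small fraction of its concentration;
# a 120 ppb hour contributes nearly all of it.
low, high = w126([0.060]), w126([0.120])
```

This weighting is what makes W126 sensitive to peak episodes rather than to the background, which is consistent with its stronger performance in the comparison above.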

  3. AnyStitch: a tool for combining electron backscatter diffraction data sets.

    Science.gov (United States)

    Pilchak, A L; Shiveley, A R; Tiley, J S; Ballard, D L

    2011-10-01

Recent advances in electron backscatter diffraction equipment and software have permitted increased data acquisition rates on the order of hundreds of points per second, with further increases likely in the foreseeable future. This increase in speed allows users to collect data from statistically significant areas of samples by combining beam-control scans and automated stage movements. To facilitate data analysis, however, the individual tiles must be combined, or stitched, into a single data set. In this paper, we describe a MATLAB® (The MathWorks, Inc., Natick, MA, USA) program to facilitate stitching of electron backscatter diffraction data. The method offers users a wide range of controls for tile placement, including independent overlaps for horizontal and vertical tiles, and also includes a parameter to account for systematic stage positioning errors or an improperly calibrated scan rotation. The code can stitch data collected on either square or hexagonal grids and contains a function to reduce the resolution of square grid data if the resulting file is too large (or has too many grains) to be opened by the analysis software. The software was primarily written to work with TSL® OIM™ data sets and includes a function to quickly read compressed *.osc files into a variable in the MATLAB® workspace, as opposed to using slower text-reading functions. The output file is in *.ang format and can be opened directly by the TSL® OIM™ Analysis software. A set of functions to facilitate stitching of text-based *.ctf files produced by Oxford Instruments HKL systems is also included. Finally, the code can also be used to combine *.tif images to produce a montage. The source code, a graphical user interface and a compiled version of the software are made available in the online version of this paper.
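
At its core, the stitching step described above is bookkeeping: each tile's scan coordinates are shifted by a global origin computed from its grid position, the chosen horizontal/vertical overlaps, and any systematic stage offset. The sketch below illustrates that bookkeeping in Python; it is not AnyStitch's MATLAB implementation, and all parameter names are invented.

```python
import numpy as np

def tile_offsets(n_cols, n_rows, tile_w, tile_h,
                 overlap_x=0.05, overlap_y=0.05, skew=0.0):
    """Global (x, y) origin for each tile in a rectangular scan grid.

    overlap_x / overlap_y: fractional overlap between neighbouring tiles.
    skew: extra x-shift per row, mimicking a systematic stage-positioning
    error or an uncalibrated scan rotation (hypothetical parameter).
    """
    step_x = tile_w * (1.0 - overlap_x)
    step_y = tile_h * (1.0 - overlap_y)
    return {(r, c): (c * step_x + r * skew, r * step_y)
            for r in range(n_rows) for c in range(n_cols)}

def stitch_points(tiles, origins):
    """Shift each tile's (x, y, ...) scan points into the global frame
    and concatenate them into one data set."""
    out = []
    for key, pts in tiles.items():
        ox, oy = origins[key]
        shifted = pts.copy()
        shifted[:, 0] += ox
        shifted[:, 1] += oy
        out.append(shifted)
    return np.vstack(out)
```

In the real tool the remaining work is format handling (*.osc / *.ang / *.ctf) and de-duplicating points in the overlap bands, which this sketch omits.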

  4. Atmospheric photochemistry of aromatic hydrocarbons: OH budgets during SAPHIR chamber experiments

    Science.gov (United States)

    Nehr, S.; Bohn, B.; Dorn, H.-P.; Fuchs, H.; Häseler, R.; Hofzumahaus, A.; Li, X.; Rohrer, F.; Tillmann, R.; Wahner, A.

    2014-07-01

Current photochemical models developed to simulate the atmospheric degradation of aromatic hydrocarbons tend to underestimate OH radical concentrations. In order to analyse OH budgets, we performed experiments with benzene, toluene, p-xylene and 1,3,5-trimethylbenzene in the atmosphere simulation chamber SAPHIR. Experiments were conducted under low-NO conditions (typically 0.1-0.2 ppb) and high-NO conditions (typically 7-8 ppb), with starting concentrations of 6-250 ppb of aromatics, depending on their OH rate constants. For the OH budget analysis a steady-state approach was applied in which OH production and destruction rates (POH and DOH) have to be equal. The POH were determined from measurements of HO2, NO, HONO, and O3 concentrations, considering OH formation by photolysis and recycling from HO2. The DOH were calculated from measurements of the OH concentrations and total OH reactivities. The OH budgets were determined from DOH/POH ratios. The accuracy and reproducibility of the approach were assessed in several experiments using CO as a reference compound, where an average ratio DOH/POH = 1.13 ± 0.19 was obtained. In experiments with aromatics, these ratios ranged from 1.1 to 1.6 under low-NO conditions and from 0.9 to 1.2 under high-NO conditions. The results indicate that OH budgets during photo-oxidation experiments with aromatics are balanced within experimental accuracies. Inclusion of a further, recently proposed OH production via HO2 + RO2 reactions led to improvements under low-NO conditions, but the differences were small and insignificant within the experimental errors.
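
The steady-state bookkeeping can be illustrated with a toy calculation: OH production from primary (photolytic) sources plus HO2 + NO recycling is balanced against destruction, given by the measured total OH reactivity times the OH concentration. All input values below are invented for illustration; only the HO2 + NO rate constant is an approximate 298 K literature value, and the full analysis sums several photolytic terms that are lumped into one number here.

```python
K_HO2_NO = 8.1e-12   # cm3 molecule-1 s-1, HO2 + NO -> OH + NO2 (~298 K)

def oh_budget(p_primary, ho2, no, oh, k_oh_total):
    """Return (P_OH, D_OH, D_OH/P_OH) in molecules cm-3 s-1.

    p_primary: OH production from photolysis (e.g. HONO, O3) [cm-3 s-1]
    ho2, no, oh: measured concentrations [molecules cm-3]
    k_oh_total: measured total OH reactivity [s-1]
    """
    p_oh = p_primary + K_HO2_NO * ho2 * no   # primary + recycled OH
    d_oh = k_oh_total * oh                   # reactivity x [OH]
    return p_oh, d_oh, d_oh / p_oh

# illustrative chamber-like numbers, not measurements from the paper
p, d, ratio = oh_budget(p_primary=1.0e6,
                        ho2=3.0e8, no=2.5e9,
                        oh=4.0e6, k_oh_total=2.0)
```

A ratio near 1 means the budget closes within the stated uncertainties, which is the criterion the CO reference experiments establish.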

  5. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council (SAMRC) and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n = 1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  6. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting

    Directory of Open Access Journals (Sweden)

    Francisco José Cereceda-Sánchez

Objective: to evaluate the usefulness of capnography for the detection of metabolic changes in spontaneous breathing patients, in the emergency and intensive care settings. Methods: in-depth and structured bibliographical search in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. Results: 19 studies were found; two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), eight with bicarbonate (HCO3), three with lactate, and four with blood pH. Conclusions: most studies have found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter to detect patients at risk of severe metabolic change, in a fast, economical and accurate way.

  7. The L3+C detector, a unique tool-set to study cosmic rays

    Energy Technology Data Exchange (ETDEWEB)

    Adriani, O.; Akker, M. van den; Banerjee, S.; Baehr, J.; Betev, B.; Bourilkov, D.; Bottai, S.; Bobbink, G.; Cartacci, A.; Chemarin, M.; Chen, G.; Chen, H.S.; Chiarusi, T.; Dai, C.J.; Ding, L.K.; Duran, I.; Faber, G.; Fay, J.; Grabosch, H.J.; Groenstege, H.; Guo, Y.N.; Gupta, S.; Haller, Ch.; Hayashi, Y.; He, Z.X.; Hebbeker, T.; Hofer, H.; Hoferjun, H.; Huo, A.X.; Ito, N.; Jing, C.L.; Jones, L.; Kantserov, V.; Kawakami, S.; Kittel, W.; Koenig, A.C.; Kok, E.; Korn, A.; Kuang, H.H.; Kuijpers, J.; Ladron de Guevara, P.; Le Coultre, P. E-mail: pierre.le.coultre@cern.ch; Lei, Y.; Leich, H.; Leiste, R.; Li, D.; Li, L.; Li, Z.C.; Liu, Z.A.; Liu, H.T.; Lohmann, W.; Lu, Y.S.; Ma, X.H.; Ma, Y.Q.; Mil, A. van; Monteleoni, B.; Nahnhauer, R.; Pauss, F.; Parriaud, J.-F.; Petersen, B.; Pohl, M.; Qing, C.R.; Ramelli, R.; Ravindran, K.C.; Rewiersma, P.; Rojkov, A.; Saidi, R.; Schmitt, V.; Schoeneich, B.; Schotanus, D.J.; Shen, C.Q.; Sulanke, H.; Tang, X.W.; Timmermans, C.; Tonwar, S.; Trowitzsch, G.; Unger, M.; Verkooijen, H.; Wang, X.L.; Wang, X.W.; Wang, Z.M.; Wijk, R. van; Wijnen, Th.A.M.; Wilkens, H.; Xu, Y.P.; Xu, Z.Z.; Yang, C.G.; Yang, X.F.; Yao, Z.G.; Yu, Z.Q.; Zhang, S.; Zhu, G.Y.; Zhu, Q.Q.; Zhuang, H.L.; Zwart, A.N.M

    2002-08-01

The L3 detector at the CERN electron-positron collider, LEP, has been employed for the study of cosmic ray muons. The muon spectrometer of L3 consists of a set of high-precision drift chambers installed inside a magnet with a volume of about 1000 m³ and a field of 0.5 T. Muon momenta are measured with a resolution of a few percent at 50 GeV. The detector is located under 30 m of overburden. A scintillator air shower array of 54 m by 30 m is installed on the roof of the surface hall above L3 in order to estimate the energy and the core position of the shower associated with a sample of detected muons. Thanks to the unique properties of the L3+C detector, muon research topics relevant to various current problems in cosmic ray and particle astrophysics can be studied.

  8. Evaluating cognitive impairment in the clinical setting: practical screening and assessment tools.

    Science.gov (United States)

    Valcour, Victor G

    2011-12-01

    HIV-associated neurocognitive disorders (HAND) remain a substantial problem in the era of combination antiretroviral therapy. Neither the Mini Mental State Exam nor the HIV Dementia Scale is sufficiently sensitive for HAND. The Montreal Cognitive Assessment shows promise, but current data suggest that adding an additional test will be needed to improve sensitivity for the clinical setting. Patient reporting of symptoms is insensitive as most cases of HAND are asymptomatic. Examination of cerebrospinal fluid (CSF) is sometimes warranted in select patients to evaluate for CSF HIV RNA detectability. CSF escape of virus, when CSF HIV RNA is detectable but plasma HIV RNA is not, appears to be a relatively uncommon event in the clinical setting where the level of detectability for typical clinical assays is around 50 copies/mL. In cases of CSF escape, cognitive improvement has been linked to changes in antiretroviral regimens that are aimed at either overcoming antiretroviral resistance or improving central nervous system (CNS) penetration-effectiveness. Currently, for most patients with HAND in the absence of unusual features, there are insufficient data for a recommendation to routinely intensify therapy with a neurointensive antiretroviral regimen; however, there is considerable uncertainty given emerging data and variability in approach among experts in the field. This article summarizes a case-based presentation by Victor G. Valcour, MD, at the 14th Annual Clinical Conference for the Ryan White HIV/AIDS Program held in Tampa, Florida, in June 2011. The Clinical Conference is sponsored by the IAS-USA under the Health Resources and Services Administration (HRSA) contract number HHSH250200900010C.

  9. BMPOS: a Flexible and User-Friendly Tool Sets for Microbiome Studies.

    Science.gov (United States)

    Pylro, Victor S; Morais, Daniel K; de Oliveira, Francislon S; Dos Santos, Fausto G; Lemos, Leandro N; Oliveira, Guilherme; Roesch, Luiz F W

    2016-08-01

Recent advances in science and technology are leading to a revision and re-orientation of methodologies, addressing old and current issues under a new perspective. Advances in next generation sequencing (NGS) are allowing comparative analysis of the abundance and diversity of whole microbial communities, generating a large amount of data and findings at a systems level. The current limitation for biologists has been the increasing demand for computational power and training required for the processing of NGS data. Here, we describe the deployment of the Brazilian Microbiome Project Operating System (BMPOS), a flexible and user-friendly Linux distribution dedicated to microbiome studies. The Brazilian Microbiome Project (BMP) has developed data analysis pipelines for metagenomic studies (phylogenetic marker genes), conducted using the two main high-throughput sequencing platforms (Ion Torrent and Illumina MiSeq). The BMPOS is freely available and includes all the bioinformatics packages and databases required to perform the pipelines suggested by the BMP team. The BMPOS may be used as a bootable live USB stick or installed on any computer with at least a 1 GHz CPU and 512 MB RAM, regardless of the operating system previously installed. The BMPOS has proved to be effective for sequence processing, sequence clustering, alignment, taxonomic annotation, statistical analysis, and plotting of metagenomic data. The BMPOS has been used during several metagenomic analysis courses, proving valuable as a training tool and an excellent starting point for anyone interested in performing metagenomic studies. The BMPOS and its documentation are available at http://www.brmicrobiome.org .

  10. Coupling CFAST fire modeling and SAPHIRE probabilistic assessment software for internal fire safety evaluation of a typical TRIGA research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Safaei Arshi, Saiedeh [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Nematollahi, Mohammadreza, E-mail: nema@shirazu.ac.i [School of Engineering, Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Safety Research Center of Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of); Sepanloo, Kamran [Safety Research Center of Shiraz University, 71348-51154 Shiraz (Iran, Islamic Republic of)

    2010-03-15

    Due to the significant threat of internal fires for the safety operation of nuclear reactors, presumed fire scenarios with potential hazards for loss of typical research reactor safety functions are analyzed by coupling CFAST fire modeling and SAPHIRE probabilistic assessment software. The investigations show that fire hazards associated with electrical cable insulation, lubricating oils, diesel, electrical equipment and carbon filters may lead to unsafe situations called core damage states. Using system-specific event trees, the occurrence frequency of core damage states after the occurrence of each possible fire scenario in critical fire compartments is evaluated. Probability that the fire ignited in the given fire compartment will burn long enough to cause the extent of damage defined by each fire scenario is calculated by means of detection-suppression event tree. As a part of detection-suppression event trees quantification, and also for generating the necessary input data for evaluating the frequency of core damage states by SAPHIRE 7.0 software, CFAST fire modeling software is applied. The results provide a probabilistic measure of the quality of existing fire protection systems in order to maintain the reactor at a reasonable safety level.
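
The quantification scheme described above reduces, for each fire scenario, to a product of an ignition frequency, the probability that the fire defeats each detection/suppression barrier (from the detection-suppression event tree), and a conditional core damage probability from the system event trees. A hedged sketch, with all numbers invented for illustration (they are not values from the paper, CFAST, or SAPHIRE):

```python
def scenario_cdf(f_ignition, barrier_failure_probs, ccdp):
    """Core-damage-state frequency (per year) for one fire scenario.

    f_ignition: fire ignition frequency in the compartment [1/yr]
    barrier_failure_probs: failure probabilities of successive
        detection/suppression barriers the fire must defeat
    ccdp: conditional core damage probability given the fire damage
    """
    p_burn = 1.0
    for p_fail in barrier_failure_probs:
        p_burn *= p_fail          # fire must outlast every barrier
    return f_ignition * p_burn * ccdp

# e.g. a hypothetical cable-insulation fire in a critical compartment:
# 1e-2 fires/yr, detection fails 5% of the time, suppression 10%,
# CCDP 0.2 -> roughly 1e-5 core damage states per year
freq = scenario_cdf(f_ignition=1e-2,
                    barrier_failure_probs=[0.05, 0.1],
                    ccdp=0.2)
```

In the actual study, CFAST supplies the timing inputs that determine the barrier failure probabilities, and SAPHIRE performs the event-tree quantification this function caricatures.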

  11. Validation of Nurse Practitioner Primary Care Organizational Climate Questionnaire: A New Tool to Study Nurse Practitioner Practice Settings.

    Science.gov (United States)

    Poghosyan, Lusine; Chaplin, William F; Shaffer, Jonathan A

    2017-04-01

Favorable organizational climate in primary care settings is necessary to expand the nurse practitioner (NP) workforce and promote their practice. Only one NP-specific tool, the Nurse Practitioner Primary Care Organizational Climate Questionnaire (NP-PCOCQ), measures NP organizational climate. We confirmed NP-PCOCQ's factor structure and established its predictive validity. A cross-sectional survey design was used to collect data from 314 NPs in Massachusetts in 2012. Confirmatory factor analysis and regression models were used. The 4-factor model characterized NP-PCOCQ. The NP-PCOCQ score predicted job satisfaction (beta = .36; p organizational climate in their clinics. Further testing of NP-PCOCQ is needed.

  12. PetroPlot: A plotting and data management tool set for Microsoft Excel

    Science.gov (United States)

    Su, Yongjun; Langmuir, Charles H.; Asimow, Paul D.

    2003-03-01

PetroPlot is a 4000-line software code written in Visual Basic for the spreadsheet program Excel that automates plotting and data management tasks for large amounts of data. The major plotting functions include: automation of large numbers of multiseries XY plots; normalized diagrams (e.g., spider diagrams); replotting of any complex formatted diagram with multiple series for any other axis parameters; addition of customized labels for individual data points; and labeling of flexible log scale axes. Other functions include: assignment of groups for samples based on multiple customized criteria; removal of nonnumeric values; calculation of averages/standard deviations; calculation of correlation matrices; deletion of nonconsecutive rows; and compilation of multiple rows of data for a single sample into single rows appropriate for plotting. A cubic spline function permits curve fitting to complex time series, and comparison of data to the fits. For users of Excel, PetroPlot increases the efficiency of data manipulation and visualization by orders of magnitude and allows exploration of large data sets that would not be possible making plots individually. The source codes are open to all users.
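
The spline-fitting and correlation-matrix functions mentioned above can be sketched outside Excel as well; below is a hedged Python analogue using scipy in place of PetroPlot's Visual Basic routines, with invented sample data standing in for a geochemical series.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical measurements of one variable along a traverse
# (illustrative values only, not data from the paper).
x = np.array([0.0, 1.0, 2.5, 4.0, 6.0, 9.0])
y = np.array([1.2, 1.8, 1.1, 2.4, 2.0, 2.9])

# Cubic spline fit and a dense curve for plotting / comparison to data.
spline = CubicSpline(x, y)
grid = np.linspace(x[0], x[-1], 200)
fit = spline(grid)

# Residuals of the data against the fitted curve (zero at the knots,
# since an interpolating spline passes through every data point).
residuals = y - spline(x)

# Correlation matrix across several sample columns (rows = samples);
# the second column is a linear transform of the first, so r = 1.
elements = np.column_stack([y, 2.0 * y + 0.1, np.sqrt(y)])
corr = np.corrcoef(elements, rowvar=False)
```

For smoothing rather than interpolation (closer to fitting a noisy time series), a smoothing spline such as scipy's `make_smoothing_spline` or a fit with fewer knots would be the natural substitute.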

  13. Limitations of Extending Juvenile Psychopathy Research Assessment Tools and Methods to Forensic Settings

    Directory of Open Access Journals (Sweden)

    Robert Semel

    2015-10-01

Numerous studies have demonstrated significant associations between psychopathic-like characteristics in youths, which include affective, interpersonal, and behavioral dimensions, and severe and persisting conduct problems, violence, aggression, and antisocial behavior. The results of such studies have had practical implications for assessment, with respect to the added specifier, “with limited prosocial emotions”, for the diagnosis of conduct disorder, and to the inclusion of features that have been described as callous-unemotional (CU) traits on empirically supported juvenile risk assessment instruments. However, it is questionable whether findings obtained through research which provides anonymity and/or confidentiality to research participants are comparable across research and applied settings, and whether self-report youth psychopathy or CU questionnaire measures, which are vulnerable to deception, can effectively measure the same latent constructs in applied contexts. Forensic mental health evaluators attempt to obtain optimally consistent and reliable information through the use of multiple sources of information and through multiple assessment methods. Current practices of forensic mental health evaluators with regard to risk assessment and assessment of juvenile psychopathy are referenced. Some observations and suggestions are made to help meet the challenge of bridging the gap between research that utilizes self-report measures and applied forensic contexts.

  14. Measuring situation awareness in emergency settings: a systematic review of tools and outcomes.

    Science.gov (United States)

    Cooper, Simon; Porter, Joanne; Peach, Linda

    2014-01-01

Nontechnical skills have an impact on health care outcomes and improve patient safety. Situation awareness is a core skill, based on the view that an understanding of the environment will influence decision-making and performance. This paper reviews and describes indirect and direct measures of situation awareness applicable for emergency settings. Electronic databases and search engines were searched from 1980 to 2010, including CINAHL, Ovid Medline, Pro-Quest, Cochrane, and the search engine Google Scholar. Access strategies included keyword, author, and journal searches. Publications identified were assessed for relevance, and analyzed and synthesized using Oxford evidence levels and the Critical Appraisal Skills Programme guidelines in order to assess their quality and rigor. One hundred and thirteen papers were initially identified, and reduced to 55 following title and abstract review. The final selection included 14 papers drawn from the fields of emergency medicine, intensive care, anesthetics, and surgery. Ten of these discussed four general nontechnical skill measures (including situation awareness) and four incorporated the Situation Awareness Global Assessment Technique. A range of direct and indirect techniques for measuring situation awareness is available. In the medical literature, indirect approaches are the most common, with situation awareness measured as part of a nontechnical skills assessment. In simulation-based studies, situation awareness in emergencies tends to be suboptimal, indicating the need for improved training techniques to enhance awareness and improve decision-making.

  15. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

Surgery carries a high risk of the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to the general surgery department of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare the two tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  16. Measuring situation awareness in emergency settings: a systematic review of tools and outcomes

    Directory of Open Access Journals (Sweden)

    Cooper S

    2013-12-01

Simon Cooper,1,2 Joanne Porter,3 Linda Peach4 (1 School of Nursing and Midwifery, Monash University, Berwick, VIC, Australia; 2 School of Nursing and Midwifery, University of Brighton, Brighton, UK; 3 School of Nursing and Midwifery, Monash University, Gippsland, VIC, Australia; 4 School of Nursing and Midwifery, Monash University, Melbourne, VIC, Australia). Background: Nontechnical skills have an impact on health care outcomes and improve patient safety. Situation awareness is a core skill, based on the view that an understanding of the environment will influence decision-making and performance. This paper reviews and describes indirect and direct measures of situation awareness applicable for emergency settings. Methods: Electronic databases and search engines were searched from 1980 to 2010, including CINAHL, Ovid Medline, Pro-Quest, Cochrane, and the search engine Google Scholar. Access strategies included keyword, author, and journal searches. Publications identified were assessed for relevance, and analyzed and synthesized using Oxford evidence levels and the Critical Appraisal Skills Programme guidelines in order to assess their quality and rigor. Results: One hundred and thirteen papers were initially identified, and reduced to 55 following title and abstract review. The final selection included 14 papers drawn from the fields of emergency medicine, intensive care, anesthetics, and surgery. Ten of these discussed four general nontechnical skill measures (including situation awareness) and four incorporated the Situation Awareness Global Assessment Technique. Conclusion: A range of direct and indirect techniques for measuring situation awareness is available. In the medical literature, indirect approaches are the most common, with situation awareness measured as part of a nontechnical skills assessment. In simulation-based studies, situation awareness in emergencies tends to be suboptimal, indicating the need for improved training techniques to enhance awareness and

  17. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave sensor using multiple scattering radiative transfer model for data assimilation applications

    Indian Academy of Sciences (India)

    A Madhulatha; John P George; E N Rajagopal

    2017-03-01

Incorporation of cloud- and precipitation-affected radiances from microwave satellite sensors in a data assimilation system has great potential for improving the accuracy of numerical model forecasts over regions of high impact weather. By employing the multiple scattering radiative transfer model RTTOV-SCATT, all-sky radiance (clear sky and cloudy sky) simulation has been performed for the six-channel microwave SAPHIR (Sounder for Atmospheric Profiling of Humidity in the Inter-tropics by Radiometry) sensor of the Megha-Tropiques (MT) satellite. To investigate the importance of cloud-affected radiance data in severe weather conditions, all-sky radiance simulation is carried out for the severe cyclonic storm ‘Hudhud’ formed over the Bay of Bengal. Hydrometeors from NCMRWF unified model (NCUM) forecasts are used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. Horizontal and vertical distribution of all-sky simulated radiances agrees reasonably well with the SAPHIR observed radiances over cloudy regions during different stages of cyclone development. Simulated brightness temperatures of the six SAPHIR channels indicate that the three-dimensional humidity structure of the tropical cyclone is well represented in all-sky computations. Improved correlation and reduced bias and root mean square error against SAPHIR observations are apparent. Probability distribution functions reveal that all-sky simulations are able to produce the cloud-affected lower brightness temperatures associated with cloudy regions. The density scatter plots infer that all-sky radiances are more consistent with observed radiances. Correlation between different types of hydrometeors and simulated brightness temperatures at respective atmospheric levels highlights the significance of inclusion of scattering effects from different hydrometeors in simulating the cloud-affected radiances in all-sky simulations. The results are promising and suggest that the inclusion of multiple scattering

  18. All-sky radiance simulation of Megha-Tropiques SAPHIR microwave sensor using multiple scattering radiative transfer model for data assimilation applications

    Science.gov (United States)

    Madhulatha, A.; George, John P.; Rajagopal, E. N.

    2017-03-01

    Incorporation of cloud- and precipitation-affected radiances from microwave satellite sensors in data assimilation system has a great potential in improving the accuracy of numerical model forecasts over the regions of high impact weather. By employing the multiple scattering radiative transfer model RTTOV-SCATT, all-sky radiance (clear sky and cloudy sky) simulation has been performed for six channel microwave SAPHIR (Sounder for Atmospheric Profiling of Humidity in the Inter-tropics by Radiometry) sensors of Megha-Tropiques (MT) satellite. To investigate the importance of cloud-affected radiance data in severe weather conditions, all-sky radiance simulation is carried out for the severe cyclonic storm `Hudhud' formed over Bay of Bengal. Hydrometeors from NCMRWF unified model (NCUM) forecasts are used as input to the RTTOV model to simulate cloud-affected SAPHIR radiances. Horizontal and vertical distribution of all-sky simulated radiances agrees reasonably well with the SAPHIR observed radiances over cloudy regions during different stages of cyclone development. Simulated brightness temperatures of six SAPHIR channels indicate that the three dimensional humidity structure of tropical cyclone is well represented in all-sky computations. Improved correlation and reduced bias and root mean square error against SAPHIR observations are apparent. Probability distribution functions reveal that all-sky simulations are able to produce the cloud-affected lower brightness temperatures associated with cloudy regions. The density scatter plots infer that all-sky radiances are more consistent with observed radiances. Correlation between different types of hydrometeors and simulated brightness temperatures at respective atmospheric levels highlights the significance of inclusion of scattering effects from different hydrometeors in simulating the cloud-affected radiances in all-sky simulations. The results are promising and suggest that the inclusion of multiple scattering

  19. Overview of the SOFIA Data Cycle System: An integrated set of tools and services for the SOFIA General Investigator

    CERN Document Server

    Shuping, R Y; Lin, Lan; Sun, Li; Krzaczek, Robert

    2013-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprising a 2.5 meter infrared telescope mounted in the aft section of a Boeing 747SP aircraft that flies at operational altitudes between 37,000 and 45,000 feet, above 99% of atmospheric water vapor. During routine operations, a host of instruments will be available to the astronomical community, including cameras and spectrographs in the near- to far-IR; a sub-mm heterodyne receiver; and a high-speed occultation imager. One of the challenges for SOFIA (and all observatories in general) is providing a uniform set of tools that enable the non-expert General Investigator (GI) to propose, plan, and obtain observations using a variety of very different instruments in an easy and seamless manner. The SOFIA Data Cycle System (DCS) is an integrated set of services and user tools for the SOFIA Science and Mission Operations GI Program designed to address this challenge. Program activities supported by the DCS inclu...

  20. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting.

    Science.gov (United States)

    Cereceda-Sánchez, Francisco José; Molina-Mula, Jesús

    2017-05-15

    To evaluate the usefulness of capnography for the detection of metabolic changes in spontaneously breathing patients in the emergency and intensive care settings. An in-depth, structured bibliographical search was performed in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. Nineteen studies were found; two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), in eight with bicarbonate (HCO3), in three with lactate, and in four with blood pH. Most studies found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter to detect patients at risk of severe metabolic change in a fast, economical and accurate way.

  1. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  2. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  3. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting

    Science.gov (United States)

    Siegelmann-Danieli, Nava; Farkash, Ariel; Katzir, Itzhak; Vesterman Landes, Janet; Rotem Rabinovich, Hadas; Lomnicky, Yossef; Carmeli, Boaz; Parush-Shear-Yashuv, Naama

    2016-01-01

    Background: Randomized clinical trials constitute the gold-standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. Methods: The developed tool enables extraction of any computerized information including comorbidities and use of drugs (oncological/non-oncological) per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first-line between 9/2006 and 12/2013. Results: The analysis included 753 patients of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months; median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p < 0.002; notably, these patients were older). Patients who received both FP-O- and FP-I-based regimens achieved numerically longer OS vs. those who received only one of these regimens (22.1 [19.9–24.0] vs. 18.9 [15.5–21.9] months). Among patients assessed for wild-type KRAS and treated with a subsequent anti-EGFR agent, OS was 25.4 months and 18.7 months for 124 treated vs. 37 non-treated patients (non-significant). Cox analysis (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes including concurrent use of

  4. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Directory of Open Access Journals (Sweden)

    Nava Siegelmann-Danieli

    Randomized clinical trials constitute the gold-standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information including comorbidities and use of drugs (oncological/non-oncological) per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first-line between 9/2006 and 12/2013. The analysis included 753 patients of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months; median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p < 0.002; notably, these patients were older). Patients who received both FP-O- and FP-I-based regimens achieved numerically longer OS vs. those who received only one of these regimens (22.1 [19.9-24.0] vs. 18.9 [15.5-21.9] months). Among patients assessed for wild-type KRAS and treated with a subsequent anti-EGFR agent, OS was 25.4 months and 18.7 months for 124 treated vs. 37 non-treated patients (non-significant). Cox analysis (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes including concurrent use of diuretics and proton

  5. Evaluating the Auto-MODS assay, a novel tool for tuberculosis diagnosis for use in resource-limited settings.

    Science.gov (United States)

    Wang, Linwei; Mohammad, Sohaib H; Chaiyasirinroje, Boonchai; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2015-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
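The sensitivity and specificity quoted above follow directly from the reported counts (212 of 221 true positives and 135 of 139 true negatives correctly identified). A small sketch reproducing the point estimates, with a Wilson score interval standing in for the abstract's confidence intervals (which were likely exact binomial, so the endpoints differ slightly):

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Counts reported in the abstract.
sens = 212 / 221   # true positives detected -> 95.9%
spec = 135 / 139   # true negatives detected -> 97.1%

print(f"sensitivity = {sens:.1%}, 95% CI ~ {wilson_ci(212, 221)}")
print(f"specificity = {spec:.1%}, 95% CI ~ {wilson_ci(135, 139)}")
```

For 212/221 the Wilson interval is roughly 92.4% to 97.8%, close to the 92.4% to 98.1% reported.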

  6. OH regeneration from methacrolein oxidation investigated in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Fuchs, H.; Acir, I.-H.; Bohn, B.; Brauers, T.; Dorn, H.-P.; Häseler, R.; Hofzumahaus, A.; Holland, F.; Kaminski, M.; Li, X.; Lu, K.; Lutz, A.; Nehr, S.; Rohrer, F.; Tillmann, R.; Wegener, R.; Wahner, A.

    2014-08-01

    Hydroxyl radicals (OH) are the most important reagent for the oxidation of trace gases in the atmosphere. OH concentrations measured during recent field campaigns in isoprene-rich environments were unexpectedly large. A number of studies showed that unimolecular reactions of organic peroxy radicals (RO2) formed in the initial reaction step of isoprene with OH play an important role for the OH budget in the atmosphere at low mixing ratios of nitrogen monoxide (NO) of less than 100 pptv. It has also been suggested that similar reactions potentially play an important role for RO2 from other compounds. Here, we investigate the oxidation of methacrolein (MACR), one major oxidation product of isoprene, by OH in experiments in the simulation chamber SAPHIR under controlled atmospheric conditions. The experiments show that measured OH concentrations are approximately 50% larger than calculated by the Master Chemical Mechanism (MCM) for conditions of the experiments (NO mixing ratio of 90 pptv). The analysis of the OH budget reveals an OH source that is not accounted for in MCM, which is correlated with the production rate of RO2 radicals from MACR. In order to balance the measured OH destruction rate, 0.77 OH radicals (1σ error: ± 0.31) need to be additionally reformed from each reaction of OH with MACR. The strong correlation of the missing OH source with the production of RO2 radicals is consistent with the concept of OH formation from unimolecular isomerization and decomposition reactions of RO2. The comparison of observations with model calculations gives a lower limit of 0.03 s-1 for the reaction rate constant if the OH source is attributed to an isomerization reaction of MACR-1-OH-2-OO and MACR-2-OH-2-OO formed in the MACR + OH reaction as suggested in the literature (Crounse et al., 2012). This fast isomerization reaction would be a competitor to the reaction of this RO2 species with a minimum of 150 pptv NO. The isomerization reaction would be the dominant
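The claim that an isomerization with a rate constant of at least 0.03 s-1 competes with the RO2 + NO reaction at about 150 pptv NO can be checked with a back-of-envelope calculation. The RO2 + NO rate constant and surface air number density below are typical values assumed for illustration, not taken from the paper:

```python
# Back-of-envelope consistency check: a unimolecular RO2 isomerization with
# k_iso = 0.03 s^-1 (the paper's lower limit) competes with RO2 + NO when the
# pseudo-first-order NO loss rate k_NO * [NO] is comparable in magnitude.
N_AIR = 2.46e19      # molecules cm^-3, surface air near 298 K (assumed)
K_RO2_NO = 9.0e-12   # cm^3 molecule^-1 s^-1, typical RO2 + NO rate (assumed)
k_iso = 0.03         # s^-1, lower limit from the chamber analysis

no_pptv = 150.0
no_conc = N_AIR * no_pptv * 1e-12     # NO number density, molecules cm^-3
loss_via_no = K_RO2_NO * no_conc      # pseudo-first-order loss rate, s^-1

print(f"RO2 + NO loss at {no_pptv:.0f} pptv NO: {loss_via_no:.3f} s^-1 "
      f"vs k_iso = {k_iso} s^-1")
```

The two rates come out within a few percent of each other, consistent with the abstract's statement that 150 pptv NO marks the crossover.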

  7. The spreadsheet as a tool for teaching set theory: Part 1 – an Excel lesson plan to help solve Sudokus

    Directory of Open Access Journals (Sweden)

    Stephen J Sugden

    2008-04-01

    This paper is intended to be used in the classroom. It describes essentially every step of the construction of an Excel model to help solve Sudoku puzzles. For those up to moderate difficulty, it will usually solve the puzzle to completion. For the more difficult ones, it still provides a platform for decision support. The paper may be found useful for a lesson in which students who have some basic knowledge of Excel learn some of its lesser-known features, such as conditional formatting. It also generates a useful tool for working with Sudoku puzzles, from the very easiest right up to the ones often labelled as fiendish or diabolical. Fundamental mathematical concepts such as set intersection, set partition and reduction of set partitions to singletons are very graphically illustrated by the present Excel model for Sudoku. Prominent spreadsheet concepts presented here are conditional formatting, names, COUNTIF, and CONCATENATE. The paper is accompanied by a completed Excel model, constructed by using the steps described herein. No VBA code is employed; the whole thing is done with Excel formulas and conditional formatting.
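The set-theoretic core of the spreadsheet model (a candidate set per cell, elimination by set difference, reduction to singletons) can be sketched in a few lines. This illustrative Python version mirrors the logic the Excel formulas implement; it is not the paper's actual workbook:

```python
# Naked-single elimination: each empty cell's candidate set is {1..9} minus
# the values already fixed in its row, column, and 3x3 box; any cell whose
# candidate set reduces to a singleton is filled, and the pass repeats.
def solve_by_elimination(grid):
    """grid: 9x9 list of lists, 0 = empty cell. Mutates and returns grid."""
    changed = True
    while changed:
        changed = False
        for r in range(9):
            for c in range(9):
                if grid[r][c]:
                    continue
                row = set(grid[r])
                col = {grid[i][c] for i in range(9)}
                br, bc = 3 * (r // 3), 3 * (c // 3)
                box = {grid[i][j] for i in range(br, br + 3)
                                  for j in range(bc, bc + 3)}
                cands = set(range(1, 10)) - (row | col | box)  # set difference
                if len(cands) == 1:          # singleton: the cell is forced
                    grid[r][c] = cands.pop()
                    changed = True
    return grid
```

As the paper notes for its Excel counterpart, this elimination alone completes easy-to-moderate puzzles; harder ones are left partially reduced, which is where the decision-support role comes in.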

  8. The e-Reader — an Educational or an Entertainment Tool? e-Readers in an Academic Setting

    Directory of Open Access Journals (Sweden)

    Peter Ahlroos

    2012-01-01

    In this paper the authors will discuss a pilot project conducted at the Tritonia Academic Library, Vaasa, in Finland, from September 2010 until May 2011. The project was designed to investigate the application of e-readers in academic settings and to learn how teachers and students experience the use of e-readers in academic education. Four groups of students and one group of teachers used Kindle readers for varied periods of time in different courses. The course material and the textbooks were downloaded on the e-readers. The feedback from the participants was collected through questionnaires and teacher interviews. The results suggest that the e-reader is a future tool for learning, though some features need to be improved before e-readers can really enable efficient learning and researching.

  9. Validating a set of tools designed to assess the perceived quality of training of pediatric residency programs.

    Science.gov (United States)

    Da Dalt, Liviana; Anselmi, Pasquale; Furlan, Sara; Carraro, Silvia; Baraldi, Eugenio; Robusto, Egidio; Perilongo, Giorgio

    2015-01-20

    The Paediatric Residency Program (PRP) of Padua, Italy, developed a set of questionnaires to assess the quality of the training provided by each faculty member, the quality of the professional experience the residents gained during the various rotations, and the functioning of the Resident Affair Committee (RAC), named respectively the "Tutor Assessment Questionnaire" (TAQ), the "Rotation Assessment Questionnaire" (RAQ), and the "RAC Assessment Questionnaire". The process that led to their validation is presented herein. Between July 2012 and July 2013, 51 residents evaluated 26 tutors through the TAQ, and 25 rotations through the RAQ. Forty-eight residents filled in the RAC Assessment Questionnaire. The three questionnaires were validated through a many-facet Rasch measurement analysis. In their final form, the questionnaires produced measures that were valid, reliable, unidimensional, and free from gender biases. The TAQ and RAQ distinguished tutors and rotations into 5-6 levels of different quality and effectiveness. The three questionnaires allowed the identification of strengths and weaknesses of tutors, rotations, and the RAC. The agreement observed among judges was coherent with the predicted values, suggesting that no particular training is required for developing a shared interpretation of the items. The work presented herein serves to enrich the armamentarium of tools that resident medical programs can use to monitor their functioning. A larger application of these tools will serve to consolidate and further refine the results presented.

  10. Integrated Variable-Fidelity Tool Set for Modeling and Simulation of Aeroservothermoelasticity-Propulsion (ASTE-P) Effects for Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  11. Integrated Variable-Fidelity Tool Set For Modeling and Simulation of Aeroservothermoelasticity -Propulsion (ASTE-P) Effects For Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  12. Isotope effect in the formation of H2 from H2CO studied at the atmospheric simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    R. Koppmann

    2010-06-01

    Formaldehyde of known, near-natural isotopic composition was photolyzed in the SAPHIR atmosphere simulation chamber under ambient conditions. The isotopic composition of the product H2 was used to determine the isotope effects in formaldehyde photolysis. The experiments are sensitive to the molecular photolysis channel; the radical channel has only an indirect effect and cannot be effectively constrained. The molecular-channel kinetic isotope effect KIEmol, the ratio of photolysis frequencies j(HCHO→CO+H2)/j(HCDO→CO+HD) at surface pressure, is determined to be KIEmol = 1.63 (+0.038/−0.046). This is similar to the kinetic isotope effect for the total removal of HCHO from a recent relative rate experiment (KIEtot = 1.58 ± 0.03), which indicates that the KIEs in the molecular and radical photolysis channels at surface pressure (≈100 kPa) may not be as different as described previously in the literature.

  13. Analysis of Tool Setting Methods and Cases for a Milling Machining Center

    Institute of Scientific and Technical Information of China (English)

    李志梅

    2012-01-01

    Tool setting and the setting of the tool length offset are important steps before NC milling. Different tool setting methods result in different settings of the tool length offset. Taking the FANUC-0iM NC system as an example and working through concrete part-machining cases, this paper describes in detail the various tool setting methods for a milling machining center and the corresponding settings of the tool length offset.

  14. SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout) for low dose x-ray imaging: spatial resolution.

    Science.gov (United States)

    Li, Dan; Zhao, Wei

    2008-07-01

    An indirect flat panel imager (FPI) with programmable avalanche gain and field emitter array (FEA) readout is being investigated for low-dose and high-resolution x-ray imaging. It is made by optically coupling a structured x-ray scintillator, e.g., thallium (Tl) doped cesium iodide (CsI), to an amorphous selenium (a-Se) avalanche photoconductor called high-gain avalanche rushing amorphous photoconductor (HARP). The charge image created by the scintillator/HARP (SHARP) combination is read out by the electron beams emitted from the FEA. The proposed detector is called scintillator avalanche photoconductor with high resolution emitter readout (SAPHIRE). The programmable avalanche gain of HARP can improve the low-dose performance of indirect FPI, while the FEA can be made with pixel sizes down to 50 μm. Because of the avalanche gain, a high-resolution type of CsI (Tl), which has not been widely used in indirect FPI due to its lower light output, can be used to improve the high-spatial-frequency performance. The purpose of the present article is to investigate the factors affecting the spatial resolution of SAPHIRE. Since the resolution performance of the SHARP combination has been well studied, the focus of the present work is on the inherent resolution of the FEA readout method. The lateral spread of the electron beam emitted from a 50 μm × 50 μm pixel FEA was investigated with two different electron-optical designs: mesh-electrode-only and electrostatic focusing. Our results showed that electrostatic focusing can limit the lateral spread of electron beams to within a pixel size of down to 50 μm. Since electrostatic focusing is essentially independent of signal intensity, it will provide excellent spatial uniformity.

  15. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    Science.gov (United States)

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3 and 4 year olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed.

  16. Preparing for the future: a review of tools and strategies to support autonomous goal setting for children and youth with autism spectrum disorders.

    Science.gov (United States)

    Hodgetts, Sandra; Park, Elly

    2017-03-01

    Despite recognized benefits, current clinical practice rarely includes direct input from children and youth with autism spectrum disorder (ASD) in setting rehabilitation goals. This study reviews tools and evidence-based strategies to assist with autonomous goal setting for children and youth with ASD. This study included two components: (1) a scoping review of existing tools and strategies to assist with autonomous goal setting in individuals with ASD and (2) a chart review of inter-disciplinary service plan goals for children and youth with ASD. Eleven data sources, evaluating five different tools to assist with autonomous goal setting for children and youth with ASD, were found. Three themes emerged from the integration of the scoping review and chart review, which are discussed in the paper: (1) generalizability of findings, (2) adaptations to support participation and (3) practice implications. Children and youth with ASD can participate in setting rehabilitation goals, but few tools to support their participation have been evaluated, and those tools that do exist do not align well with current services foci. Visual aids appear to be one effective support, but further research on effective strategies for meaningful engagement in autonomous goal setting for children and youth with ASD is warranted. Implications for rehabilitation: Persons with ASD are less self-determined than their peers. Input into one's own rehabilitation goals and priorities is an important component of self-determination. Few tools exist to help engage children and youth with ASD in setting their own rehabilitation goals. An increased focus on identifying, developing and evaluating effective tools and strategies to facilitate engagement of children and youth with ASD in setting their own rehabilitation goals is warranted.

  17. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  18. The interprofessional socialization and valuing scale: a tool for evaluating the shift toward collaborative care approaches in health care settings.

    Science.gov (United States)

    King, Gillian; Shaw, Lynn; Orchard, Carole A; Miller, Stacy

    2010-01-01

    There is a need for tools by which to evaluate the beliefs, behaviors, and attitudes that underlie interprofessional socialization and collaborative practice in health care settings. This paper introduces the Interprofessional Socialization and Valuing Scale (ISVS), a 24-item self-report measure based on concepts in the interprofessional literature concerning shifts in beliefs, behaviors, and attitudes that underlie interprofessional socialization. The ISVS was designed to measure the degree to which transformative learning takes place, as evidenced by changed assumptions and worldviews, enhanced knowledge and skills concerning interprofessional collaborative teamwork, and shifts in values and identities. The scales of the ISVS were determined using principal components analysis. The principal components analysis revealed three scales accounting for approximately 49% of the variance in responses: (a) Self-Perceived Ability to Work with Others, (b) Value in Working with Others, and (c) Comfort in Working with Others. These empirically derived scales showed good fit with the conceptual basis of the measure. The ISVS provides insight into the abilities, values, and beliefs underlying socio-cultural aspects of collaborative and authentic interprofessional care in the workplace, and can be used to evaluate the impact of interprofessional education efforts, in house team training, and workshops.

  19. Evaluation of a Smartphone Decision-Support Tool for Diarrheal Disease Management in a Resource-Limited Setting.

    Science.gov (United States)

    Haque, Farhana; Ball, Robyn L; Khatun, Selina; Ahmed, Mujaddeed; Kache, Saraswati; Chisti, Mohammod Jobayer; Sarker, Shafiqul Alam; Maples, Stace D; Pieri, Dane; Vardhan Korrapati, Teja; Sarnquist, Clea; Federspiel, Nancy; Rahman, Muhammad Waliur; Andrews, Jason R; Rahman, Mahmudur; Nelson, Eric Jorge

    2017-01-01

    The emergence of mobile technology offers new opportunities to improve clinical guideline adherence in resource-limited settings. We conducted a clinical pilot study in rural Bangladesh to evaluate the impact of a smartphone adaptation of the World Health Organization (WHO) diarrheal disease management guidelines, including a modality for age-based weight estimation. Software development was guided by end-user input, and the tool was evaluated in a resource-limited district and sub-district hospital during the fall 2015 cholera season; both hospitals lacked scales, which necessitated weight estimation. The study consisted of a 6-week pre-intervention and a 6-week intervention period with a 10-day post-discharge follow-up. Standard of care was maintained throughout the study, with the exception that admitting clinicians used the tool during the intervention. Inclusion criteria were patients two months of age and older with uncomplicated diarrheal disease. The primary outcome was adherence to guidelines for prescriptions of intravenous (IV) fluids, antibiotics, and zinc. A total of 841 patients were enrolled (325 pre-intervention; 516 intervention). During the intervention, the proportion of prescriptions for IV fluids decreased at the district and sub-district hospitals (both p < 0.001), with risk ratios (RRs) of 0.5 and 0.2, respectively. However, when IV fluids were prescribed, the volume better adhered to recommendations. The proportion of prescriptions for the recommended antibiotic azithromycin increased (p < 0.001 district; p = 0.035 sub-district), with RRs of 6.9 (district) and 1.6 (sub-district), while prescriptions for other antibiotics decreased; zinc adherence increased. Limitations included the absence of a concurrent control group and no independent dehydration assessment during the pre-intervention. Despite these limitations, opportunities were identified to improve clinical care, including better assessment, weight estimation, and fluid/ antibiotic selection. These findings
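    The risk ratios quoted above are simple ratios of proportions between the two study periods. A minimal sketch, with hypothetical counts (the abstract does not give per-site numerators), might look like:

```python
# Risk ratio (RR) sketch with hypothetical counts; only the enrollment totals
# (325 pre-intervention, 516 intervention) come from the abstract.
def risk_ratio(events_a, n_a, events_b, n_b):
    """RR of an outcome in group A (intervention) vs group B (pre-intervention)."""
    return (events_a / n_a) / (events_b / n_b)

# e.g. IV-fluid prescriptions: 50/516 during the intervention vs 100/325 before
rr = risk_ratio(50, 516, 100, 325)
print(round(rr, 2))  # an RR below 1 indicates proportionally fewer prescriptions
```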


  1. Application of a Taper-Shank Tool-Setting Instrument for Precise Tool Setting on CNC Lathes

    Institute of Scientific and Technical Information of China (English)

    陈向荣; 朱晓波; 何春生; 袁光德

    2012-01-01

    A taper-shank audible photoelectric tool-setting instrument was designed for tool-setting operations on CNC lathes. The instrument mounts directly on the machine spindle through a taper fit, so positioning is accurate and clamping is convenient; during tool setting it emits a sound and lights up on contact, making operation intuitive, and tool-setting accuracy is not affected by the chuck.

  2. OH Oxidation of α-Pinene in the Atmosphere Simulation Chamber SAPHIR: Investigation of the Role of Pinonaldehyde Photolysis as an HO2 Source

    Science.gov (United States)

    Kaminski, M.; Acir, I. H.; Bohn, B.; Dorn, H. P.; Fuchs, H.; Häseler, R.; Hofzumahaus, A.; Li, X.; Rohrer, F.; Tillmann, R.; Wegener, R.; Kiendler-Scharr, A.; Wahner, A.

    2015-12-01

    About one third of the land surface is covered by forests, which emit approximately 75% of the total biogenic volatile organic compounds (BVOCs). The main atmospheric sink of these BVOCs during daytime is oxidation by the hydroxyl radical (OH). Over the last decades, field campaigns investigating the radical chemistry in forested regions showed that atmospheric chemistry models are often not able to describe the measured OH concentration well: at low NO concentrations and an OH reactivity dominated by BVOCs, OH was underestimated. This discrepancy could only partly be explained by the discovery of new OH regeneration pathways in the isoprene oxidation mechanism. Field campaigns in the U.S.A. and Finland (Kim 2013 ACP; Hens 2014 ACP) demonstrated that in monoterpene (e.g. α-pinene) dominated environments, model calculations also underpredict the observed HO2 and OH concentrations significantly, even if the OH budget was closed by the measured OH production and destruction terms. These observations suggest the existence of an unaccounted source of HO2. One potential HO2 source in forests is the photolysis of monoterpene degradation products such as aldehydes. In the present study, the photochemical degradation mechanism of α-pinene was investigated in the Jülich atmosphere simulation chamber SAPHIR, focusing in particular on the role of pinonaldehyde, a main first-generation product of α-pinene, as a possible HO2 source. For that purpose, the pinonaldehyde yields of the reaction α-pinene + OH were determined at ambient monoterpene concentrations (<5 ppb) under low-NOx as well as high-NOx conditions. The pinonaldehyde yield under high-NOx conditions (30.5 %) is in agreement with literature values of Wisthaler (2001 AE) and Aschmann (2002 JGR); under low-NOx conditions the yield (10.8 %) is approximately a factor of three lower than the value published by Eddingsaas (2012 ACP). In a second set of experiments the photolysis

  3. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    Directory of Open Access Journals (Sweden)

    Judith Kwasa

    Full Text Available To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCWs) that would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool that is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment, which included examination by a physician, neuropsychological testing, and, in selected cases, brain imaging. Agreement between the HCW and an expert examiner on certain tool components was measured using the Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (κ = 0.03-0.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCWs to reliably administer a brief diagnostic tool for HAD.
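    The sensitivity, specificity, and Kappa figures reported above follow standard definitions. A minimal sketch; the confusion-matrix counts below are hypothetical, chosen only to be roughly consistent with a 30-subject sample containing 6 HAD cases:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters (equal-length lists of 0/1)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)              # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical counts: 6 true HAD cases, 24 non-cases
sens, spec = sens_spec(tp=4, fn=2, tn=16, fp=8)
print(round(sens, 2), round(spec, 2))
```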

  4. Pilot Testing and Implementation of a mHealth tool for Non-communicable Diseases in a Humanitarian Setting

    Science.gov (United States)

    Doocy, Shannon; Paik, Kenneth; Lyles, Emily; Tam, Hok Hei; Fahed, Zeina; Winkler, Eric; Kontunen, Kaisa; Mkanna, Abdalla; Burnham, Gilbert

    2017-01-01

    settings, to optimize the benefit of such tools. PMID:28744410

  5. Image navigation and registration performance assessment tool set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    Science.gov (United States)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-05-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long-term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within-frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat-based truth images to the fixed grid coordinate system for each of the three GOES-R satellite locations: operational East, operational West, and initial checkout. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
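    The 3-sigma metric defined above, the 99.73rd percentile of errors accumulated over a 24-hour period, reduces to a percentile computation over the pooled error samples. A minimal sketch; the error samples below are synthetic stand-ins, not GOES-R data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical navigation-error magnitudes (arbitrary units) accumulated over
# 24 h; IPATS-style metrics take the 99.73rd percentile of such a population.
errors = np.abs(rng.normal(0.0, 5.0, size=100_000))

three_sigma = np.percentile(errors, 99.73)
print(round(float(three_sigma), 2))
```

    For zero-mean Gaussian errors this percentile sits near three times the standard deviation (here about 15), which is what motivates the "3-sigma" name.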

  6. Use of a tool-set by Pan troglodytes troglodytes to obtain termites (Macrotermes) in the periphery of the Dja Biosphere Reserve, southeast Cameroon.

    Science.gov (United States)

    Deblauwe, Isra; Guislain, Patrick; Dupain, Jef; Van Elsacker, Linda

    2006-12-01

    At the northern periphery of the Dja Biosphere Reserve (southeastern Cameroon) we recorded a new use of a tool-set by Pan troglodytes troglodytes to prey on Macrotermes muelleri, M. renouxi, M. lilljeborgi, and M. nobilis. We recovered 79 puncturing sticks and 47 fishing probes at 17 termite nests between 2002 and 2005. The mean length of the puncturing sticks (n = 77) and fishing probes (n = 45) was 52 cm and 56 cm, respectively, and the mean diameter was 9 mm and 4.5 mm, respectively. Sixty-eight percent of 138 chimpanzee fecal samples contained major soldiers of four Macrotermes species. The chimpanzees in southeastern Cameroon appeared to be selective in their choice of plant material to make their tools. The tools found at our study site resemble those from other sites in this region. However, in southeastern Cameroon only one tool-set type was found, whereas two tool-set types have been reported in Congo. Our study suggests that, along with the different vegetation types and the availability of plant material around termite nests, the nest and gallery structure and foraging behavior of the different Macrotermes spp. at all Central African sites must be investigated before we can attribute differences in tool-use behavior to culture.

  7. Air Quality uFIND: User-oriented Tool Set for Air Quality Data Discovery and Access

    Science.gov (United States)

    Hoijarvi, K.; Robinson, E. M.; Husar, R. B.; Falke, S. R.; Schultz, M. G.; Keating, T. J.

    2012-12-01

    Historically, there have been major impediments to seamless and effective data usage encountered by both data providers and users. Over the last five years, the international Air Quality (AQ) Community has worked through forums such as the Group on Earth Observations AQ Community of Practice, the ESIP AQ Working Group, and the Task Force on Hemispheric Transport of Air Pollution to converge on data format standards (e.g., netCDF), data access standards (e.g., Open Geospatial Consortium Web Coverage Services), metadata standards (e.g., ISO 19115), and other conventions (e.g., the CF Naming Convention) in order to build an Air Quality Data Network. The centerpiece of the AQ Data Network is the web service-based tool set user-oriented Filtering and Identification of Networked Data (uFIND). The purpose of uFIND is to provide rich and powerful facilities for the user to: a) discover and choose a desired dataset by navigating the multi-dimensional metadata space using faceted search, b) seamlessly access and browse datasets, and c) use uFIND's facilities as a web service for mashups with other AQ applications and portals. In a user-centric information system such as uFIND, the user experience is improved by metadata that includes the general fields for discovery as well as community-specific metadata to narrow the search beyond space, time, and generic keyword searches. Even with the community-specific additions, the ISO 19115 records were formed in compliance with the standard, so that other standards-based search interfaces could leverage this additional information. To identify the fields necessary for metadata discovery, we started with the ISO 19115 Core Metadata fields and the fields needed for a Catalog Service for the Web (CSW) Record. This fulfilled two goals: to create valid ISO 19115 records, and to be able to retrieve the records through a CSW query. Beyond the required set of fields, the AQ Community added

  8. A Proposal: Modification for Instruments and Tools Used in the Science Laboratory Setting for Students with Disabilities

    Science.gov (United States)

    Kogan, Denis

    2015-01-01

    The purpose of this action research proposal is to create a Modification of Instruments and Tools in Science (MITS) program to address the need for providing Students With Disabilities (SWDs) appropriate access to scientific tools and techniques of scientific inquiry. This proposal contains a review of literature on SWDs, differentiating…

  9. Subcortical brain segmentation of two-dimensional T1-weighted data sets with FMRIB's Integrated Registration and Segmentation Tool (FIRST)

    Directory of Open Access Journals (Sweden)

    Michael Amann

    2015-01-01

    Full Text Available Brain atrophy has been identified as an important contributing factor to the development of disability in multiple sclerosis (MS). In this respect, increasing interest is focusing on the role of deep grey matter (DGM) areas. Novel data analysis pipelines are available for the automatic segmentation of DGM using three-dimensional (3D) MRI data. However, in clinical trials, often no such high-resolution data are acquired, and hence no conclusions regarding the impact of new treatments on DGM atrophy have been possible so far. In this work, we used FMRIB's Integrated Registration and Segmentation Tool (FIRST) to evaluate the possibility of segmenting DGM structures using standard two-dimensional (2D) T1-weighted MRI. In a cohort of 70 MS patients, both 2D and 3D T1-weighted data were acquired. The thalamus, putamen, pallidum, nucleus accumbens, and caudate nucleus were bilaterally segmented using FIRST. Volumes were calculated for each structure, for the sum of the basal ganglia (BG), and for the total DGM. The accuracy and reliability of the 2D data segmentation were compared with the respective results of 3D segmentations using volume difference, volume overlap, and intra-class correlation coefficients (ICCs). The mean differences for the individual substructures were between 1.3% (putamen) and −25.2% (nucleus accumbens); the respective values were −2.7% for the BG and 1.3% for the DGM. Mean volume overlap was between 89.1% (thalamus) and 61.5% (nucleus accumbens); BG: 84.1%; DGM: 86.3%. Regarding ICC, all structures showed good agreement with the exception of the nucleus accumbens. The results of the segmentation were additionally validated through expert manual delineation of the caudate nucleus and putamen in a subset of the 3D data. In conclusion, we demonstrate that subcortical segmentation of 2D data is feasible using FIRST. The larger subcortical GM structures can be segmented with high consistency. This forms the basis for the application of
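    The volume-difference and volume-overlap comparisons described above can be sketched on toy binary masks. This is an illustrative stand-in (a Dice-style overlap on synthetic voxel grids), not the FIRST pipeline itself:

```python
import numpy as np

def volume_metrics(mask_a, mask_b):
    """Percent volume difference and Dice-style percent overlap of two
    binary segmentation masks (e.g. the same structure from 2D vs 3D data)."""
    va, vb = mask_a.sum(), mask_b.sum()
    diff_pct = 100.0 * (va - vb) / vb
    overlap_pct = 100.0 * 2.0 * np.logical_and(mask_a, mask_b).sum() / (va + vb)
    return diff_pct, overlap_pct

# Two equal-volume cubes, one shifted by a voxel along the first axis
a = np.zeros((10, 10, 10), bool); a[2:8, 2:8, 2:8] = True
b = np.zeros((10, 10, 10), bool); b[3:9, 2:8, 2:8] = True
diff, ov = volume_metrics(a, b)
print(round(diff, 1), round(ov, 1))
```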

  10. Comprehensive development and testing of the ASIST-GBV, a screening tool for responding to gender-based violence among women in humanitarian settings.

    Science.gov (United States)

    Wirtz, A L; Glass, N; Pham, K; Perrin, N; Rubenstein, L S; Singh, S; Vu, A

    2016-01-01

    Conflict-affected refugees and internally displaced persons (IDPs) are at increased vulnerability to gender-based violence (GBV). Health, psychosocial, and protection services have been implemented in humanitarian settings, but GBV remains under-reported and available services under-utilized. To improve access to existing GBV services and facilitate reporting, the ASIST-GBV screening tool was developed and tested for use in humanitarian settings. This process was completed in four phases: 1) systematic literature review, 2) qualitative research that included individual interviews and focus groups with GBV survivors and service providers, respectively, 3) pilot testing of the developed screening tool, and 4) 3-month implementation testing of the screening tool. Research was conducted among female refugees aged ≥15 years in Ethiopia and female IDPs aged ≥18 years in Colombia. The systematic review and meta-analysis identified a range of GBV experiences and estimated a 21.4% prevalence of sexual violence (95% CI: 14.9-28.7) among conflict-affected populations. No existing screening tools for GBV in humanitarian settings were identified. Qualitative research with GBV survivors in Ethiopia and Colombia found multiple forms of GBV experienced by refugees and IDPs that occurred during conflict, in transit, and in displaced settings. Identified forms of violence were combined into seven key items on the screening tool: threats of violence, physical violence, forced sex, sexual exploitation, forced pregnancy, forced abortion, and early or forced marriage. Cognitive testing further refined the tool. Pilot testing in both sites demonstrated preliminary feasibility, where 64.8% of participants in Ethiopia and 44.9% of participants in Colombia were identified with recent (last 12 months) cases of GBV. Implementation testing of the screening tool, conducted as a routine service in camp/district hospitals, allowed for identification of GBV cases and referrals to

  11. SAPHIR - a multi-scale, multi-resolution modeling environment targeting blood pressure regulation and fluid homeostasis.

    Science.gov (United States)

    Thomas, S; Abdulhay, Enas; Baconnier, Pierre; Fontecave, Julie; Francoise, Jean-Pierre; Guillaud, Francois; Hannaert, Patrick; Hernandez, Alfredo; Le Rolle, Virginie; Maziere, Pierre; Tahi, Fariza; Zehraoui, Farida

    2007-01-01

    We present progress on a comprehensive, modular, interactive modeling environment centered on overall regulation of blood pressure and body fluid homeostasis. We call the project SAPHIR, for "a Systems Approach for PHysiological Integration of Renal, cardiac, and respiratory functions". The project uses state-of-the-art multi-scale simulation methods. The basic core model will give succinct input-output (reduced-dimension) descriptions of all relevant organ systems and regulatory processes, and it will be modular, multi-resolution, and extensible, in the sense that detailed submodules of any process(es) can be "plugged in" to the basic model in order to explore, e.g., system-level implications of local perturbations. The goal is to keep the basic core model compact enough to ensure fast execution time (in view of eventual use in the clinic) and yet to allow elaborate detailed modules of target tissues or organs in order to focus on the problem area while maintaining the system-level regulatory compensations.

  12. Atmospheric photochemistry of aromatic hydrocarbons: Analysis of OH budgets during SAPHIR chamber experiments and evaluation of MCMv3.2

    Science.gov (United States)

    Nehr, S.; Bohn, B.; Brauers, T.; Dorn, H.; Fuchs, H.; Häseler, R.; Hofzumahaus, A.; Li, X.; Lu, K.; Rohrer, F.; Tillmann, R.; Wahner, A.

    2012-12-01

    Aromatic hydrocarbons, almost exclusively originating from anthropogenic sources, comprise a significant fraction of volatile organic compounds observed in urban air. The photo-oxidation of aromatics results in the formation of secondary pollutants and impacts air quality in cities, industrialized areas, and districts of dense traffic. Up-to-date photochemical oxidation schemes of the Master Chemical Mechanism (MCMv3.2) exhibit moderate performance in simulating aromatic compound degradation observed during previous environmental chamber studies. To obtain a better understanding of aromatic photo-oxidation mechanisms, we performed experiments with a number of aromatic hydrocarbons in the outdoor atmosphere simulation chamber SAPHIR located in Jülich, Germany. These chamber studies were designed to derive OH turnover rates exclusively based on experimental data. Simultaneous measurements of NOx (= NO + NO2), HOx (= OH + HO2), and the total OH loss rate constant k(OH) facilitate a detailed analysis of the OH budgets during photo-oxidation experiments. The OH budget analysis was complemented by numerical model simulations using MCMv3.2. Despite MCM's tendency to overestimate k(OH) and to underpredict radical concentrations, the OH budgets are reasonably balanced for all investigated aromatics. However, the results leave some scope for OH producing pathways that are not considered in the current MCMv3.2. An improved reaction mechanism, derived from MCMv3.2 sensitivity studies, is presented. The model performance is basically improved by changes of the mechanistic representation of ring fragmentation channels.

  13. The evolution of OPUS: A set of web-based GPS processing tools offered by the National Geodetic Survey

    Science.gov (United States)

    Weston, Dr.; Mader, Dr.; Schenewerk, Dr.

    2012-04-01

    The Online Positioning User Service (OPUS) is a suite of web-based GPS processing tools initially developed by the National Geodetic Survey approximately eleven years ago. The first version, known as OPUS static (OPUS-S), processes L1 and L2 carrier-phase data in native receiver and RINEX formats. Datasets submitted to OPUS-S must be between two and 48 hours in duration and pass several quality control steps before being passed on to the positioning algorithm. OPUS-S was designed to select five nearby CORS to form baselines that are processed independently; the best three solutions are averaged to produce a final set of coordinates. The current version of OPUS-S has been optimized to accept and process GPS data from any location in the continental United States, Alaska, Hawaii, and the Caribbean. OPUS Networks (OPUS-Net), one of the most recently developed versions and currently in beta testing, has many of the same processing characteristics and dataset requirements as OPUS-S, but with one significant difference: OPUS-Net selects up to 10 IGS reference sites and three regional CORS to perform a simultaneous least squares adjustment with the user-submitted data. The CORS stations are primarily used to better estimate the troposphere, while the position of the unknown station and the three CORS reference stations are determined from the more precisely known and monitored IGS reference stations. Additional enhancements in OPUS-Net are the implementation of absolute antenna patterns and ocean tides (FES2004), the use of reference station coordinates in the IGS08 reference frame, and improved phase ambiguity integer fixing and troposphere modeling (GPT and GMF a priori models). OPUS Projects, the final version of OPUS reviewed in this paper, is a complete web-based GPS data processing and analysis environment. The main idea behind OPUS Projects is that one or more managers can define numerous, independent GPS projects. Each newly defined project is
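    The OPUS-S "average the best three of five" step described above can be sketched as a simple rank-and-average over independent baseline solutions. The coordinate and RMS values below are hypothetical, purely for illustration:

```python
# Five independent baseline solutions for the user's position, each with a
# quality metric (hypothetical RMS values, in meters); the best three are
# averaged, mimicking the OPUS-S final-coordinate step described above.
solutions = [
    # (easting_m, northing_m, up_m, rms_m) -- hypothetical values
    (1000.012, 2000.005, 50.020, 0.012),
    (1000.008, 2000.010, 50.015, 0.009),
    (1000.050, 2000.060, 50.100, 0.045),
    (1000.010, 2000.007, 50.018, 0.010),
    (1000.030, 2000.040, 50.060, 0.030),
]

best3 = sorted(solutions, key=lambda s: s[3])[:3]           # lowest RMS first
final = tuple(sum(s[i] for s in best3) / 3 for i in range(3))
print(tuple(round(c, 3) for c in final))
```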

  14. A Set of Web-based Tools for Integrating Scientific Research and Decision-Making through Systems Thinking

    Science.gov (United States)

    Currently, many policy and management decisions are made without considering the goods and services humans derive from ecosystems and the costs associated with protecting them. This approach is unlikely to be sustainable. Conceptual frameworks provide a tool for capturing, visual...

  15. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 1 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  16. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 2 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  17. Implementation of information and analysis support of the industrial enterprise’s logistical management based on the tools of the fuzzy set theory

    Directory of Open Access Journals (Sweden)

    Greyz G.M.

    2017-01-01

    Full Text Available Management of industrial enterprises' logistical systems relies on information that is heterogeneous and not always certain. The presence of different types of uncertainty in the complex hierarchical system of industrial enterprises' logistical management justifies supporting management decisions with analysis based on the fuzzy set theory, which allows all the necessary heterogeneous information to be linked together and adequately considered. To this end, information on the functioning of the logistical system must be represented in a specific form, as membership functions. The article argues that the tools of the fuzzy set theory can be applied to describe the parameters of an industrial enterprise's logistical system and to justify decision-making in logistical management. Within the framework of the system of information and analysis support of industrial enterprises' logistical management, it is proposed to use the tools of the "determination of the fuzzy set image" problem and its variety, the "definition of the sub-direct fuzzy set image", in order to choose the best combination of key efficiency indicators of logistical management complying with the present complex of criteria. Application of the fuzzy set theory also allows the fuzzy values of factors to be determined, as a result of which the enterprise's logistical system has obtained the existing or objective set of features. For analysis of factors influencing the key efficiency indicators of the industrial enterprise's logistical management, it is proposed to use the tools of the "definition of the fuzzy set pre-image at a fuzzy binary relation" problem.
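    The "fuzzy set image" operation referred to above is, in the standard formulation, the sup-min composition of a fuzzy set with a fuzzy binary relation: B(y) = max_x min(A(x), R(x, y)). A minimal sketch, with hypothetical membership values (the abstract gives none):

```python
import numpy as np

def fuzzy_image(A, R):
    """Image of fuzzy set A under fuzzy relation R via sup-min composition:
    B(y) = max over x of min(A(x), R(x, y))."""
    return np.max(np.minimum(A[:, None], R), axis=0)

# Hypothetical memberships: 3 logistics factors -> 2 performance indicators
A = np.array([0.2, 0.9, 0.5])          # membership of each factor in A
R = np.array([[0.1, 0.8],              # relation strengths factor x indicator
              [0.7, 0.4],
              [1.0, 0.6]])
B = fuzzy_image(A, R)
print(B)  # membership of each indicator in the image of A
```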

  18. Development of a Suicidal Ideation Detection Tool for Primary Healthcare Settings: Using Open Access Online Psychosocial Data.

    Science.gov (United States)

    Meyer, Denny; Abbott, Jo-Anne; Rehm, Imogen; Bhar, Sunil; Barak, Azy; Deng, Gary; Wallace, Klaire; Ogden, Edward; Klein, Britt

    2017-04-01

    Suicidal patients often visit healthcare professionals in their last month before suicide, but medical practitioners are unlikely to raise the issue of suicide with patients because of time constraints and uncertainty regarding an appropriate approach. A brief tool called the e-PASS Suicidal Ideation Detector (eSID) was developed for medical practitioners to help detect the presence of suicidal ideation (SI) in their clients. If SI is detected, the system alerts medical practitioners to address this issue with a client. The eSID tool was developed due to the absence of an easy-to-use, evidence-based SI detection tool for general practice. The tool was developed using binary logistic regression analyses of data provided by clients accessing an online psychological assessment function. Ten primary healthcare professionals provided advice regarding the use of the tool. The analysis identified eleven factors in addition to the Kessler-6 for inclusion in the model used to predict the probability of recent SI. The model performed well across gender and age groups 18-64 (AUR 0.834, 95% CI 0.828-0.841, N = 16,703). Healthcare professionals were interviewed; they recommended that the tool be incorporated into existing medical software systems and that additional resources be supplied, tailored to the level of risk identified. The eSID is expected to trigger risk assessments by healthcare professionals when this is necessary. Initial reactions of healthcare professionals to the tool were favorable, but further testing and in situ development are required.
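    The discrimination figure reported above (AUR 0.834) is an area under the ROC curve, which can be computed from predicted probabilities via the rank-sum (Mann-Whitney) identity. A minimal sketch, with hypothetical labels and probabilities (ties between scores are ignored for brevity):

```python
import numpy as np

def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.
    labels: 0/1 outcomes; scores: predicted probabilities. Ties not handled."""
    labels = np.asarray(labels)
    scores = np.asarray(scores)
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)        # rank of each score
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical predicted probabilities of recent suicidal ideation
y = [0, 0, 1, 0, 1, 1]
p = [0.1, 0.4, 0.35, 0.8, 0.7, 0.9]
print(round(auc(y, p), 3))
```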

  19. Measuring multidisciplinary team effectiveness in a ward-based healthcare setting: development of the team functioning assessment tool.

    Science.gov (United States)

    Sutton, Gigi; Liao, Jenny; Jimmieson, Nerina L; Restubog, Simon Lloyd D

    2011-01-01

    Nontechnical skills relating to team functioning are vital to the effective delivery of patient care and safety. In this study, we develop a reliable behavioral marker tool for assessing nontechnical skills that are critical to the success of ward-based multidisciplinary healthcare teams. The Team Functioning Assessment Tool (TFAT) was developed and refined using a literature review, focus groups, card-sorting exercise, field observations, and final questionnaire evaluation and refinement process. Results demonstrated that Clinical Planning, Executive Tasks, and Team Relations are important facets of effective multidisciplinary healthcare team functioning. The TFAT was also shown to yield acceptable inter-rater agreement.

  20. New insights into the degradation of terpenoids with OH: a study of the OH budget in the atmosphere simulation chamber SAPHIR

    Science.gov (United States)

    Kaminski, Martin; Fuchs, Hendrik; Acir, Ismail-Hakki; Bohn, Birger; Brauers, Theo; Dorn, Hans-Peter; Häseler, Rolf; Hofzumahaus, Andreas; Li, Xin; Lutz, Anna; Nehr, Sascha; Rohrer, Franz; Tillmann, Ralf; Wegener, Robert; Kiendler-Scharr, Astrid; Wahner, Andreas

    2014-05-01

    The hydroxyl radical (OH) is the main oxidation agent in the atmosphere during daytime. Recent field campaigns studying the radical chemistry in forested areas showed large discrepancies between measured and modeled OH concentrations under low NOx conditions and when OH reactivity was dominated by VOCs. These observations were only partially explained by the evidence for new efficient hydroxyl radical regeneration pathways in the isoprene oxidation mechanism. The question arises whether other reactive VOCs with high global emission rates are also capable of additional OH recycling. Besides isoprene, monoterpenes and 2-methyl-3-buten-2-ol (MBO) are the volatile organic compounds (VOCs) with the highest global emission rates. Due to their high reactivity towards OH, monoterpenes and MBO can dominate the radical chemistry of the atmosphere in forested areas under certain conditions. In the present study, the photochemical degradation mechanisms of α-pinene, β-pinene, limonene, myrcene and MBO were investigated in the Jülich atmosphere simulation chamber SAPHIR. The focus of this study was in particular on the OH budget in the degradation process. The photochemical degradation of these terpenoids was studied in a dedicated series of experiments in the years 2012 and 2013. The SAPHIR chamber was equipped with instrumentation to measure radicals (OH, HO2, RO2), the total OH reactivity, all important OH precursors (O3, HONO, HCHO), the parent VOC, its main oxidation products and photolysis frequencies to investigate the radical budget in the SAPHIR chamber. All experiments were carried out under low NOx conditions (≤ 2 ppb) and atmospheric terpenoid concentrations (≤ 5 ppb) with and without addition of ozone into the SAPHIR chamber. For the investigation of the OH budget, all measured OH production terms were compared to the measured OH destruction. Within the limits of accuracy of the instruments, the OH budget was balanced in all cases. Consequently unaccounted

  1. A new online software tool for pressure ulcer monitoring as an educational instrument for unified nursing assessment in clinical settings

    Directory of Open Access Journals (Sweden)

    Andrea Pokorná

    2016-07-01

    Full Text Available Data collection and the evaluation of those data are crucial for effective quality management, and naturally also for the prevention and treatment of pressure ulcers. Data collected in a uniform manner by nurses in clinical practice could be used for further analyses. Data about pressure ulcers are collected to differing degrees of quality, based on the local policy of the given health care facility and in relation to the nurse's actual level of knowledge concerning pressure ulcer identification and the use of objective scales (i.e. categorization of pressure ulcers). Therefore, we have developed software suitable for data collection which includes some educational tools to promote unified reporting of data by nurses. A description of this software and some educational and learning components of the tool is presented herein. The planned process of clinical application of the newly developed software is also briefly mentioned. The discussion is focused on the usability of the online reporting tool and possible further development of the tool.

  2. Analysis of metolachlor ethane sulfonic acid chirality in groundwater: A tool for dating groundwater movement in agricultural settings

    Science.gov (United States)

    Chemical chirality of pesticides can be a useful tool for studying environmental processes. The chiral forms of metolachlor ethane sulfonic acid (MESA), an abundant metabolite of metolachlor, and metolachlor were examined over a 6 year period in groundwater and a groundwater-fed stream in a riparia...

  3. Xbox Kinect Gaming Systems as a Supplemental Tool within a Physical Education Setting: Third and Fourth Grade Students' Perspectives

    Science.gov (United States)

    Shewmake, Cole J.; Merrie, Michael D.; Calleja, Paul

    2015-01-01

    Literature indicates that technology, including exergaming, is popular among adolescents and can be used as a supplemental tool in the physical education classroom. Therefore, the purpose of this study was to examine third and fourth grade students' perceived enjoyment and exertion levels toward exergaming in relation to traditional physical…

  4. Analogies as Tools for Meaning Making in Elementary Science Education: How Do They Work in Classroom Settings?

    Science.gov (United States)

    Guerra-Ramos, Maria Teresa

    2011-01-01

    In this paper there is a critical overview of the role of analogies as tools for meaning making in science education, their advantages and disadvantages. Two empirical studies on the use of analogies in primary classrooms are discussed and analysed. In the first study, the "string circuit" analogy was used in the teaching of electric circuits with…

  5. Using a Mobile Dichotomous Key iPad Application as a Scaffolding Tool in a Museum Setting

    Science.gov (United States)

    Knight, Kathryn; Davies, Randall S.

    2016-01-01

    This study tested an iPad application using a dichotomous key as a scaffolding tool to help students make more detailed observations as they identified various species of birds on display in a museum of natural science. The Mobile Dichotomous Key (MDK) iPad application was used by groups of fifth- and seventh-grade students. Analysis of the…

  6. Initial Validation of the Prekindergarten Classroom Observation Tool and Goal Setting System for Data-Based Coaching

    Science.gov (United States)

    Crawford, April D.; Zucker, Tricia A.; Williams, Jeffrey M.; Bhavsar, Vibhuti; Landry, Susan H.

    2013-01-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based…

  7. Phame: a novel phase metrology tool of Carl Zeiss for in-die phase measurements under scanner relevant optical settings

    Science.gov (United States)

    Perlitz, Sascha; Buttgereit, Ute; Scherübl, Thomas

    2007-03-01

    Meeting the demands of the lithography mask manufacturing industry moving toward the 45 nm and 32 nm nodes for in-die phase metrology on phase shifting masks, Zeiss is currently developing an optical phase measurement tool (Phame™), providing the capability of extending process control from large CD test features to in-die phase shifting features with high spatial resolution. In collaboration with Intel, designing this optical metrology tool according to the optical setup of a lithographic exposure tool (scanner) has been shown to be fundamental for the acquisition of phase information generated from features the size of the used wavelength. The main cause is the dependence of the image phase of a scanner on polarization and the angle of incidence of the illumination light due to rigorous effects, and on the imaging NA of the scanner due to the loss of phase information in the imaging pupil. The resulting scanner phase in the image plane only coincides with the etch-depth equivalent phase for large test features exceeding the size of the in-die feature by an order of magnitude. In this paper we introduce the Phame™ phase metrology tool, using a 193 nm light source with the optical capability of phase measurement at scanner NA up to the equivalent of an NA 1.6 immersion scanner, under varying, scanner-relevant angles of incidence for EAPSMs and CPLs, and with the possibility of polarizing the illuminating light. New options for phase shifting mask process control on in-die features will be outlined with first measurement results.

  8. Breast examination as a cost-effective screening tool in a clinical practice setting in Ibadan, Nigeria

    Directory of Open Access Journals (Sweden)

    Adetola M. Ogunbode

    2013-01-01

    Full Text Available Background: Breast cancer is a disease of public health importance. It results in high morbidity and mortality in women worldwide. The high morbidity and mortality from breast cancer can be decreased by measures targeted at early detection such as screening. Breast examination as a screening tool for breast cancer in developing countries is advocated in view of its cost-effectiveness. Method: Articles were selected from primary and secondary literature sources, which included original research articles, case control studies, review articles, proceedings, transactions and textbooks. The authors cited a clinical audit and articles published between 1988 and 2011. The search strategy included the use of internet search engines. This review was part of a larger research project and the study protocol was approved by the University of Ibadan/University College Hospital, Ibadan Institutional Review Board (UI/UCH IRB). Clinical trial registration number: NHREC/05/01/2008a. Results: Breast self-examination (BSE) and clinical breast examination (CBE) as screening tools for breast cancer were analysed in detail. Conclusion: Breast examination is a screening tool that is cost-effective and reliable and should be encouraged in resource-constrained countries. Given the high cost and expertise required for mammography, current efforts at screening for breast cancer in developing countries should rely more on a combination of BSE and CBE.

  10. Investigation of the formaldehyde differential absorption cross section at high and low spectral resolution in the simulation chamber SAPHIR

    Directory of Open Access Journals (Sweden)

    T. Brauers

    2007-07-01

    Full Text Available The results from a simulation chamber study on the formaldehyde (HCHO) absorption cross section in the UV spectral region are presented. We performed 4 experiments at ambient HCHO concentrations with simultaneous measurements of two DOAS instruments in the atmosphere simulation chamber SAPHIR in Jülich. The two instruments differ in their spectral resolution, one working at 0.2 nm (broad-band, BB-DOAS), the other at 2.7 pm (high-resolution, HR-DOAS). Both instruments use dedicated multi-reflection cells to achieve long light path lengths of 960 m and 2240 m, respectively, inside the chamber. During two experiments HCHO was injected into the clean chamber by thermolysis of well-defined amounts of para-formaldehyde, reaching mixing ratios of 30 ppbV at maximum. The HCHO concentration calculated from the injection and the chamber volume agrees with the BB-DOAS measured value when the absorption cross section of Meller and Moortgat (2000) and the temperature coefficient of Cantrell (1990) were used for data evaluation. In two further experiments we produced HCHO in-situ from the ozone + ethene reaction, which was intended to provide an independent way of HCHO calibration through the measurements of ozone and ethene. However, we found an unexpected deviation from the current understanding of the ozone + ethene reaction when CO was added to suppress possible oxidation of ethene by OH radicals. The reaction of the Criegee intermediate with CO could be 240 times slower than currently assumed. Based on the BB-DOAS measurements we could deduce a high-resolution cross section for HCHO, which had not been measured directly so far.
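
The DOAS retrieval underlying these measurements rests on the Beer-Lambert law: the trace-gas number density follows from the measured differential optical depth, the differential absorption cross section, and the light path length. A minimal sketch with illustrative numbers (only the 2240 m HR-DOAS path length is taken from the text):

```python
# Sketch of the Beer-Lambert relation behind DOAS: number density
# N = tau / (sigma * L). The cross section and optical depth values below
# are assumptions for illustration, not values from the experiment.

def number_density(optical_depth, cross_section_cm2, path_cm):
    """Trace-gas number density in molecules per cm^3."""
    return optical_depth / (cross_section_cm2 * path_cm)

sigma_hcho = 6.0e-20   # assumed differential cross section, cm^2 / molecule
path = 2240.0 * 100.0  # 2240 m light path expressed in cm
tau = 1.0e-2           # assumed measured differential optical depth
n = number_density(tau, sigma_hcho, path)
print(f"HCHO number density: {n:.2e} molecules/cm^3")
```

The long multi-reflection path lengths matter here: for a fixed detectable optical depth, doubling L halves the smallest measurable concentration.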

  11. The Keithley System 570 Data Acquisition Workstation: A tool for setting up control and data acquisition in psychophysiological experiments.

    NARCIS (Netherlands)

    Wijker, W.; van der Molen, M.W.; Molenaar, P.C.M.; Maarsse, F.J.; Mulder, L.J.M.

    1991-01-01

    Discusses an experimental psychophysiological set-up using the Keithley System 570 Data Acquisition Workstation. Keithley System consists of hardware and software components which are capable of dealing with most of the demands required in experimental psychology. Reviews the hardware and software o

  12. Investigation of spatial resolution and temporal performance of SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout) with integrated electrostatic focusing

    Science.gov (United States)

    Scaduto, David A.; Lubinsky, Anthony R.; Rowlands, John A.; Kenmotsu, Hidenori; Nishimoto, Norihito; Nishino, Takeshi; Tanioka, Kenkichi; Zhao, Wei

    2014-03-01

    We have previously proposed SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout), a novel detector concept with potentially superior spatial resolution and low-dose performance compared with existing flat-panel imagers. The detector comprises a scintillator that is optically coupled to an amorphous selenium photoconductor operated with avalanche gain, known as high-gain avalanche rushing photoconductor (HARP). High resolution electron beam readout is achieved using a field emitter array (FEA). This combination of avalanche gain, allowing for very low-dose imaging, and electron emitter readout, providing high spatial resolution, offers potentially superior image quality compared with existing flat-panel imagers, with specific applications to fluoroscopy and breast imaging. Through the present collaboration, a prototype HARP sensor with integrated electrostatic focusing and nano-Spindt FEA readout technology has been fabricated. The integrated electron-optic focusing approach is more suitable for fabricating large-area detectors. We investigate the dependence of spatial resolution on sensor structure and operating conditions, and compare the performance of electrostatic focusing with previous technologies. Our results show a clear dependence of spatial resolution on electrostatic focusing potential, with performance approaching that of the previous design with an external mesh electrode. Further, the temporal performance (lag) of the detector is evaluated and the results show that the integrated electrostatic focusing design exhibits comparable or better performance compared with the mesh-electrode design. This study represents the first technical evaluation and characterization of the SAPHIRE concept with integrated electrostatic focusing.

  13. An indirect flat-panel detector with avalanche gain for low dose x-ray imaging: SAPHIRE (scintillator avalanche photoconductor with high resolution emitter readout)

    Science.gov (United States)

    Zhao, Wei; Li, Dan; Rowlands, J. A.; Egami, N.; Takiguchi, Y.; Nanba, M.; Honda, Y.; Ohkawa, Y.; Kubota, M.; Tanioka, K.; Suzuki, K.; Kawai, T.

    2008-03-01

    An indirect flat-panel imager with programmable avalanche gain and field emitter array (FEA) readout is being investigated for low-dose x-ray imaging with high resolution. It is made by optically coupling a structured x-ray scintillator, CsI(Tl), to an amorphous selenium (a-Se) avalanche photoconductor called HARP (high-gain avalanche rushing photoconductor). The charge image created by HARP is read out by electron beams generated by the FEA. The proposed detector is called SAPHIRE (Scintillator Avalanche Photoconductor with HIgh Resolution Emitter readout). The avalanche gain of HARP depends on both the a-Se thickness and the applied electric field E_Se. At E_Se > 80 V/μm, the avalanche gain can enhance the signal at low dose (e.g. fluoroscopy) and make the detector x-ray quantum noise limited down to a single x-ray photon. At high exposure (e.g. radiography), the avalanche gain can be turned off by decreasing E_Se to < 70 V/μm. In this paper the imaging characteristics of the FEA readout method, including the spatial resolution and noise, were investigated experimentally using a prototype optical HARP-FEA image sensor. The potential x-ray imaging performance of SAPHIRE, especially the aspect of programmable gain to ensure wide dynamic range and x-ray quantum noise limited performance at the lowest exposures in fluoroscopy, was investigated.

  14. Calibrated Peer Review: A New Tool for Integrating Information Literacy Skills in Writing-Intensive Large Classroom Settings.

    OpenAIRE

    Fosmire, Michael

    2010-01-01

    Calibrated Peer Review™ (CPR) is a program that can significantly enhance the ability to integrate intensive information literacy exercises into large classroom settings. CPR is founded on a solid pedagogic base for learning, and it is formulated in such a way that information skills can easily be inserted. However, there is no mention of its application for information literacy in the library literature. A sample implementation of CPR in a course co-taught by science disciplinary faculty and...

  15. The PTSD Practitioner Registry: An Innovative Tracking, Dissemination, and Support Tool for Providers in Military and Nonmilitary Settings

    Science.gov (United States)

    2015-10-01

    PRINCIPAL INVESTIGATOR: Josef I. Ruzek, Ph.D. RECIPIENT: Palo Alto Veterans for Research and Education, Palo Alto, CA 94304. In Phase II, an RCT will be conducted to evaluate the impact of registry participation on practices/CPG awareness.

  16. Cone beam computed tomography (CBCT) as a tool for the analysis of nonhuman skeletal remains in a medico-legal setting.

    Science.gov (United States)

    Lucena, Joaquin; Mora, Esther; Rodriguez, Lucia; Muñoz, Mariela; Cantin, Mario G; Fonseca, Gabriel M

    2016-09-01

    To confirm the nature and forensic significance of questioned skeletal material submitted to a medico-legal setting is a relatively common procedure, although not without difficulties when the remains are fragmented or burned. Different methodologies have been described for this purpose, many of them invasive, time- and money-consuming, or dependent on the availability of the analytical instrument. We present a case in which skeletal material with unusual conditions of preservation and a curious discovery was sent to a medico-legal setting to determine its human/nonhuman origin. A combined strategy of imaging procedures (macroscopic, radiographic and cone beam computed tomography (CBCT) technology) was performed as a non-invasive and rapid method to assess the nonhuman nature of the material, specifically of pig (Sus scrofa) origin. This hypothesis was later confirmed by DNA analysis. CBCT data sets provide accurate three-dimensional reconstructions, which demonstrates their reliability as a forensic tool.

  17. Analysis of Tool Length Offset Setting in NC Milling

    Institute of Scientific and Technical Information of China (English)

    Li, Zhimei; Wei, Benjian

    2011-01-01

    Tool setting and tool length offset setting are important steps before NC milling. Different tool setting methods require different tool length offset settings. Taking the FANUC-0iM NC system as an example and working through a concrete part-machining case, this paper presents the various tool setting methods and the corresponding tool length offset settings in detail.

  18. Assessing capacity in the setting of self-neglect: development of a novel screening tool for decision-making capacity.

    Science.gov (United States)

    Naik, Aanand D; Pickens, Sabrina; Burnett, Jason; Lai, James M; Dyer, Carmel Bitondo

    2006-01-01

    Compared with older adults with disabilities and those who autonomously choose to live in squalor, self-neglect syndrome arises from a predicate state of vulnerability in frail older adults. This state of vulnerability is characteristically associated with a decline in decision-making capacity regarding the ability to care for and protect oneself. We developed the COMP Screen to evaluate vulnerable older adults to identify potential gaps in decision-making capacity using a screening tool. A total of 182 older adults were evaluated and consistent declines in cognitive ability and decision-making processes were present in this population. However, there were no significant differences between elders referred for self-neglect and matched older adults. These findings suggest that declines in decision-making processes are not uncommon in vulnerable older adults but traditional conceptualizations of decision-making capacity may be inadequate for differentiating the capacity for self-care and protection in elders who self-neglect.

  19. Prioritizing live bird markets at risk of avian influenza H5N1 virus contamination for intervention: a simple tool for low resource settings.

    Science.gov (United States)

    Samaan, Gina; Indriani, Risa; Carrasco, Luis Roman; Lokuge, Kamalini; Cook, Alex R; Kelly, Paul M; Adjid, Rma

    2012-12-01

    Live bird markets (LBMs) are at risk of contamination with the avian influenza H5N1 virus. There are a number of methods for prioritizing LBMs for intervention to curb the risk of contamination. Selecting a method depends on diagnostic objective and disease prevalence. In a low resource setting, options for prioritization are constricted by the cost of and resources available for tool development and administration, as well as the resources available for intervention. In this setting, tools can be developed using previously collected data on risk factors for contamination, and translated into prediction equations, including decision trees (DTs). DTs are a graphical type of classifier that combine simple questions about the data in an intuitive way. DTs can be used to develop tools tailored to different diagnostic objectives. To demonstrate the utility of this method, risk factor data arising from a previous cross-sectional study in 83 LBMs in Indonesia were used to construct DTs. A DT with high specificity was selected for the initial stage of an LBM intervention campaign in which authorities aim to focus intervention resources on a small set of LBMs that are at near-certain risk of contamination. Another DT with high sensitivity was selected for later stages in an intervention campaign in which authorities aim to detect and prioritize all LBMs with the risk factors for virus contamination. The best specific DT achieved specificity of 77% and the best sensitive DT achieved sensitivity of 90%. The specific DT had two variables: the size of the duck population in the LBM and the human population density in the LBM's district. The sensitive DT had three variables: LBM location, whether solid waste was removed from the LBM daily and whether the LBM was zoned to separate the bird holding, slaughtering and sale areas. High specificity or sensitivity will be preferred by authorities depending on the stage of the intervention campaign. The study demonstrates that simple
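
As a rough illustration of how such a decision tree is applied and scored, the sketch below hand-codes a tree over the three variables named for the sensitive DT (location, daily waste removal, zoning) and computes sensitivity and specificity on toy data. The split logic and the data are invented, not the published tree.

```python
# Hand-coded sketch of a "sensitive" decision-tree classifier for live bird
# markets (LBMs). Splits and toy data are invented for illustration only.

def at_risk(market):
    """True if the market is flagged as at risk of H5N1 contamination."""
    if not market["waste_removed_daily"]:
        return True
    if not market["zoned"]:
        return True
    return market["urban_location"]  # assumed split on location

def sensitivity_specificity(markets, labels):
    preds = [at_risk(m) for m in markets]
    tp = sum(p and y for p, y in zip(preds, labels))
    fn = sum((not p) and y for p, y in zip(preds, labels))
    tn = sum((not p) and (not y) for p, y in zip(preds, labels))
    fp = sum(p and (not y) for p, y in zip(preds, labels))
    return tp / (tp + fn), tn / (tn + fp)

markets = [
    {"waste_removed_daily": False, "zoned": True,  "urban_location": False},
    {"waste_removed_daily": True,  "zoned": True,  "urban_location": False},
    {"waste_removed_daily": True,  "zoned": False, "urban_location": True},
    {"waste_removed_daily": True,  "zoned": True,  "urban_location": True},
]
labels = [True, False, True, False]  # toy ground truth: contaminated?
sens, spec = sensitivity_specificity(markets, labels)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

Choosing between a high-sensitivity and a high-specificity tree then comes down to which error, missed contaminated markets or wasted interventions, is costlier at that stage of the campaign.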

  20. metAlignID: A high-throughput software tool set for automated detection of trace-level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data

    NARCIS (Netherlands)

    Lommen, A.; Kamp, van der H.J.; Kools, H.J.; Lee, van der M.K.; Weg, van der G.

    2012-01-01

    A new alternative data processing tool set, metAlignID, is developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has b

  1. Development of a Cancer Risk Prediction Tool for Use in the UK Primary Care and Community Settings.

    Science.gov (United States)

    Lophatananon, Artitaya; Usher-Smith, Juliet; Campbell, Jackie; Warcaba, Joanne; Silarova, Barbora; Waters, Erika A; Colditz, Graham A; Muir, Kenneth R

    2017-07-01

    Several multivariable risk prediction models have been developed to assess an individual's risk of developing specific cancers. Such models can be used in a variety of settings for prevention, screening, and guiding investigations and treatments. Models aimed at predicting future disease risk that contain lifestyle factors may be of particular use for targeting health promotion activities at an individual level. This type of cancer risk prediction is not yet available in the UK. We have adapted the approach used by the well-established U.S.-derived "YourCancerRisk" model for use in the UK population, which allows users to quantify their individual risk of developing individual cancers relative to the population average risk. The UK version of "YourCancerRisk" computes 10-year cancer risk estimates for 11 cancers utilizing UK figures for the prevalence of risk factors and cancer incidence. Because the prevalence of risk factors and the incidence rates for cancer differ between the U.S. and UK populations, this UK model provides more accurate estimates of risk for a UK population. Using the example of breast cancer and data from the UK Biobank cohort, we demonstrate that the individual risk factor estimates are similar for the U.S. and UK populations. Assessment of the performance and validation of the multivariate model predictions based on a binary score confirm the model's applicability. The model can be used to estimate absolute and relative cancer risk for use in Primary Care and community settings and is being used in the community to guide lifestyle change. Cancer Prev Res; 10(7); 421-30. ©2017 AACR. ©2017 American Association for Cancer Research.
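
A "YourCancerRisk"-style estimate is typically computed by scaling a population baseline risk by per-factor relative risks. The sketch below illustrates that arithmetic; the baseline incidence and relative-risk values are invented for illustration and are not the published model.

```python
# Sketch of relative-risk scaling as used by "YourCancerRisk"-style models:
# 10-year absolute risk = population baseline * product of per-factor
# relative risks. All numeric values below are invented placeholders.

def ten_year_risk(baseline_risk, relative_risks):
    """Absolute 10-year risk for one cancer, given a dict of relative risks."""
    risk = baseline_risk
    for rr in relative_risks.values():
        risk *= rr
    return risk

baseline = 0.02                                # assumed UK 10-year baseline
factors = {"smoker": 1.5, "bmi_over_30": 1.2}  # assumed relative risks
absolute = ten_year_risk(baseline, factors)
print(f"10-year risk: {absolute:.3f} (relative risk {absolute / baseline:.2f}x)")
```

Using UK rather than U.S. baseline incidence and risk-factor prevalence is exactly what changes in the UK adaptation: the multiplicative structure stays the same.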

  2. Quantitative and qualitative assessment of diurnal variability in tropospheric humidity using SAPHIR on-board Megha-Tropiques

    Science.gov (United States)

    Uma, K. N.; Das, Siddarth Shankar

    2016-08-01

    The global diurnal variability of relative humidity (RH) from August 2012 to May 2014 is discussed for the first time using the 'Sounder for Atmospheric Profiling of Humidity in the Inter-tropical Regions' (SAPHIR), a microwave humidity sounder onboard Megha-Tropiques (MT). It is superior to other microwave satellite humidity sounders in terms of its higher repetitive cycle in the tropics, owing to its low-inclination orbit and the availability of six dedicated humidity sounding channels. The six layers obtained are 1000-850, 850-700, 700-550, 550-400, 400-250 and 250-100 hPa. Three-hourly data over a month have been combined using equivalent day analysis to attain a composite profile of the complete diurnal cycle in each grid (2.5°×2.5°). A distinct diurnal variation is obtained over the continental and oceanic regions at all layers. The magnitudes of the lower tropospheric humidity (LTH), middle tropospheric humidity (MTH) and upper tropospheric humidity (UTH) show a large variability over the continental regions compared to that over oceans. The monthly variability of the diurnal variation over the years has also been discussed by segregating the data into five different continental and four different oceanic regions. Afternoon peaks dominate in the LTH over the land and desert regions. The MTH is found to vary between the evening and early morning hours over different geographical regions and is not as consistent as the LTH. The UTH maximum magnitude is generally observed during the early morning hours over the continents. Interestingly, the oceanic regions are found to have a dominant magnitude in the afternoon hours in the LTH, similar to the continents, an evening maximum in the MTH and an early morning maximum in the UTH. The underlying mechanisms involved in the variability of humidity over different regions are also discussed. The study reveals the complexity involved in understanding the diurnal variability over the continents and open
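
The "equivalent day" compositing described above amounts to binning a month of observations by local hour of day and averaging within each bin. A minimal sketch with synthetic data:

```python
from collections import defaultdict

# Sketch of "equivalent day" compositing: observations taken at different
# local times across a month are binned by hour of day and averaged per bin.
# The observation list below is synthetic.

def diurnal_composite(obs):
    """obs: iterable of (local_hour, rh_percent) -> {hour: mean RH}."""
    bins = defaultdict(list)
    for hour, rh in obs:
        bins[hour].append(rh)
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(bins.items())}

obs = [(0, 60.0), (3, 55.0), (0, 64.0), (3, 57.0)]  # (local hour, RH %)
print(diurnal_composite(obs))
```

MT's low-inclination, precessing orbit is what populates all hour bins within a month, which fixed sun-synchronous sounders cannot do.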

  3. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation Electricore, Inc.

    Energy Technology Data Exchange (ETDEWEB)

    Daye, Tony [Green Power Labs (GPL), San Diego, CA (United States)

    2013-09-30

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with the emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate direct operational cost savings for Energy Marketing for day-ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for day ahead), Demand Response, long-term planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  4. Tool for evaluating research implementation challenges: a sense-making protocol for addressing implementation challenges in complex research settings.

    Science.gov (United States)

    Simpson, Kelly M; Porter, Kristie; McConnell, Eleanor S; Colón-Emeric, Cathleen; Daily, Kathryn A; Stalzer, Alyson; Anderson, Ruth A

    2013-01-02

    Many challenges arise in complex organizational interventions that threaten research integrity. This article describes a Tool for Evaluating Research Implementation Challenges (TECH), developed using a complexity science framework to assist research teams in assessing and managing these challenges. During the implementation of a multi-site, randomized controlled trial (RCT) of organizational interventions to reduce resident falls in eight nursing homes, we inductively developed and later codified the TECH. The TECH was developed through processes that emerged from interactions among research team members and nursing home staff participants, including a purposive use of complexity science principles. The TECH provided a structure to assess challenges systematically, consider their potential impact on intervention feasibility and fidelity, and determine actions to take. We codified the process into an algorithm that can be adopted or adapted for other research projects. We present selected examples of the use of the TECH that are relevant to many complex interventions. Complexity theory provides a useful lens through which research procedures can be developed to address implementation challenges that emerge from complex organizations and research designs. Sense-making is a group process in which diverse members interpret challenges when available information is ambiguous; the groups' interpretations provide cues for taking action. Sense-making facilitates the creation of safe environments for generating innovative solutions that balance research integrity and practical issues. The challenges encountered during implementation of complex interventions are often unpredictable; however, adoption of a systematic process allows investigators to address them in a consistent yet flexible manner, protecting fidelity. Research integrity is also protected by allowing for appropriate adaptations to intervention protocols that preserve the feasibility of 'real world

  5. PatentMatrix: an automated tool to survey patents related to large sets of genes or proteins

    Directory of Open Access Journals (Sweden)

    de Rinaldis Emanuele

    2007-09-01

    Background: The number of patents associated with genes and proteins, and the amount of information contained in each patent, often present a real obstacle to the rapid evaluation of the novelty of findings associated to genes from an intellectual property (IP) perspective. This assessment, normally carried out by expert patent professionals, can therefore become cumbersome and time-consuming. Here we present PatentMatrix, a novel software tool for the automated analysis of patent sequence text entries. Methods and Results: PatentMatrix is written in the Awk language and requires installation of the Derwent GENESEQ™ patent sequence database under the sequence retrieval system SRS. The software takes two files as input: (i) a list of genes or proteins with the associated GENESEQ™ patent sequence accession numbers; (ii) a list of keywords describing the research context of interest (e.g. 'lung', 'cancer', 'therapeutics', 'diagnostics'). The GENESEQ™ database is interrogated through the SRS system and each patent entry of interest is screened for the occurrence of user-defined keywords. The software also extracts from the GENESEQ™ database the basic information needed for a preliminary assessment of each patent's IP coverage. As output, two tab-delimited files are generated, providing the user with a detailed and an aggregated view of the results. An example is given in which the IP position of five genes is evaluated in the context of 'development of antibodies for cancer treatment'. Conclusion: PatentMatrix allows a rapid survey of patents associated with genes or proteins in a particular area of interest as defined by keywords. It can be efficiently used to evaluate the IP-related novelty of scientific findings and to rank genes or proteins according to their IP position.
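The keyword-screening step described above can be sketched in a few lines. This is a hypothetical Python analogue for illustration only: the actual tool is written in Awk against the GENESEQ™ database under SRS, and the function name, accession numbers, and entry texts here are invented.

```python
def screen_patents(patent_entries, keywords):
    """Flag patent entries whose text mentions any of the context keywords."""
    hits = {}
    for accession, text in patent_entries.items():
        lowered = text.lower()
        matched = [kw for kw in keywords if kw.lower() in lowered]
        if matched:  # keep only entries relevant to the research context
            hits[accession] = matched
    return hits

# Invented example entries keyed by fictitious patent sequence accession numbers
entries = {
    "AAB12345": "Antibody compositions for the treatment of lung cancer ...",
    "AAB67890": "Polynucleotide encoding a plant growth regulator ...",
}
print(screen_patents(entries, ["lung", "cancer", "therapeutics"]))
# {'AAB12345': ['lung', 'cancer']}
```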

  6. Steganography and steganalysis in voice-over IP scenarios: operational aspects and first experiences with a new steganalysis tool set

    Science.gov (United States)

    Dittmann, Jana; Hesse, Danny; Hillert, Reyk

    2005-03-01

    Based on the knowledge and experience from existing image steganalysis techniques, the overall objective of the paper is to evaluate existing audio steganography, with a special focus on attacks in ad-hoc end-to-end media communications, using Voice over IP (VoIP) scenarios as an example. One aspect is to understand the operational requirements of recent steganographic techniques for VoIP applications. The other is to elaborate possible steganalysis approaches applied to speech data. In particular, we have examined existing VoIP applications with respect to their extensibility to steganographic algorithms. We have also paid attention to steganalysis of PCM audio data, which allows us to detect hidden communication during a running VoIP call using the PCM codec. In our implementation we use Jori Liesenborgs' Voice over IP library (JVOIPLIB), which provides primitives for voice over IP communication. Finally, we show first results of our prototype implementation, which extends the common VoIP scenario with the new feature of steganography. We also show results for our PCM steganalyzer framework, which is able to detect this kind of hidden communication using a set of 13 first- and second-order statistics.
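The kind of first- and second-order statistics such a steganalyzer computes can be illustrated with a minimal sketch. This is a generic stand-in, not the paper's actual 13-feature set, and the function name is invented:

```python
def pcm_features(samples):
    """Toy first-order (mean, variance) and second-order (lag-1 autocovariance)
    statistics over a PCM sample sequence, of the kind used as steganalysis features."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    # second-order statistic: covariance between adjacent samples
    lag1 = sum((samples[i] - mean) * (samples[i + 1] - mean)
               for i in range(n - 1)) / (n - 1)
    return {"mean": mean, "variance": variance, "lag1_autocov": lag1}

print(pcm_features([0, 1, 0, 1]))
# {'mean': 0.5, 'variance': 0.25, 'lag1_autocov': -0.25}
```

Hidden payloads tend to perturb exactly these kinds of statistics relative to clean cover audio, which is what makes them usable as detection features.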

  7. [Prioritizing prescriptions in an ambulatory care setting: a tool to achieve appropriateness of care in public health management].

    Science.gov (United States)

    Semeraro, V; Zantedeschi, E; Pasquarella, A; Guerrera, C; Guasticchi, G

    2012-01-01

    Waiting lists are one of the main public health issues in developed countries. To promote appropriateness in general practitioners' (GPs') prescriptions, the project "Priority setting in outpatient prescriptions" was approved in the Latium Region during 2009. Regional referees, Latium Public Health Agency managers and advisors, managers and advisors of three Local Public Health Units (LPHUs) within the Latium region, and some voluntarily recruited GPs (each with more than 800 enrolled patients) formed a team charged with developing the project. During two selected months of 2010, 46 GPs forwarded 2,229 medical prescriptions in total. The six most frequent prescriptions were picked out and analyzed by the team. Of these prescriptions, 42% were identified as belonging to priority category D ("standard"), while 42% and 41% bore the indications "control" and "diagnostic purpose" respectively. Among these, 75% were bilateral mammography, prescribed to women aged between 50 and 69 years; for this group, bilateral mammography is already provided free of charge within the regional breast cancer screening program, making routine prescription by their physician a useless duplication, unacceptable in a healthcare system of good quality. At the conclusion of the project, the team therefore suggests that proper standards be applied by healthcare professionals and GPs to achieve a significant objective: appropriateness of mammography prescriptions.

  8. SAPHIRE: Stress and Pulmonary Hypertension in Rheumatoid Evaluation—A Prevalence Study

    Directory of Open Access Journals (Sweden)

    G. E. M. Reeves

    2016-01-01

    Pulmonary artery hypertension (PAH) is a disorder of elevated resistance in the pulmonary arterial vessels, reflected by elevation of measured pulmonary artery pressure (PAP), and presenting with breathlessness and, if untreated, progressing to right heart failure and death. The heightened prevalence of PAH in populations with underlying systemic autoimmune conditions, particularly scleroderma and its variants, is well recognised, consistent with the proposed autoimmune contribution to PAH pathogenesis, along with disordered thrombotic, inflammatory, and mitogenic factors. Rheumatoid arthritis (RA) is one of a group of systemic autoimmune conditions featuring inflammatory symmetrical erosive polyarthropathy as its hallmark. This study explored the prevalence of PAH in a population of unselected individuals with RA, using exercise echocardiography (EchoCG). The high prevalence of EchoCG-derived elevation of PAP (EDEPP) in this population (14%) suggests that, like other autoimmune conditions, RA may be a risk factor for PAH. Patients with RA may therefore represent another population for whom PAH screening with noninvasive tools such as EchoCG may be justified.

  9. Evaluation of visual inspection with acetic acid and Lugol's iodine as cervical cancer screening tools in a low-resource setting.

    Science.gov (United States)

    Qureshi, Sabuhi; Das, Vinta; Zahra, Fatima

    2010-01-01

    In view of the failure of cytology screening programmes for cervical cancer in developing countries, the World Health Organization suggested unaided visual inspection of the cervix after an application of acetic acid (VIA) and Lugol's iodine (VILI) as alternative screening methods. Our study evaluates the effectiveness of VIA and VILI compared to the Pap smear as screening methods for carcinoma of the cervix in a low-resource setting. Three hundred and twenty-eight women were subjected to a Pap smear test, VIA, VILI and colposcopy. The results were as follows: Pap smear test (sensitivity 20.83%, specificity 98.38%), VIA (sensitivity 55.5%, specificity 71.39%) and VILI (sensitivity 86.84%, specificity 48.93%). Although VIA and VILI are less specific than the Pap smear test, they are more sensitive in detecting pre-invasive lesions. Hence VIA and VILI can be used as cervical cancer screening tools in low-resource settings.
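Sensitivity and specificity figures like those above come from a standard 2x2 screening table; a minimal sketch, with illustrative counts rather than the study's actual data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts for a screening test judged against a reference standard
sensitivity, specificity = sens_spec(tp=33, fn=5, tn=46, fp=48)
print(round(sensitivity * 100, 2), round(specificity * 100, 2))
# 86.84 48.94
```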

  10. Measurement of the reaction {gamma}p{yields}K{sup 0}{sigma}{sup +} for photon energies up to 2.65 GeV with the SAPHIR detector at ELSA; Messung der Reaktion {gamma}p {yields} K{sup 0}{sigma}{sup +} fuer Photonenergien bis 2.65 GeV mit dem SAPHIR-Detektor an ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Lawall, R.

    2004-01-01

    The reaction {gamma}p {yields} K{sup 0}{sigma}{sup +} was measured with the SAPHIR-detector at ELSA during the run periods 1997 and 1998. Results were obtained for cross sections in the photon energy range from threshold up to 2.65 GeV for all production angles and for the {sigma}{sup +}-polarization. Emphasis has been put on the determination and reduction of the contributions of background reactions and the comparison with other measurements and theoretical predictions. (orig.)

  11. Disease management index of potential years of life lost as a tool for setting priorities in national disease control using OECD health data.

    Science.gov (United States)

    Jang, Sung-In; Nam, Jung-Mo; Choi, Jongwon; Park, Eun-Cheol

    2014-03-01

    Limited healthcare resources make it necessary to maximize efficiency in disease management at the country level by setting priorities according to disease burden. To make the best priority settings, it is necessary to measure health status and have standards for judging it, as well as to consider disease management trends among nations. We used years of potential life lost (YPLL) for 17 International Classification of Diseases (ICD) categories from Organization for Economic Co-operation and Development (OECD) health data for 2012, YPLL for 37 disease diagnoses from OECD health data for 2009 across 22 countries, and disability-adjusted life years (DALY) from the World Health Organization (WHO). We scaled each YPLL per disease in a nation onto a range of -1 to 1 (position value for relative comparison, PARC). Changes over 5 years were also accounted for in a disease management index (DMI). In terms of ICD categories, the DMI indicated specific priority-setting areas for different countries with regard to managing disease treatment and diagnosis. Our study suggests that DMI is a realistic index that reflects trend changes over the past 5 years to the present state, and PARC is an easy index for identifying relative status. Moreover, unlike existing indices, DMI and PARC make multiple comparisons among countries and diseases straightforward. DMI and PARC are therefore useful tools for policy implications and for future studies incorporating them and other existing indexes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
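A PARC-style position value can be sketched as min-max scaling of a disease's YPLL across countries onto [-1, 1]. This is an assumed form for illustration; the paper's exact formula may differ, and the function name and figures are invented:

```python
def parc(ypll_by_country, country):
    """Scale one country's YPLL for a disease onto [-1, 1] relative to all
    countries: -1 = lowest YPLL (best relative status), +1 = highest (worst)."""
    values = ypll_by_country.values()
    lo, hi = min(values), max(values)
    return 2 * (ypll_by_country[country] - lo) / (hi - lo) - 1

ypll = {"A": 1000, "B": 3000, "C": 5000}   # invented YPLL per 100,000 population
print(parc(ypll, "A"), parc(ypll, "B"), parc(ypll, "C"))
# -1.0 0.0 1.0
```

A bounded relative index like this is what makes side-by-side comparison across many countries and diseases easy, which is the property the abstract highlights.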

  12. De-MetaST-BLAST: a tool for the validation of degenerate primer sets and data mining of publicly available metagenomes.

    Directory of Open Access Journals (Sweden)

    Christopher A Gulvik

    Development and use of primer sets to amplify nucleic acid sequences of interest is fundamental to studies spanning many life science disciplines. As such, the validation of primer sets is essential. Several computer programs have been created to aid in the initial selection of primer sequences that may or may not require multiple nucleotide combinations (i.e., degeneracies). Conversely, validation of primer specificity has remained largely unchanged for several decades, and there are currently few available programs that allow evaluation of primers containing degenerate nucleotide bases. To alleviate this gap, we developed the program De-MetaST, which performs an in silico amplification using user-defined nucleotide sequence dataset(s) and primer sequences that may contain degenerate bases. The program returns an output file that contains the in silico amplicons. When De-MetaST is paired with NCBI's BLAST (De-MetaST-BLAST), the program also returns the top 10 nr NCBI database hits for each recovered in silico amplicon. While the original motivation for development of this search tool was degenerate primer validation using the wealth of nucleotide sequences available in environmental metagenome and metatranscriptome databases, this search tool has potential utility in many data mining applications.
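The core idea, degenerate IUPAC bases expanded into regular-expression character classes for in silico amplification, can be sketched as follows. This is a simplified Python analogue, not De-MetaST's actual implementation: function names are invented, and only forward-strand amplicons are reported.

```python
import re

# Standard IUPAC degeneracy codes mapped to regex character classes
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "[AG]", "Y": "[CT]",
         "S": "[GC]", "W": "[AT]", "K": "[GT]", "M": "[AC]", "B": "[CGT]",
         "D": "[AGT]", "H": "[ACT]", "V": "[ACG]", "N": "[ACGT]"}
# Complement map including degenerate codes (e.g. R = A/G complements to Y = C/T)
COMP = {"A": "T", "T": "A", "G": "C", "C": "G", "R": "Y", "Y": "R", "S": "S",
        "W": "W", "K": "M", "M": "K", "B": "V", "V": "B", "D": "H", "H": "D",
        "N": "N"}

def primer_regex(primer):
    return "".join(IUPAC[b] for b in primer.upper())

def revcomp(primer):
    return "".join(COMP[b] for b in reversed(primer.upper()))

def in_silico_amplicons(template, fwd, rev):
    """Amplicon = forward primer ... reverse complement of the reverse primer."""
    pattern = primer_regex(fwd) + "[ACGT]*?" + primer_regex(revcomp(rev))
    return [m.group(0) for m in re.finditer(pattern, template.upper())]

# 'GARTTC' (R = A or G) matches the EcoRI-like site 'GAATTC' in this toy template
print(in_silico_amplicons("CCGAATTCTTTGGATCCAA", "GARTTC", "TTGGATCC"))
# ['GAATTCTTTGGATCCAA']
```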

  13. A standard operating protocol (SOP) and minimum data set (MDS) for nursing and medical handover: considerations for flexible standardization in developing electronic tools.

    Science.gov (United States)

    Turner, Paul; Wong, Ming Chao; Yee, Kwang Chien

    2009-01-01

    As part of Australia's participation in the World Health Organization, the Australian Commission on Safety and Quality in Health Care (ACSQHC) is the leading federal government technical agency involved in clinical handover improvement. The ACSQHC has funded a range of handover improvement projects in Australia, including one at the Royal Hobart Hospital (RHH), Tasmania. The RHH project aims to investigate the potential for generalizable and transferable clinical handover solutions across the medical and nursing disciplines. More specifically, this project produced an over-arching minimum data set (MDS) and an over-arching standard operating protocol (SOP), based on research on nursing and medical shift-to-shift clinical handover in general medicine, general surgery and emergency medicine. The over-arching MDS consists of five headings: situational awareness; patient identification; history and information; responsibility and tasks; and accountability. The over-arching SOP has five phases: preparation, design, implementation, evaluation, and maintenance. This paper provides an overview of the project and the approach taken. It considers the implications of these standard operating protocols and minimum data sets for developing electronic clinical handover support tools. Significantly, the paper highlights a human-centred design approach that actively involves medical and nursing staff in data collection, analysis, interpretation, and systems design. This approach reveals the dangers of info-centrism when considering electronic tools, as information emerges as only one factor among many that influence the efficiency and effectiveness of clinical handover.

  14. To select the best tool for generating 3D maintenance data and to set the detailed process for obtaining the 3D maintenance data

    Science.gov (United States)

    Prashanth, B. N.; Roy, Kingshuk

    2017-07-01

    Three-dimensional (3D) maintenance data provides a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to repeatedly page through huge paper manuals, or to keep running to a computer, while maintaining a machine, which makes the maintenance work fatiguing. A 3D animation, by contrast, makes maintenance work very simple, and there is no language barrier. This research deals with the generation of 3D maintenance data for any given machine. The best tool for obtaining 3D maintenance data is selected and analyzed, and using that tool a detailed process for extracting the 3D maintenance data for any machine is set out. 3D maintenance reduces reliance on large volumes of manuals, which invite human error and make an operator's work fatiguing; it would therefore help in training and maintenance and would increase productivity. Compared with Cortona 3D and Deep Exploration, 3Dvia proves to be the better tool: it is good at data translation, has the best renderings of the three, is very user friendly, and offers various options for creating 3D animations. Its Interactive Electronic Technical Publication (IETP) integration is also better than that of the other two. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.

  15. Comparison of Relative Humidity obtained from SAPHIR on board Megha-Tropiques and Ground based Microwave Radiometer Profiler over an equatorial station

    Science.gov (United States)

    Renju, Ramachandran Pillai; Uma, K. N.; Krishna Moorthy, K.; Mathew, Nizy; Raju C, Suresh

    A comparison has been made between relative humidity (RH, %) derived from SAPHIR on board Megha-Tropiques (MT) and that derived from a ground-based multi-frequency Microwave Radiometer Profiler (MRP) over the equatorial station Thiruvananthapuram (8.5°N, 76.9°E) for a one-year period. As a first step, the MRP was validated against radiosondes for two years (2010 and 2011) during the Indian monsoon period July-September. This analysis shows a wet bias below 6 km and a dry bias above. The comparison between the MRP- and MT-derived RH was made at five altitude levels (0.75, 2.25, 4.0, 6.25 and 9.2 km) strictly under clear-sky conditions. Regression analysis between the two reveals very good correlation (>0.8) in the 2.25 to 6.25 km layer. The differences between the two observations are also explained in terms of percentage of occurrence between MT and MRP at each altitude layer: about 70-80% of the time, the RH difference is below 10% at the first three layers, and an RMSE of 2% is observed at almost all height layers. The differences are attributed to the different measurement and retrieval techniques of the ground-based and satellite-based systems. Since the MRP frequency channels are not sensitive to small water vapor variability above 6 km, large differences are observed there. Radiative transfer computations for the channels of both MRP and SAPHIR will be carried out to understand these variabilities.
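The comparison statistics used in studies like this one (correlation, bias, RMSE between paired RH values at one altitude layer) can be sketched as follows; the values are invented, and the function is a generic illustration rather than part of either retrieval system:

```python
import math

def compare_rh(rh_a, rh_b):
    """Pearson correlation, mean bias and RMSE between two co-located RH series."""
    n = len(rh_a)
    ma, mb = sum(rh_a) / n, sum(rh_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(rh_a, rh_b))
    corr = cov / math.sqrt(sum((a - ma) ** 2 for a in rh_a) *
                           sum((b - mb) ** 2 for b in rh_b))
    bias = ma - mb
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(rh_a, rh_b)) / n)
    return corr, bias, rmse

# Invented RH (%) pairs, e.g. ground-based vs. satellite values at one layer
corr, bias, rmse = compare_rh([10, 20, 30], [12, 22, 32])
print(corr, bias, rmse)  # 1.0 -2.0 2.0
```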

  16. The cost-effectiveness of monitoring strategies for antiretroviral therapy of HIV infected patients in resource-limited settings: software tool.

    Directory of Open Access Journals (Sweden)

    Janne Estill

    The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART under different monitoring strategies, with a focus on POC-VL monitoring. We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line; clinical monitoring; CD4 monitoring, with or without targeted VL; POC-VL; and laboratory-based VL monitoring, at different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs), and calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-5488/DALY averted and that of VL monitoring US$951-5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except at the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART.
    Our Excel tool is useful for determining optimal
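The ICERs quoted above follow the standard definition: incremental cost divided by incremental DALYs averted between two strategies. A minimal worked sketch with invented cohort totals, not the model's actual outputs:

```python
def icer(cost_new, cost_old, dalys_new, dalys_old):
    """Incremental cost-effectiveness ratio in US$/DALY averted."""
    return (cost_new - cost_old) / (dalys_new - dalys_old)

# Invented lifetime totals: a more expensive strategy averting more DALYs
print(round(icer(cost_new=900_000, cost_old=750_000,
                 dalys_new=180, dalys_old=90), 1))
# 1666.7
```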

  17. Transdisciplinary Research on Cancer-Healing Systems Between Biomedicine and the Maya of Guatemala: A Tool for Reciprocal Reflexivity in a Multi-Epistemological Setting.

    Science.gov (United States)

    Berger-González, Mónica; Stauffacher, Michael; Zinsstag, Jakob; Edwards, Peter; Krütli, Pius

    2016-01-01

    Transdisciplinarity (TD) is a participatory research approach in which actors from science and society work closely together. It offers means for promoting knowledge integration and finding solutions to complex societal problems, and can be applied within a multiplicity of epistemic systems. We conducted a TD process from 2011 to 2014 between indigenous Mayan medical specialists from Guatemala and Western biomedical physicians and scientists to study cancer. Given the immense cultural gap between the partners, it was necessary to develop new methods to overcome biases induced by ethnocentric behaviors and power differentials. This article describes this intercultural cooperation and presents a method of reciprocal reflexivity (Bidirectional Emic-Etic tool) developed to overcome them. As a result of application, researchers observed successful knowledge integration at the epistemic level, the social-organizational level, and the communicative level throughout the study. This approach may prove beneficial to others engaged in facilitating participatory health research in complex intercultural settings.

  18. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    Science.gov (United States)

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  19. Using the lives saved tool (LiST) to model mHealth impact on neonatal survival in resource-limited settings.

    Science.gov (United States)

    Jo, Youngji; Labrique, Alain B; Lefevre, Amnesty E; Mehl, Garrett; Pfaff, Teresa; Walker, Neff; Friberg, Ingrid K

    2014-01-01

    While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale up efforts and investment in ways to help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST)--an evidence-based modeling software--to identify priority areas for maternal and neonatal health services, by formulating six individual and combined interventions scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives.

  1. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', designed for debugging programs for scientific computing, has been improved in two respects: (1) shortening the elapsed time required to get appropriate data to visualize; (2) adding new functions to compare and/or combine a set of visualized data originating from two or more different programs. On shortening the elapsed time for getting data, the improved version of 'vdebug' achieved over a hundred-fold speedup with dbx and pdbx on the SX-4, and over a ten-fold speedup with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that we could easily check the consistency between the computational results obtained at each calculation step on two different computers, SP and ONYX. In this report we illustrate, with an example, how the tool 'vdebug' has been improved. (author)
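The cross-machine consistency check that the compare functions support can be illustrated abstractly; this is a generic numeric sketch under an assumed tolerance, not vdebug's actual interface:

```python
def consistent(results_a, results_b, tol=1e-6):
    """Check that two programs' step-by-step results agree within a tolerance,
    as when comparing runs of the same computation on two different computers."""
    return all(abs(a - b) <= tol for a, b in zip(results_a, results_b))

# Results from 'machine A' and 'machine B' differing only by rounding noise
print(consistent([1.0, 2.5, 3.25], [1.0, 2.5, 3.25 + 1e-9]))  # True
```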

  2. MicroPattern: a web-based tool for microbe set enrichment analysis and disease similarity calculation based on a list of microbes

    Science.gov (United States)

    Ma, Wei; Huang, Chuanbo; Zhou, Yuan; Li, Jianwei; Cui, Qinghua

    2017-01-01

    The microbiota colonizing the human body is renowned as “a forgotten organ” due to its large impact on human health and disease. Recently, microbiome studies have identified a large number of microbes differentially regulated in a variety of conditions, such as disease and diet. However, methods for discovering biological patterns in the differentially regulated microbes are still limited. For this purpose, we developed a web-based tool named MicroPattern to discover biological patterns for a list of microbes. In addition, MicroPattern implements and integrates an algorithm we previously presented for the calculation of disease similarity based on disease-microbe association data. MicroPattern first groups microbes into different sets based on their associated diseases and colonized positions. Then, for a given list of microbes, MicroPattern performs enrichment analysis of the given microbes against all of the microbe sets. Using MicroPattern, we can also calculate disease similarity based on shared microbe associations. Finally, we confirmed the accuracy and usefulness of MicroPattern by applying it to the microbes changed under an animal-based diet. MicroPattern is freely available at http://www.cuilab.cn/micropattern. PMID:28071710
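Set enrichment of this kind is typically scored with a hypergeometric tail probability; a sketch of that general approach follows. The abstract does not specify MicroPattern's exact statistic, and the function name here is invented:

```python
from math import comb

def enrichment_p(universe, set_size, query, overlap):
    """P(X >= overlap) for X ~ Hypergeometric(universe, set_size, query):
    the chance a random query of this size hits the set at least `overlap` times."""
    total = comb(universe, query)
    tail = sum(comb(set_size, k) * comb(universe - set_size, query - k)
               for k in range(overlap, min(set_size, query) + 1))
    return tail / total

# Tiny worked case: universe of 4 microbes, set of 2, query of 2, both in the set
print(enrichment_p(universe=4, set_size=2, query=2, overlap=2))
# 0.16666666666666666
```

A small p-value indicates the query microbes overlap the set (e.g. microbes linked to one disease or body site) more than chance would predict.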

  3. CRISPR multitargeter: a web tool to find common and unique CRISPR single guide RNA targets in a set of similar sequences.

    Directory of Open Access Journals (Sweden)

    Sergey V Prykhozhij

    Genome engineering has been revolutionized by the discovery of clustered regularly interspaced palindromic repeats (CRISPR) and CRISPR-associated system genes (Cas) in bacteria. The type IIB Streptococcus pyogenes CRISPR/Cas9 system functions in many species, and additional types of CRISPR/Cas systems are under development. In the type II system, expression of a CRISPR single guide RNA (sgRNA) targeting a defined sequence, together with Cas9, generates a sequence-specific nuclease inducing small deletions or insertions. Moreover, knock-in of large DNA inserts has been shown at the sites targeted by sgRNAs and Cas9. Several tools are available for designing sgRNAs that target unique locations in the genome. However, the ability to find sgRNA targets common to several similar sequences or, by contrast, unique to each of these sequences, would also be advantageous. To provide such a tool for several types of CRISPR/Cas system and many species, we developed the CRISPR MultiTargeter software. The similar DNA sequences in question are duplicated genes and sets of exons of different transcripts of a gene. Thus, we implemented a basic sgRNA target search of input sequences for single-sgRNA and two-sgRNA/Cas9 nickase targeting, as well as common and unique sgRNA target searches in (1) a set of input sequences; (2) a set of similar genes or transcripts; or (3) the transcripts of a single gene. We demonstrate potential uses of the program by identifying unique isoform-specific sgRNA sites in 71% of zebrafish alternative transcripts and common sgRNA target sites in approximately 40% of zebrafish duplicated gene pairs. The design of unique targets in alternative exons is helpful because it will facilitate functional genomic studies of transcript isoforms. Similarly, its application to duplicated genes may simplify multi-gene mutational targeting experiments. Overall, this program provides a unique interface that will enhance use of CRISPR/Cas technology.
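A common-target search of this kind can be sketched as set intersection over SpCas9-style protospacers (20 nt immediately followed by an NGG PAM). This is a simplified Python analogue, forward strand only; the real software also handles the reverse strand, other CRISPR/Cas types, and unique-target searches:

```python
import re

def sgrna_sites(seq):
    """All 20-nt protospacers immediately followed by an NGG PAM (forward strand).
    A lookahead is used so overlapping sites are all captured."""
    return {m.group(1)
            for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq.upper())}

def common_targets(sequences):
    """sgRNA targets present in every input sequence."""
    return set.intersection(*(sgrna_sites(s) for s in sequences))

# Two toy 'duplicated gene' sequences sharing one protospacer (invented 20-mer)
site = "GATTACAGATTACAGATTAC"
seqs = [site + "TGG", "CC" + site + "AGGTT"]  # PAMs TGG and AGG respectively
print(common_targets(seqs))
# {'GATTACAGATTACAGATTAC'}
```

Unique targets per sequence follow by subtracting the union of the other sequences' site sets instead of intersecting.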

  4. CRISPR multitargeter: a web tool to find common and unique CRISPR single guide RNA targets in a set of similar sequences.

    Science.gov (United States)

    Prykhozhij, Sergey V; Rajan, Vinothkumar; Gaston, Daniel; Berman, Jason N

    2015-01-01

Genome engineering has been revolutionized by the discovery of clustered regularly interspaced palindromic repeats (CRISPR) and CRISPR-associated system genes (Cas) in bacteria. The type IIB Streptococcus pyogenes CRISPR/Cas9 system functions in many species and additional types of CRISPR/Cas systems are under development. In the type II system, expression of CRISPR single guide RNA (sgRNA) targeting a defined sequence and Cas9 generates a sequence-specific nuclease inducing small deletions or insertions. Moreover, knock-in of large DNA inserts has been shown at the sites targeted by sgRNAs and Cas9. Several tools are available for designing sgRNAs that target unique locations in the genome. However, the ability to find sgRNA targets common to several similar sequences or, by contrast, unique to each of these sequences, would also be advantageous. To provide such a tool for several types of CRISPR/Cas system and many species, we developed the CRISPR MultiTargeter software. The similar DNA sequences in question are duplicated genes and sets of exons of different transcripts of a gene. Thus, we implemented a basic sgRNA target search of input sequences for single-sgRNA and two-sgRNA/Cas9 nickase targeting, as well as common and unique sgRNA target searches in 1) a set of input sequences; 2) a set of similar genes or transcripts; or 3) transcripts of a single gene. We demonstrate potential uses of the program by identifying unique isoform-specific sgRNA sites in 71% of zebrafish alternative transcripts and common sgRNA target sites in approximately 40% of zebrafish duplicated gene pairs. The design of unique targets in alternative exons is helpful because it will facilitate functional genomic studies of transcript isoforms. Similarly, its application to duplicated genes may simplify multi-gene mutational targeting experiments. Overall, this program provides a unique interface that will enhance use of CRISPR/Cas technology.
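The common/unique search the abstract describes amounts to set operations over candidate protospacer sites. A minimal sketch (not the MultiTargeter implementation; it assumes SpCas9's NGG PAM, scans the forward strand only, and uses hypothetical helper names):

```python
import re

def sgrna_sites(seq):
    """Return the set of 20-nt sgRNA targets followed by an NGG PAM
    (forward strand only, overlapping matches included)."""
    return {m.group(1) for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", seq)}

def common_and_unique(seqs):
    """Split candidate targets into those shared by all input sequences
    and those unique to each individual sequence."""
    sites = [sgrna_sites(s) for s in seqs]
    common = set.intersection(*sites)
    unique = [s - set.union(*(o for j, o in enumerate(sites) if j != i))
              for i, s in enumerate(sites)]
    return common, unique
```

Extending this to both strands and to other PAM types is a matter of adding the reverse complement and swapping the PAM pattern.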

  5. Evaluation of circulating cathodic antigen (CCA) urine-cassette assay as a survey tool for Schistosoma mansoni in different transmission settings within Bugiri District, Uganda.

    Science.gov (United States)

    Adriko, M; Standley, C J; Tinkitina, B; Tukahebwa, E M; Fenwick, A; Fleming, F M; Sousa-Figueiredo, J C; Stothard, J R; Kabatereine, N B

    2014-08-01

    Diagnosis of schistosomiasis at the point-of-care (POC) is a growing topic in neglected tropical disease research. There is a need for diagnostic tests which are affordable, sensitive, specific, user-friendly, rapid, equipment-free and delivered to those who need it, and POC is an important tool for disease mapping and guiding mass deworming. The aim of present study was to evaluate the relative diagnostic performance of two urine-circulating cathodic antigen (CCA) cassette assays, one commercially available and the other in experimental production, against results obtained using the standard Kato-Katz faecal smear method (six thick smears from three consecutive days), as a 'gold-standard', for Schistosoma mansoni infection in different transmission settings in Uganda. Our study was conducted among 500 school children randomly selected across 5 schools within Bugiri district, adjacent to Lake Victoria in Uganda. Considering results from the 469 pupils who provided three stool samples for the six Kato-Katz smears, 293 (76%) children had no infection, 109 (23%) were in the light intensity category, while 42 (9%) and 25 (5%) were in the moderate and heavy intensity categories respectively. Following performance analysis of CCA tests in terms of sensitivity, specificity, negative and positive predictive values, overall performance of the commercially available CCA test was more informative than single Kato-Katz faecal smear microscopy, the current operational field standard for disease mapping. The current CCA assay is therefore a satisfactory method for surveillance of S. mansoni in an area where disease endemicity is declining due to control interventions. With the recent resolution on schistosomiasis elimination by the 65th World Health Assembly, the urine POC CCA test is an attractive tool to augment and perhaps replace the Kato-Katz sampling within ongoing control programmes.

  6. Thoracic ultrasound: An adjunctive and valuable imaging tool in emergency, resource-limited settings and for a sustainable monitoring of patients

    Science.gov (United States)

    Trovato, Francesca M; Catalano, Daniela; Trovato, Guglielmo M

    2016-01-01

Imaging workup of patients referred for elective assessment of chest disease requires an articulated approach: imaging is requested to achieve a timely diagnosis. The concurrent or subsequent use of thoracic ultrasound (TUS) with conventional (chest X-rays) and more advanced imaging procedures (computed tomography and magnetic resonance imaging) implies advantages, limitations and practical problems. Indeed, although TUS may provide useful imaging of pleural, lung and heart disease, emergency scenarios are currently the most warranted field of application of TUS: pleural effusion, pneumothorax, lung consolidation. This stems from its role in limited-resource subsets; indeed, ultrasound is an excellent risk-reducing tool, which acts by: (1) increasing diagnostic certainty; (2) shortening time to definitive therapy; and (3) decreasing problems from blind procedures that carry an inherent level of complications. In addition, paediatric and newborn disease is particularly suitable for TUS investigation, aimed at the detection of congenital or acquired chest disease while avoiding, limiting or postponing radiological exposure. TUS improves the effectiveness of elective medical practice in resource-limited settings, in small point-of-care facilities and particularly in poorer countries. The quality and information provided by the procedure are increased by avoiding, whenever possible, artefacts that can prevent or mislead the achievement of the correct diagnosis. Reliable monitoring of patients is possible, taking into consideration that appropriate expertise, knowledge, skills, training, and even adequate equipment are not always and everywhere affordable or accessible. TUS is a complementary imaging procedure for the radiologist and an excellent basic diagnostic tool suitable to be shared with pneumologists, cardiologists and emergency physicians.

  7. Evaluating the Implementation and Feasibility of a Web-Based Tool to Support Timely Identification and Care for the Frail Population in Primary Healthcare Settings

    Directory of Open Access Journals (Sweden)

    Beverley Lawson

    2017-07-01

Full Text Available Background Understanding and addressing the needs of frail persons is an emerging health priority for Nova Scotia and internationally. Primary healthcare (PHC) providers regularly encounter frail persons in their daily clinical work. However, routine identification and measurement of frailty is not standard practice and, in general, there is a lack of awareness about how to identify and respond to frailty. A web-based tool called the Frailty Portal was developed to aid in identifying, screening, and providing care for frail patients in PHC settings. In this study, we will assess the implementation feasibility and impact of the Frailty Portal to: (1) support increased awareness of frailty among providers and patients, (2) identify the degree of frailty within individual patients, and (3) develop and deliver actions to respond to frailty in community PHC practice. Methods This study will be approached using a convergent mixed-method design where quantitative and qualitative data are collected concurrently, in this case over a 9-month period, analyzed separately, and then merged to summarize, interpret and produce a more comprehensive understanding of the initiative's feasibility and scalability. Methods will be informed by the 'Implementing the Frailty Portal in Community Primary Care Practice' logic model and questions will be guided by domains and constructs from an implementation science framework, the Consolidated Framework for Implementation Research (CFIR). Discussion The 'Frailty Portal' aims to improve access to, and coordination of, primary care services for persons experiencing frailty. It also aims to increase primary care providers' ability to care for patients in the context of their frailty. Our goal is to help optimize care in the community by helping community providers gain the knowledge they may lack about frailty both in general and in their practice, support improved identification of frailty with the use of screening

  8. My46: a web-based tool for self-guided management of genomic test results in research and clinical settings

    Science.gov (United States)

    Tabor, Holly K.; Jamal, Seema M.; Yu, Joon-Ho; Crouch, Julia M.; Shankar, Aditi G.; Dent, Karin M.; Anderson, Nick; Miller, Damon A.; Futral, Brett T.; Bamshad, Michael J.

    2016-01-01

    A major challenge to implementing precision medicine is the need for an efficient and cost-effective strategy for returning individual genomic test results that is easily scalable and can be incorporated into multiple models of clinical practice. My46 is a web-based tool for managing the return of genetic results that was designed and developed to support a wide range of approaches to results disclosure, ranging from traditional face-to-face disclosure to self-guided models. My46 has five key functions: set and modify results return preferences, return results, educate, manage return of results, and assess return of results. These key functions are supported by six distinct modules and a suite of features that enhance the user experience, ease site navigation, facilitate knowledge sharing, and enable results return tracking. My46 is a potentially effective solution for returning results and supports current trends toward shared decision-making between patient and provider and patient-driven health management. PMID:27632689

  9. My46: a Web-based tool for self-guided management of genomic test results in research and clinical settings.

    Science.gov (United States)

    Tabor, Holly K; Jamal, Seema M; Yu, Joon-Ho; Crouch, Julia M; Shankar, Aditi G; Dent, Karin M; Anderson, Nick; Miller, Damon A; Futral, Brett T; Bamshad, Michael J

    2017-04-01

A major challenge to implementing precision medicine is the need for an efficient and cost-effective strategy for returning individual genomic test results that is easily scalable and can be incorporated into multiple models of clinical practice. My46 is a Web-based tool for managing the return of genetic results that was designed and developed to support a wide range of approaches to disclosing results, ranging from traditional face-to-face disclosure to self-guided models. My46 has five key functions: set and modify results-return preferences, return results, educate, manage the return of results, and assess the return of results. These key functions are supported by six distinct modules and a suite of features that enhance the user experience, ease site navigation, facilitate knowledge sharing, and enable results-return tracking. My46 is a potentially effective solution for returning results and supports current trends toward shared decision making between patients and providers and patient-driven health management. Genet Med 19(4), 467-475.

  10. Sharing My Music with You: The Musical Presentation as a Tool for Exploring, Examining and Enhancing Self-Awareness in a Group Setting

    Science.gov (United States)

    Bensimon, Moshe; Amir, Dorit

    2010-01-01

Musical presentation (MP) is a diagnostic and therapeutic music therapy tool which focuses on the participant's emotional exploration and the development of awareness and insight. Using this tool, people present themselves through music of their choice and subsequently receive feedback from their peers. This study investigates MP as a tool for enhancing…

  11. A framework and a set of tools called Nutting models to estimate retention capacities and loads of nitrogen and phosphorus in rivers at catchment and national level (France)

    Science.gov (United States)

    Legeay, Pierre-Louis; Moatar, Florentina; Dupas, Rémi; Gascuel-Odoux, Chantal

    2016-04-01

The Nutting-N and Nutting-P models (Dupas et al., 2013, 2015) have been developed to estimate nitrogen and phosphorus nonpoint-source emissions to surface water, using readily available data. These models were inspired by the US model SPARROW (Smith et al., 1997) and the European model GREEN (Grizzetti et al., 2008), i.e. statistical approaches consisting of linking nitrogen and phosphorus surplus to catchment land and river characteristics to estimate the catchments' relative retention capacities. The nutrient load (L) at the outlet of each catchment is expressed as: L = R*(B*DS + PS) [1] where DS is diffuse sources (i.e. surplus in kg.ha-1.yr-1 for N, P storage in soil for P), PS is point sources from domestic and industrial origin (kg.ha-1.yr-1), and R and B are the river-system and basin reduction factors, respectively; they combine observed variables and calibrated parameters. The model was calibrated on independent catchments for the 2005-2009 and 2008-2012 periods. Variables were selected according to the Bayesian Information Criterion (BIC) in order to optimize the predictive performance of the models. From these basic models, different improvements have been realized to build a framework and a set of tools: 1) a routing module has been added in order to improve estimations on 4th- or 5th-order streams, i.e. upscaling the basic Nutting approach; 2) a territorial module, in order to test the models at local scale (from 500 to 5000 km²); 3) a seasonal estimation has been investigated. The basic approach as well as the territorial application will be illustrated. These tools allow water managers to identify areas at risk where high nutrient loads are estimated, as well as areas where retention is potentially high and can buffer high nutrient sources. References Dupas R., Curie F., Gascuel-Odoux C., Moatar F., Delmas M., Parnaudeau, V., Durand P., 2013. Assessing N emissions in surface water at the national level: Comparison of country-wide vs. regionalized models.
Science of the Total Environment
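Equation [1] above is straightforward to evaluate once the reduction factors are calibrated; a minimal sketch (the numeric values below are illustrative, not calibrated parameters from the study):

```python
def nutrient_load(ds, ps, r, b):
    """Nutrient load L at the catchment outlet, per eq. [1]: L = R*(B*DS + PS).

    ds -- diffuse sources (e.g. N surplus, kg.ha-1.yr-1)
    ps -- point sources from domestic/industrial origin (kg.ha-1.yr-1)
    r, b -- river-system and basin reduction factors (calibrated, 0-1)
    """
    return r * (b * ds + ps)

# Illustrative numbers only: 20 kg/ha/yr diffuse surplus, 2 kg/ha/yr point
# sources, basin factor 0.4, river-system factor 0.7.
load = nutrient_load(ds=20.0, ps=2.0, r=0.7, b=0.4)
```

Note the structure of the equation: basin retention (B) attenuates only the diffuse term, while river-system retention (R) acts on diffuse and point sources alike.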

  12. A tool to measure the attributes of receiving IV therapy in a home versus hospital setting: the Multiple Sclerosis Relapse Management Scale (MSRMS

    Directory of Open Access Journals (Sweden)

    Chataway Jeremy

    2011-09-01

Full Text Available Abstract Background Intravenous steroids are routinely used to treat disabling relapses in multiple sclerosis (MS). Theoretically, the infusion could take place at home, rather than in hospital. Findings from other patient populations suggest that patients may find the experience of home relapse management more desirable. However, formal comparison of these two settings, from the patients' point of view, was prevented by the lack of a clinical scale. We report the development of a rating scale to measure patients' experiences of relapse management that allowed this question to be answered confidently. Methods Scale development had three stages. First, in-depth interviews of 21 MS patients generated a conceptual model and pool of potential scale items. Second, these items were administered to 160 people with relapsing-remitting MS. Standard psychometric techniques were used to develop a scale. Third, the psychometric properties of the scale were evaluated in a randomised controlled trial of 138 patients whose relapses were managed either at home or in hospital. Results A preliminary conceptual model with eight dimensions, and a pool of 154 items, was generated. From this we developed the MS Relapse Management Scale (MSRMS), a 42-item scale with four subscales: access to care (6 items), coordination of care (11 items), information (7 items), interpersonal care (18 items). The MSRMS subscales satisfied most psychometric criteria but had notable floor effects. Conclusions The MSRMS is a reliable and valid measure of patients' experiences of MS relapse management. The high floor effects suggest most respondents had positive care experiences. Results demonstrate that patients' experiences of relapse management can be measured, and that the MSRMS is a powerful tool for determining which services to develop, support and ultimately commission.

  13. metAlignID: a high-throughput software tool set for automated detection of trace level contaminants in comprehensive LECO two-dimensional gas chromatography time-of-flight mass spectrometry data.

    Science.gov (United States)

    Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J

    2012-11-09

A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from the metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed for conversion, processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than the standard processing and also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly
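The batch design the abstract describes (independent raw files, one worker per core) maps naturally onto a parallel map. A generic sketch, not metAlignID's actual code, with the convert/preprocess/search stages stubbed out as a hypothetical `process_file` helper:

```python
from concurrent.futures import ThreadPoolExecutor

def process_file(path):
    """Placeholder for one file's pipeline: convert the raw file to netCDF,
    preprocess with metAlign-derived algorithms, then search the compound
    library and report. Here it only tags the file as processed."""
    return f"{path}: processed"

def batch_process(paths, workers=4):
    # Each data set is independent, so the batch can be fanned out across
    # workers, mirroring the one-thread-per-core design described above.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_file, paths))
```

`pool.map` preserves input order, so results line up with the submitted file list even when workers finish out of order.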

  14. The STarT Back Screening Tool and Individual Psychological Measures: Evaluation of Prognostic Capabilities for Low Back Pain Clinical Outcomes in Outpatient Physical Therapy Settings

    Science.gov (United States)

    Bishop, Mark D.; Fritz, Julie M.; Robinson, Michael E.; Asal, Nabih R.; Nisenzon, Anne N.

    2013-01-01

Background Psychologically informed practice emphasizes routine identification of modifiable psychological risk factors. Objective The purpose of this study was to test the predictive validity of the STarT Back Screening Tool (SBT) in comparison with single-construct psychological measures for 6-month clinical outcomes. Design This was an observational, prospective cohort study. Methods Patients (n=146) receiving physical therapy for low back pain were administered the SBT and a battery of psychological measures (Fear-Avoidance Beliefs Questionnaire physical activity scale and work scale [FABQ-PA and FABQ-W, respectively], Pain Catastrophizing Scale [PCS], 11-item version of the Tampa Scale of Kinesiophobia [TSK-11], and 9-item Patient Health Questionnaire [PHQ-9]) at initial evaluation and 4 weeks later. Treatment was at the physical therapist's discretion. Clinical outcomes consisted of pain intensity and self-reported disability. Prediction of 6-month clinical outcomes was assessed for intake SBT and psychological measure scores using multiple regression models while controlling for other prognostic variables. In addition, the predictive capabilities of intake to 4-week changes in SBT and psychological measure scores for 6-month clinical outcomes were assessed. Results Intake pain intensity scores (β=.39 to .45) and disability scores (β=.47 to .60) were the strongest predictors in all final regression models, explaining 22% and 24% and 43% and 48% of the variance for the respective clinical outcomes at 6 months. Neither SBT nor psychological measure scores improved prediction of 6-month pain intensity. The SBT overall scores (β=.22) and SBT psychosocial scores (β=.25) added to the prediction of disability at 6 months. Four-week changes in TSK-11 scores (β=−.18) were predictive of pain intensity at 6 months. Four-week changes in FABQ-PA scores (β=−.21), TSK-11 scores (β=−.20) and SBT overall scores (β=−.18) were predictive of

  15. A comparison of tools used for tuberculosis diagnosis in resource-limited settings: a case study at Mubende referral hospital, Uganda.

    Directory of Open Access Journals (Sweden)

    Adrian Muwonge

Full Text Available BACKGROUND: This study compared TB diagnostic tools and estimated levels of misdiagnosis in a resource-limited setting. Furthermore, we estimated the diagnostic utility of three TB-associated predictors in an algorithm with and without direct Ziehl-Neelsen microscopy (DZM). MATERIALS AND METHODS: Data were obtained from a cross-sectional study conducted in 2011 at Mubende regional referral hospital in Uganda. An individual was included if they presented with a two-week persistent cough and/or lymphadenitis/abscess. 344 samples were analyzed on DZM in Mubende and compared to duplicates analyzed on direct fluorescent microscopy (DFM) and growth on solid and liquid media at Makerere University. Clinical variables from a questionnaire and DZM were used to predict TB status in multivariable logistic and Cox proportional hazard models, while optimization and visualization were done with receiver operating characteristic curves and algorithm charts in Stata, R and Lucid-Charts respectively. RESULTS: DZM had a sensitivity and specificity of 36.4% (95% CI = 24.9-49.1) and 97.1% (95% CI = 94.4-98.7), compared to DFM which had a sensitivity and specificity of 80.3% (95% CI = 68.7-89.1) and 97.1% (95% CI = 94.4-98.7) respectively. DZM false negative results were associated with patients' HIV status, tobacco smoking and extra-pulmonary tuberculosis. One of the false negative cases was infected with multi-drug-resistant TB (MDR). The three-predictor screening algorithm with and without DZM classified 50% and 33% of the true cases respectively, while the adjusted algorithm with DZM classified 78% of the true cases. CONCLUSION: The study supports the concern that using DZM alone risks missing the majority of TB cases; in this case we found nearly 60%, of whom one was an MDR case. Although adopting DFM would reduce this proportion to 19%, the use of a three-predictor screening algorithm together with DZM was almost as good as DFM alone. Its utility is, however, subject to HIV
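The sensitivity, specificity and predictive values reported above all derive from standard 2x2-table arithmetic against a gold standard. A minimal helper (the counts in the usage line are illustrative, not the study's data):

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Performance of a diagnostic test from a 2x2 table versus a gold
    standard: tp/fp/fn/tn are true/false positive/negative counts."""
    return {
        "sensitivity": tp / (tp + fn),  # positives detected among the diseased
        "specificity": tn / (tn + fp),  # negatives detected among the healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts only.
perf = diagnostic_performance(tp=80, fp=3, fn=20, tn=97)
```

Confidence intervals like those quoted in the abstract would additionally require an interval method (e.g. Wilson score) on each proportion.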

16. A Set of Web-based Assistant Learning Tools Based on Concept Map Theory

    Institute of Scientific and Technical Information of China (English)

    申瑞民; 汤轶阳

    2001-01-01

Based on Concept Map theory, the paper establishes a concept-map-based assistant learning analysis model for the field of education assessment, composed of construction, analysis, storage and feedback modules. According to this model, we developed a set of Web-based assistant learning tools.

17. Study on the Optimum Choice of Cutting Tools in NC Machining Based on the Polychromatic Sets Theory

    Institute of Scientific and Technical Information of China (English)

    刘恩福; 刘晓阳; 方忆湘

    2012-01-01

A relational model of cutting tool choice was established and expressed as a contour Boolean matrix. Through the coloring logical operations of polychromatic sets, the candidate set of cutting tools satisfying the machining conditions is found, and a fuzzy optimization is then performed on this set for every part feature. By also considering the effects of tool-change time, processing sequence and other relevant machining information on tool selection, the choice of cutting tools is optimized as a whole during machining.

  18. How many research nurses for how many clinical trials in an oncology setting? Definition of the Nursing Time Required by Clinical Trial-Assessment Tool (NTRCT-AT).

    Science.gov (United States)

    Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa

    2017-02-01

    Few resources are available to quantify clinical trial-associated workload, needed to guide staffing and budgetary planning. The aim of the study is to describe a tool to measure clinical trials nurses' workload expressed in time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activities and mathematic functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally.
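The tool's core calculation, combining average per-activity times with per-trial activity counts, can be sketched as follows (the activity names and durations are hypothetical placeholders, not the study's 30 timed core activities):

```python
# Average observed minutes per core activity (hypothetical values).
AVG_MINUTES = {
    "informed_consent": 35,
    "blood_sampling": 10,
    "data_entry": 20,
}

def trial_nursing_time(counts, avg=AVG_MINUTES):
    """Estimated total nursing time (minutes) for one clinical trial:
    the sum over activities of average duration times occurrences."""
    return sum(avg[activity] * n for activity, n in counts.items())

# E.g. a trial with 2 consent sessions and 5 blood draws.
total = trial_nursing_time({"informed_consent": 2, "blood_sampling": 5})
```

This mirrors the spreadsheet logic described in the abstract: timed averages are fixed once, and each trial's protocol supplies the activity counts.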

  19. Spatial Analysis in Educational Administration: Exploring the Role of G.I.S. (Geographical Information Systems) as an Evaluative Tool in the Public School Board Setting.

    Science.gov (United States)

    Brown, Robert S.; Baird, William; Rosolen, Lisa

    In January 1998, seven school boards amalgamated to form the Toronto District School Board, a board responsible for 600 schools. To deal with the complexities of the new entity, researchers have been using geographical information systems (GIS). GIS are computer-based tools for mapping. They store information as a collection of thematic layers or…

  20. Development of a screening tool predicting the transition from acute to chronic low back pain for patients in a GP setting: Protocol of a multinational prospective cohort study

    Directory of Open Access Journals (Sweden)

    Bajracharya Suraj

    2008-12-01

Full Text Available Abstract Background Low back pain (LBP) is by far the most prevalent and costly musculoskeletal problem in our society today. Following the recommendations of the Multinational Musculoskeletal Inception Cohort Study (MMICS) Statement, our study aims to define outcome assessment tools for patients with acute LBP and the time point at which chronic LBP becomes manifest, and to identify patient characteristics which increase the risk of chronicity. Methods Patients with acute LBP will be recruited from clinics of general practitioners (GPs) in New Zealand (NZ) and Switzerland (CH). They will be assessed by postal survey at baseline and at 3, 6, 12 weeks and 6 months follow-up. The primary outcome will be disability as measured by the Oswestry Disability Index (ODI); key secondary endpoints will be general health as measured by the acute SF-12 and pain as measured on the Visual Analogue Scale (VAS). A subgroup analysis of different assessment instruments and baseline characteristics will be performed using multiple linear regression models. This study aims to examine: 1. which biomedical, psychological, social, and occupational outcome assessment tools are identifiers for the transition from acute to chronic LBP and at which time point this transition becomes manifest; 2. which psychosocial and occupational baseline characteristics, like work status and period of work absenteeism, influence the course from acute to chronic LBP; 3. differences in outcome assessment tools and baseline characteristics of patients in NZ compared with CH. Discussion This study will develop a screening tool for patients with acute LBP to be used in GP clinics to assess the risk of developing chronic LBP. In addition, biomedical, psychological, social, and occupational patient characteristics which influence the course from acute to chronic LBP will be identified. Furthermore, an appropriate time point for follow-ups will be given to detect this transition. The generalizability of our

  1. Underwater Photogrammetry, Coded Target and Plenoptic Technology: a Set of Tools for Monitoring Red Coral in Mediterranean Sea in the Framework of the "perfect" Project

    Science.gov (United States)

    Drap, P.; Royer, J. P.; Nawaf, M. M.; Saccone, M.; Merad, D.; López-Sanz, À.; Ledoux, J. B.; Garrabou, J.

    2017-02-01

PErfECT "Photogrammetry, gEnetic, Ecology for red coral ConservaTion" is a project led by the Laboratoire des Sciences de l'Information et des Systèmes (LSIS - UMR 7296 CNRS) of Aix-Marseille University (France) in collaboration with the Spanish National Agency for Scientific Research (CSIC, Spain). The main objective of the project is to develop innovative tools for the conservation of the Mediterranean red coral, Corallium rubrum. PErfECT was funded by the Total Foundation. The adaptation of digital photogrammetric techniques for underwater use has increased rapidly in recent years. In fact, these techniques are particularly well suited to underwater environments. PErfECT developed different photogrammetry tools to enhance red coral population surveys, based on: (i) automatic orientation on coded quadrats, (ii) the use of NPR (Non-Photorealistic Rendering) techniques, (iii) the calculation of distances between colonies within local populations and finally (iv) the use of plenoptic approaches in underwater conditions.

  2. The Electronic Patient Reported Outcome Tool: Testing Usability and Feasibility of a Mobile App and Portal to Support Care for Patients With Complex Chronic Disease and Disability in Primary Care Settings

    Science.gov (United States)

    Gill, Ashlinder; Khan, Anum Irfan; Hans, Parminder Kaur; Kuluski, Kerry; Cott, Cheryl

    2016-01-01

Background People experiencing complex chronic disease and disability (CCDD) face some of the greatest challenges of any patient population. Primary care providers find it difficult to manage multiple discordant conditions and symptoms and often complex social challenges experienced by these patients. The electronic Patient Reported Outcome (ePRO) tool is designed to overcome some of these challenges by supporting goal-oriented primary care delivery. Using the tool, patients and providers collaboratively develop health care goals on a portal linked to a mobile device to help patients and providers track progress between visits. Objectives This study tested the usability and feasibility of adopting the ePRO tool into a single interdisciplinary primary health care practice in Toronto, Canada. The Fit between Individuals, Task, and Technology (FITT) framework was used to guide our assessment and explore whether the ePRO tool is: (1) feasible for adoption in interdisciplinary primary health care practices and (2) usable from both the patient and provider perspectives. This usability pilot is part of a broader user-centered design development strategy. Methods A 4-week pilot study was conducted in which patients and providers used the ePRO tool to develop health-related goals, which patients then monitored using a mobile device. Patients and providers collaboratively set goals using the system during an initial visit and had at least 1 follow-up visit at the end of the pilot to discuss progress. Focus groups and interviews were conducted with patients and providers to capture usability and feasibility measures. Data from the ePRO system were extracted to provide information regarding tool usage. Results Six providers and 11 patients participated in the study; 3 patients dropped out mainly owing to health issues. The remaining 8 patients completed 210 monitoring protocols, equal to over 1300 questions, with patients often answering questions daily.
Providers and patients

  3. The use and effectiveness of electronic clinical decision support tools in the ambulatory/primary care setting: a systematic review of the literature

    Directory of Open Access Journals (Sweden)

    Cathy Bryan

    2008-07-01

    Conclusion Although there is validation that CDSS has the potential to produce statistically significant improvement in outcomes, there is much variability among the types and methods of CDSS implementation and resulting effectiveness. As CDSS will likely continue to be at the forefront of the march toward effective standards-based care, more work needs to be done to determine effective implementation strategies for the use of CDSS across multiple settings and patient populations.

  4. How to sell a condom? The impact of demand creation tools on male and female condom sales in resource limited settings.

    Science.gov (United States)

    Terris-Prestholt, Fern; Windmeijer, Frank

    2016-07-01

    Despite condoms being cheap and effective in preventing HIV, there remains an 8 billion shortfall in condom use in risky sex acts. Social marketing organisations apply private sector marketing approaches to sell public health products. This paper investigates the impact of marketing tools, including promotion and pricing, on demand for male and female condoms in 52 countries between 1997 and 2009. A static model differentiates drivers of demand between products, while a dynamic panel data estimator estimates their short- and long-run impacts. Products are not equally affected: female condoms are not affected by advertising, but highly affected by interpersonal communication and HIV prevalence. Price and promotion have significant short- and long-run effects, with female condoms far more sensitive to price than male condoms. The design of optimal distribution strategies for new and existing HIV prevention technologies must consider both product and target population characteristics.

  5. The Wavelet ToolKat: A set of tools for the analysis of series through wavelet transforms. Application to the channel curvature and the slope control of three free meandering rivers in the Amazon basin.

    Science.gov (United States)

    Vaudor, Lise; Piegay, Herve; Wawrzyniak, Vincent; Spitoni, Marie

    2016-04-01

    The form and functioning of a geomorphic system result from processes operating at various spatial and temporal scales. Longitudinal channel characteristics thus exhibit complex patterns which vary according to the scale of study, might be periodic or segmented, and are generally blurred by noise. Describing the intricate, multiscale structure of such signals, and identifying at which scales the patterns are dominant and over which sub-reach, could help determine at which scales they should be investigated, and provide insights into the main controlling factors. Wavelet transforms aim at describing data at multiple scales (either in time or space), and are now exploited in geophysics for the analysis of nonstationary series of data. They provide a consistent, non-arbitrary, and multiscale description of a signal's variations and help explore potential causalities. Nevertheless, their use in fluvial geomorphology, notably to study longitudinal patterns, is hindered by a lack of user-friendly tools to help understand, implement, and interpret them. We have developed a free application, The Wavelet ToolKat, designed to facilitate the use of wavelet transforms on temporal or spatial series. We illustrate its usefulness describing longitudinal channel curvature and slope of three freely meandering rivers in the Amazon basin (the Purus, Juruá and Madre de Dios rivers), using topographic data generated from NASA's Shuttle Radar Topography Mission (SRTM) in 2000. Three types of wavelet transforms are used, with different purposes. Continuous Wavelet Transforms are used to identify in a non-arbitrary way the dominant scales and locations at which channel curvature and slope vary. Cross-wavelet transforms, and wavelet coherence and phase are used to identify scales and locations exhibiting significant channel curvature and slope co-variations. Maximal Overlap Discrete Wavelet Transforms decompose data into their variations at a series of scales and are used to provide
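    The continuous wavelet transform at the core of such a toolkit can be sketched compactly. The following is a minimal, illustrative real-Morlet CWT in Python (numpy only); it is not the Wavelet ToolKat's actual implementation, and the synthetic "curvature" series and function names are assumptions for demonstration.

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    """Real Morlet wavelet at lag t for a given scale, 1/sqrt(scale)-normalized."""
    x = t / scale
    return np.cos(w0 * x) * np.exp(-0.5 * x ** 2) / np.sqrt(scale)

def cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform: one row of coefficients per scale."""
    n = len(signal)
    lags = np.arange(n) - n // 2
    coefs = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Convolution with the (symmetric) wavelet gives the coefficients at scale s.
        coefs[i] = np.convolve(signal, morlet(lags, s, w0)[::-1], mode="same")
    return coefs

# Synthetic "channel curvature" series: a 20-sample oscillation along distance.
x = np.arange(512)
curvature = np.sin(2 * np.pi * x / 20)
scales = np.arange(2, 65)
power = cwt(curvature, scales) ** 2
# For Morlet w0=6 the Fourier wavelength is ~1.03 * scale, so power peaks near scale 20.
dominant_scale = scales[power.mean(axis=1).argmax()]
```

    Averaging the squared coefficients over distance (as above) identifies the dominant scale; keeping them resolved in space is what lets the method localize where along the channel a given periodicity holds.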

  6. Rough Neutrosophic Sets

    OpenAIRE

    Said Broumi; Florentin Smarandache; Mamoni Dhar

    2013-01-01

    Both neutrosophic set theory and rough set theory are emerging as powerful tools for managing uncertain, indeterminate, incomplete and imprecise information. In this paper we develop a hybrid structure called rough neutrosophic sets and study its properties.
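    As a concrete reference point, the classical (Pawlak) rough set construction that the paper hybridizes with neutrosophic sets reduces to lower and upper approximations over a partition into equivalence classes. A minimal sketch (function name and example data are illustrative only, not from the paper):

```python
def rough_approximations(partition, X):
    """Pawlak lower/upper approximations of X with respect to a partition
    of the universe into equivalence classes (blocks of indiscernible objects)."""
    X = set(X)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= X:       # block entirely inside X: certainly in X
            lower |= block
        if block & X:        # block intersecting X: possibly in X
            upper |= block
    return lower, upper

# Objects 1..6 partitioned by an indiscernibility relation; target set X = {1, 2, 3}.
lower, upper = rough_approximations([{1, 2}, {3, 4}, {5, 6}], {1, 2, 3})
boundary = upper - lower     # the region of indeterminacy
```

    The boundary region (here {3, 4}) is exactly where the indeterminacy component of a neutrosophic membership becomes useful.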

  7. Design of Manual Use Tool Setting Probe in Sinumerik 810D/840D

    Institute of Scientific and Technical Information of China (English)

    阮启伟

    2011-01-01

    This paper introduces the method of manual tool setting using a tool setting probe in the JOG mode of Sinumerik 810D/840D. The NC VAR software is used to read and write the tool-compensation system variables in the S7-300 PLC, and the PLC controls all tool-setting actions and the writing of compensation values. The design process is given in detail and parts of the PLC programs are provided.

  8. Nursing Management Minimum Data Set: Cost-Effective Tool To Demonstrate the Value of Nurse Staffing in the Big Data Science Era.

    Science.gov (United States)

    Pruinelli, Lisiane; Delaney, Connie W; Garciannie, Amy; Caspers, Barbara; Westra, Bonnie L

    2016-01-01

    There is a growing body of evidence of the relationship of nurse staffing to patient, nurse, and financial outcomes. With the advent of big data science and developing big data analytics in nursing, data science with the reuse of big data is emerging as a timely and cost-effective approach to demonstrate nursing value. The Nursing Management Minimum Data Set (NMMDS) provides standard administrative data elements, definitions, and codes to measure the context where care is delivered and, consequently, the value of nursing. The integration of the NMMDS elements in the current health system provides evidence for nursing leaders to measure and manage decisions, leading to better patient, staffing, and financial outcomes. It also enables the reuse of data for clinical scholarship and research.

  9. Exploiting biospectroscopy as a novel screening tool for cervical cancer: towards a framework to validate its accuracy in a routine clinical setting.

    Science.gov (United States)

    Purandare, Nikhil C; Trevisan, Júlio; Patel, Imran I; Gajjar, Ketan; Mitchell, Alana L; Theophilou, Georgios; Valasoulis, George; Martin, Mary; von Bünau, Günther; Kyrgiou, Maria; Paraskevaidis, Evangelos; Martin-Hirsch, Pierre L; Prendiville, Walter J; Martin, Francis L

    2013-11-01

    Biospectroscopy is an emerging field that harnesses the platform of physical sciences with computational analysis in order to shed novel insight on biological questions. An area where this approach shows potential is in screening or diagnostic clinical settings, where there is an urgent need for new approaches to interrogate large numbers of samples in an objective fashion with acceptable levels of sensitivity and specificity. This review outlines the benefits of biospectroscopy in screening for precancerous lesions of the cervix, owing to its ability to separate different grades of dysplasia. It evaluates the feasibility of introducing this technique into cervical screening programs on the basis of its ability to identify biomarkers of progression within derived spectra ('biochemical-cell fingerprints').

  10. Exploiting biospectroscopy as a novel screening tool for cervical cancer: towards a framework to validate its accuracy in a routine clinical setting.

    LENUS (Irish Health Repository)

    Purandare, Nikhil C

    2013-11-01

    Biospectroscopy is an emerging field that harnesses the platform of physical sciences with computational analysis in order to shed novel insight on biological questions. An area where this approach shows potential is in screening or diagnostic clinical settings, where there is an urgent need for new approaches to interrogate large numbers of samples in an objective fashion with acceptable levels of sensitivity and specificity. This review outlines the benefits of biospectroscopy in screening for precancerous lesions of the cervix, owing to its ability to separate different grades of dysplasia. It evaluates the feasibility of introducing this technique into cervical screening programs on the basis of its ability to identify biomarkers of progression within derived spectra ('biochemical-cell fingerprints').

  11. Rationale and study protocol for a multi-component Health Information Technology (HIT) screening tool for depression and post-traumatic stress disorder in the primary care setting.

    Science.gov (United States)

    Biegler, Kelly; Mollica, Richard; Sim, Susan Elliott; Nicholas, Elisa; Chandler, Maria; Ngo-Metzger, Quyen; Paigne, Kittya; Paigne, Sompia; Nguyen, Danh V; Sorkin, Dara H

    2016-09-01

    The prevalence rate of depression in primary care is high. Primary care providers serve as the initial point of contact for the majority of patients with depression, yet, approximately 50% of cases remain unrecognized. The under-diagnosis of depression may be further exacerbated in limited English-language proficient (LEP) populations. Language barriers may result in less discussion of patients' mental health needs and fewer referrals to mental health services, particularly given competing priorities of other medical conditions and providers' time pressures. Recent advances in Health Information Technology (HIT) may facilitate novel ways to screen for depression and other mental health disorders in LEP populations. The purpose of this paper is to describe the rationale and protocol of a clustered randomized controlled trial that will test the effectiveness of an HIT intervention that provides a multi-component approach to delivering culturally competent, mental health care in the primary care setting. The HIT intervention has four components: 1) web-based provider training, 2) multimedia electronic screening of depression and PTSD in the patients' primary language, 3) Computer generated risk assessment scores delivered directly to the provider, and 4) clinical decision support. The outcomes of the study include assessing the potential of the HIT intervention to improve screening rates, clinical detection, provider initiation of treatment, and patient outcomes for depression and post-traumatic stress disorder (PTSD) among LEP Cambodian refugees who experienced war atrocities and trauma during the Khmer Rouge. This technology has the potential to be adapted to any LEP population in order to facilitate mental health screening and treatment in the primary care setting.

  12. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    Energy Technology Data Exchange (ETDEWEB)

    Cormier, Dallas [San Diego Gas & Electric, CA (United States); Edra, Sherwin [San Diego Gas & Electric, CA (United States); Espinoza, Michael [San Diego Gas & Electric, CA (United States); Daye, Tony [Green Power Labs, San Diego, CA (United States); Kostylev, Vladimir [Green Power Labs, San Diego, CA (United States); Pavlovski, Alexandre [Green Power Labs, San Diego, CA (United States); Jelen, Deborah [Electricore, Inc., Valencia, CA (United States)

    2014-12-29

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real time simulation of distributed power generation within utility grids with the emphasis on potential applications in day ahead (market) and real time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high penetration solar PV on utility operations is not only limited to control centers, but across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and have had an immediate direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  13. Development process of an assessment tool for disruptive behavior problems in cross-cultural settings: the Disruptive Behavior International Scale - Nepal version (DBIS-N).

    Science.gov (United States)

    Burkey, Matthew D; Ghimire, Lajina; Adhikari, Ramesh P; Kohrt, Brandon A; Jordans, Mark J D; Haroz, Emily; Wissow, Lawrence

    2016-01-01

    Systematic processes are needed to develop valid measurement instruments for disruptive behavior disorders (DBDs) in cross-cultural settings. We employed a four-step process in Nepal to identify and select items for a culturally valid assessment instrument: 1) We extracted items from validated scales and local free-list interviews. 2) Parents, teachers, and peers (n=30) rated the perceived relevance and importance of behavior problems. 3) Highly rated items were piloted with children (n=60) in Nepal. 4) We evaluated internal consistency of the final scale. We identified 49 symptoms from 11 scales, and 39 behavior problems from free-list interviews (n=72). After dropping items for low ratings of relevance and severity and for poor item-test correlation, low frequency, and/or poor acceptability in pilot testing, 16 items remained for the Disruptive Behavior International Scale-Nepali version (DBIS-N). The final scale had good internal consistency (α=0.86). A 4-step systematic approach to scale development including local participation yielded an internally consistent scale that included culturally relevant behavior problems.
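    The reported internal consistency (α = 0.86) is Cronbach's alpha, which can be computed directly from an item-response matrix. A minimal sketch in Python; the rating data below are hypothetical, not DBIS-N responses:

```python
def cronbach_alpha(rows):
    """Cronbach's alpha for rows = respondents, columns = scale items:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(rows[0])

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([r[j] for r in rows]) for j in range(k)]
    total_var = var([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-respondent, 3-item behavior ratings (0-3 severity scores).
alpha = cronbach_alpha([[0, 1, 0], [1, 1, 2], [2, 2, 2], [3, 2, 3]])
```

    When items co-vary strongly (respondents who score high on one item score high on the others), the total-score variance dominates the summed item variances and alpha approaches 1.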

  14. Quick Search Tool for the Standard Value of China's Terrestrial Climate Data Set Information

    Institute of Scientific and Technical Information of China (English)

    纪秀艳; 张崇辉; 刘冠男

    2013-01-01

    The data files of the Standard Value of China's Surface Climate Data Set (1981-2010) are numerous, their formats are complex, and manual retrieval is difficult. Based on the data set documentation, an automatic retrieval tool was written that enables quick retrieval and export of the compiled data and improves retrieval efficiency.

  15. Fish welfare assurance system: initial steps to set up an effective tool to safeguard and monitor farmed fish welfare at a company level.

    Science.gov (United States)

    van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M

    2012-02-01

    The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding of fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishment of critical steps in a process that requires careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for management of food safety, which was adapted for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for on growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio) and European eel (Anguilla anguilla). Four major hazards were selected, which were fish species dependent. Critical Control Points (CCPs) that need to be controlled to minimize or avoid the four hazards are presented. For FWAS, monitoring of CCPs at a farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial and environmental factors are controlled. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate or reduce a hazard to an acceptable level. For managerial factors target levels are more appropriate than critical limits. Regarding the international trade of farmed fish products, we propose that FWAS needs to be standardized in aquaculture chains. For this standardization a consensus on the concept of fish welfare, methods to assess welfare objectively and knowledge on the needs of farmed fish are required.
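    Operationally, monitoring a CCP amounts to comparing each Operational Welfare Indicator against its critical limits or target values. A minimal sketch of that check; the indicator names and limit values below are hypothetical, not taken from the FWAS specification:

```python
def check_ccps(observed, critical_limits):
    """Return the indicators whose observed value falls outside [lo, hi]."""
    breaches = {}
    for owi, value in observed.items():
        lo, hi = critical_limits[owi]
        if not (lo <= value <= hi):
            breaches[owi] = value
    return breaches

# Hypothetical OWIs and critical limits for a sea bass on-growing unit.
limits = {"dissolved_oxygen_mg_l": (5.0, 12.0),
          "water_temp_c": (18.0, 26.0),
          "stocking_density_kg_m3": (0.0, 25.0)}
readings = {"dissolved_oxygen_mg_l": 4.2,
            "water_temp_c": 22.5,
            "stocking_density_kg_m3": 23.0}
breaches = check_ccps(readings, limits)   # dissolved oxygen is below its limit
```

    A breach at a CCP would trigger the corrective action defined for that hazard, mirroring how HACCP handles food-safety deviations.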

  16. HEDIS Limited Data Set

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Healthcare Effectiveness Data and Information Set (HEDIS) is a tool used by more than 90 percent of America's health plans to measure performance on important...

  17. Assessment of variation in the alberta context tool: the contribution of unit level contextual factors and specialty in Canadian pediatric acute care settings

    Directory of Open Access Journals (Sweden)

    Cummings Greta G

    2011-10-01

    Full Text Available Abstract Background There are few validated measures of organizational context, and none that we located are parsimonious and address modifiable characteristics of context. The Alberta Context Tool (ACT) was developed to meet this need. The instrument assesses 8 dimensions of context, which comprise 10 concepts. The purpose of this paper is to report evidence to further the validity argument for the ACT. The specific objectives of this paper are to: (1) examine the extent to which the 10 ACT concepts discriminate between patient care units and (2) identify variables that significantly contribute to between-unit variation for each of the 10 concepts. Methods 859 professional nurses (844 valid responses) working in medical, surgical and critical care units of 8 Canadian pediatric hospitals completed the ACT. A random intercept, fixed effects hierarchical linear modeling (HLM) strategy was used to quantify and explain variance in the 10 ACT concepts to establish the ACT's ability to discriminate between units. We ran 40 models (a series of 4 models for each of the 10 concepts) in which we systematically assessed the unique contribution (i.e., error variance reduction) of different variables to between-unit variation. First, we constructed a null model in which we quantified the variance overall in each of the concepts. Then we controlled for the contribution of individual level variables (Model 1). In Model 2, we assessed the contribution of practice specialty (medical, surgical, critical care) to variation, since it was central to construction of the sampling frame for the study. Finally, we assessed the contribution of additional unit level variables (Model 3). Results The null model (unadjusted baseline HLM model) established that there was significant variation between units in each of the 10 ACT concepts (i.e., discrimination between units). When we controlled for individual characteristics, significant variation in the 10 concepts remained.
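    The null-model step described above asks how much of the total variance in an ACT concept lies between units rather than within them. For balanced groups this intraclass correlation can be estimated from a one-way ANOVA decomposition, which the following simplified sketch illustrates with simulated (not ACT) scores:

```python
def icc_oneway(groups):
    """One-way ANOVA intraclass correlation for balanced groups:
    the share of total variance lying between groups (units)."""
    k = len(groups)              # number of units
    n = len(groups[0])           # respondents per unit
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    # Between-unit and within-unit mean squares.
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# Simulated concept scores for 3 units, 4 nurses each: the units clearly differ,
# so nearly all variance is between units and the ICC is close to 1.
units = [[3.1, 3.0, 2.9, 3.0], [4.0, 4.1, 3.9, 4.0], [2.0, 2.1, 1.9, 2.0]]
icc = icc_oneway(units)
```

    A full HLM analysis like the one in the paper additionally adjusts this partition for individual- and unit-level covariates (Models 1-3), but the null-model ICC is the baseline quantity being explained.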
Assessment of the

  18. Development of Complete Set of Large Diameter Air-lift Reverse Circulation Drilling Tool

    Institute of Scientific and Technical Information of China (English)

    袁志坚

    2014-01-01

    The paper introduces the development background, design principles, structural dimensions and main technical parameters of a complete set of large diameter air-lift reverse circulation drilling tools. The tool set uses an internal-flush design, offering high strength and convenient connection, and the cross-sectional areas of the three passages, for air, mud transport and mud recharge, are fully considered. Production tests show that it fully meets the needs of air-lift reverse circulation drilling of large diameter engineering wells.

  19. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    Tools for design management are on the agenda in building projects in order to set targets, to choose and prioritise between alternative environmental solutions, to involve stakeholders and to document, evaluate and benchmark. Different types of tools are available, but what can we learn from the use or lack of use of current tools in the development of future design tools for sustainable buildings? Why are some used while others are not? Who is using them? The paper deals with design management, with special focus on sustainable building in Denmark, and the challenge of turning the generally vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning

  20. Dendroidal sets

    NARCIS (Netherlands)

    Weiss, I.

    2007-01-01

    The thesis introduces the new concept of dendroidal set. Dendroidal sets are a generalization of simplicial sets that are particularly suited to the study of operads in the context of homotopy theory. The relation between operads and dendroidal sets is established via the dendroidal nerve functor wh

  1. Soft sets combined with interval valued intuitionistic fuzzy sets of type-2 and rough sets

    OpenAIRE

    Anjan Mukherjee; Abhijit Saha

    2015-01-01

    Fuzzy set theory, rough set theory and soft set theory are all mathematical tools for dealing with uncertainties. The concept of type-2 fuzzy sets was introduced by Zadeh in 1975 and was extended to interval valued intuitionistic fuzzy sets of type-2 by the authors. This paper is devoted to discussion of the combinations of interval valued intuitionistic fuzzy sets of type-2, soft sets and rough sets. Three different types of new hybrid models, namely interval valued intuitionistic fuzzy soft sets...

  2. The new diatom training set from the Polish Baltic coast and diatom-based transfer functions as a tool for understanding past changes in the southern Baltic coastal lakes

    Science.gov (United States)

    Lutyńska, Monika; Szpikowska, Grażyna; Woszczyk, Michał; Suchińska, Anita; Burchardt, Lubomira; Messyasz, Beata

    2014-05-01

    The transfer function method has been developed as a useful tool for reconstructing past environmental changes. It is based on the assumption that modern species whose ecological requirements are known can be used for quantitative reconstruction of past changes. The aim of the study was to gather training sets and to build a diatom-based transfer function that can be used to reconstruct changes in trophic state and salinity in the coastal lakes of the Polish Baltic coast. In previous years several attempts were made to reconstruct these parameters in lagoonal waters on the Baltic coasts of Germany, Denmark, Finland, the Netherlands, Sweden and Norway, but so far no diatom training set or transfer function has been built for the Polish coastal lakes. We sample diatoms from 12 lakes located along the Polish Baltic coast. At the same time we monitor the physical-chemical conditions in the lakes, including lake water chemical composition (chlorides, phosphorus and sulphur), pH, salinity, conductivity, temperature and dissolved oxygen. We collect samples from the lakes as well as from the Baltic Sea and analyse the whole phytoplankton composition; however, special focus is put on diatoms. The results of the analysis show seasonal changes in the chemical and physical water properties. The diatom assemblage composition and species frequency also change significantly. This study is a contribution to the projects: NN 306 064 640, financed by the National Science Centre, Poland, and the Virtual Institute ICLEA (Integrated Climate and Landscape Evolution Analysis), funded by the Helmholtz Association.
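    A common way to build such a diatom-based transfer function is weighted averaging (WA): each taxon's optimum is the abundance-weighted mean of the environmental variable over the training set, and a fossil sample's inferred value is the abundance-weighted mean of the optima of the taxa it contains. A minimal sketch, with deshrinking and error estimation omitted and purely hypothetical data:

```python
def wa_optima(abundances, env):
    """Taxon optima: abundance-weighted mean of env over training samples.
    abundances: rows = samples, columns = taxa; env: one value per sample."""
    n_taxa = len(abundances[0])
    optima = []
    for j in range(n_taxa):
        col = [row[j] for row in abundances]
        optima.append(sum(a * e for a, e in zip(col, env)) / sum(col))
    return optima

def wa_reconstruct(sample, optima):
    """Inferred env value for one (fossil) sample: weighted mean of taxon optima."""
    return sum(a * o for a, o in zip(sample, optima)) / sum(sample)

# Two taxa: taxon 0 dominates fresh samples, taxon 1 brackish ones (salinity in PSU).
training = [[10, 0], [8, 2], [2, 8], [0, 10]]
salinity = [0.5, 2.0, 6.0, 8.0]
optima = wa_optima(training, salinity)
inferred = wa_reconstruct([5, 5], optima)   # mixed assemblage: intermediate salinity
```

    In practice the inferred values are regressed back onto the observed ones (deshrinking) and cross-validated, but averaging twice like this is the core of the WA approach.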

  3. CMS offline web tools

    CERN Document Server

    Metson, S; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Evans, D; Fanfani, A; Feichtinger, D; Kavka, C; Kuznetsov, V; Van Lingen, F; Newbold, D; Tuura, L; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web based tools, using state of the art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside current planned developments.

  4. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influenc...

  5. Soft sets combined with interval valued intuitionistic fuzzy sets of type-2 and rough sets

    Directory of Open Access Journals (Sweden)

    Anjan Mukherjee

    2015-03-01

    Full Text Available Fuzzy set theory, rough set theory and soft set theory are all mathematical tools for dealing with uncertainties. The concept of type-2 fuzzy sets was introduced by Zadeh in 1975 and was extended to interval valued intuitionistic fuzzy sets of type-2 by the authors. This paper is devoted to discussion of the combinations of interval valued intuitionistic fuzzy sets of type-2, soft sets and rough sets. Three different types of new hybrid models, namely interval valued intuitionistic fuzzy soft sets of type-2, soft rough interval valued intuitionistic fuzzy sets of type-2 and soft interval valued intuitionistic fuzzy rough sets of type-2, are proposed and their properties are derived.

  6. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2011-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often no

  7. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2010-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  8. Using Growth Norms to Set Instructional Goals for Struggling Students

    Science.gov (United States)

    Haas, Lindsay B.; Stickney, Eric M.; Ysseldyke, James E.

    2016-01-01

    The authors examined the extent to which classroom teachers in naturalistic settings used a Goal-Setting Tool to set instructional goals for struggling students, the kinds of goals they set, their progress monitoring practices with and without goals, and the extent to which students gain more when a goal-setting tool is used. The goal-setting tool…

  9. Constructive Sets in Computable Sets

    Institute of Scientific and Technical Information of China (English)

    傅育熙

    1997-01-01

    The original interpretation of the constructive set theory CZF in Martin-Löf's type theory uses the 'extensional identity types'. It is generally believed that these 'types' do not belong to type theory. In this paper it will be shown that the interpretation goes through without identity types. This paper will also show that the interpretation can be given in an intensional type theory. This reflects the computational nature of the interpretation. This computational aspect is reinforced by an ω-Set model of CZF.

  10. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available. Users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  11. Large Crater Clustering tool

    Science.gov (United States)

    Laura, Jason; Skinner, James A.; Hunter, Marc A.

    2017-08-01

    In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long-axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the location of a source primary crater can be estimated from its secondary impact craters. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set to support: (1) processing individually digitized point observations (craters), (2) estimating the directional distribution of a clustered set of craters (crater clusters or linearly approximated catenae or lineaments), (3) back-projecting the potential flight paths, and (4) intersecting the back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool is able to quantitatively identify primary impacts and supports the improved qualitative interpretation of potential secondary crater flight trajectories.
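    The core geometric step, intersecting back-projected trajectories to approximate a source location, can be posed as a least-squares intersection of lines. The sketch below (numpy; names hypothetical, and not the LCC tool's actual code) finds the point minimizing the summed squared perpendicular distance to each trajectory:

```python
import numpy as np

def least_squares_intersection(points, directions):
    """Best-fit intersection of 2-D lines given anchor points and directions:
    minimizes the sum of squared perpendicular distances to every line."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, d in zip(np.asarray(points, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        proj = np.eye(2) - np.outer(d, d)   # projector onto the line's normal space
        A += proj
        b += proj @ p
    return np.linalg.solve(A, b)

# Three back-projected trajectories that all pass through (30, 40):
# a horizontal line, a vertical line, and a diagonal one.
anchors = [(0, 40), (30, 0), (0, 10)]
dirs = [(1, 0), (0, 1), (1, 1)]
source = least_squares_intersection(anchors, dirs)
```

    With noisy real trajectories the lines do not meet exactly, and this least-squares point (plus the residuals) gives both the estimated primary location and a measure of how consistent the secondary cluster orientations are.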

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    Science.gov (United States)

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  13. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    Science.gov (United States)

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  14. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    Science.gov (United States)

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  15. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    Science.gov (United States)

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  16. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    Science.gov (United States)

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  17. Dynamic multimedia annotation tool

    Science.gov (United States)

    Pfund, Thomas; Marchand-Maillet, Stephane

    2001-12-01

    Annotating image collections is crucial for a variety of multimedia applications. Not only does annotation provide an alternative means of access to visual information, it is also a critical step in evaluating content-based image retrieval systems. Because annotation is a tedious task, there is a real need for tools that lighten the annotators' work. Such a tool should be flexible, offer customization so that annotators are as comfortable as possible, and automate as many tasks as possible. In this paper, we present a still-image annotation tool that has been developed with the aim of being flexible and adaptive. The principle is a set of dynamic web pages that form an interface to a SQL database. The keyword set is fixed, and every image receives from concurrent annotators a set of keywords along with time stamps and annotator IDs. Each annotator can go back and forth within the collection and his or her previous annotations, helped by a number of search services and customization options. An administrative section allows the supervisor to control the parameters of the annotation, including the keyword set, given via an XML structure. The architecture of the tool is flexible so as to accommodate further options through its development.
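    The data model described (a fixed keyword set; per-image keywords carrying a time stamp and an annotator ID) can be sketched as a minimal relational schema. This is an illustrative reconstruction, not the paper's actual schema; all table and column names are hypothetical, and SQLite stands in for whatever SQL database the tool used.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE keywords (id INTEGER PRIMARY KEY, label TEXT UNIQUE);
CREATE TABLE annotations (
    image_id   INTEGER,              -- which image was annotated
    keyword_id INTEGER REFERENCES keywords(id),
    annotator  TEXT,                 -- annotator ID
    ts         REAL                  -- time stamp of the annotation
);
""")

# The keyword set is fixed up front by the supervisor
conn.execute("INSERT INTO keywords (label) VALUES ('beach'), ('sunset')")

# A concurrent annotator tags image 42 with an existing keyword
conn.execute(
    "INSERT INTO annotations VALUES (?, (SELECT id FROM keywords WHERE label=?), ?, ?)",
    (42, "beach", "annotator_1", time.time()),
)

rows = conn.execute(
    "SELECT k.label, a.annotator FROM annotations a JOIN keywords k ON k.id = a.keyword_id"
).fetchall()
print(rows)  # -> [('beach', 'annotator_1')]
```

Keeping keywords in their own table is what enforces the fixed vocabulary: annotators can only reference labels the supervisor has defined.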

  18. Study on the Tool Feed Speed of an (A-C) Double-Swing-Table Five-Axis Machine Tool

    Institute of Scientific and Technical Information of China (English)

    王虎奇; 张健; 唐清春

    2013-01-01

    In five-axis machining, improper selection of the tool feed rate degrades the quality of the machined surface. This paper addresses that problem with a rotation-correction method. Building on the post-processing algorithm of the BV100 five-axis machine tool and combining it with the rotation-correction method, a dedicated post-processor for the BV100 five-axis machine tool was developed in the JAVA language environment. A machining comparison on the blades of an impeller verifies that the rotation correction has good practicality in multi-axis CNC machining.

  19. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements...

  20. SHAREPOINT SITE CREATING AND SETTING

    Directory of Open Access Journals (Sweden)

    Oleksandr V. Tebenko

    2011-02-01

    Tools for building sites that let users work together are a topical theme in the information society and modern Web technologies. This article considers the SharePoint system, which makes it possible to create sites of any complexity, including large portals with a complex document structure. The purpose of this article is to cover the main points of creating a site and configuring it with the tools of the SharePoint system, namely: creating and configuring a site template, creating and configuring web applications in the web application environment, changing existing and creating new site themes, and configuring web parts.

  1. Hierarchical Sets: Analyzing Pangenome Structure through Scalable Set Visualizations

    OpenAIRE

    Pedersen, Thomas Lin

    2017-01-01

    The increase in available microbial genome sequences has resulted in an increase in the size of the pangenomes being analyzed. Current pangenome visualizations are not intended for the pangenome sizes possible today, and new approaches are necessary in order to convert the increase in available information into an increase in knowledge. As the pangenome data structure is essentially a collection of sets, we explore the potential of scalable set visualization as a tool for pangenome analysis. We pre...

  2. Hybrid Platforms, Tools, and Resources

    Science.gov (United States)

    Linder, Kathryn E.; Bruenjes, Linda S.; Smith, Sarah A.

    2017-01-01

    This chapter discusses common tools and resources for building a hybrid course in a higher education setting and provides recommendations for best practices in Learning Management Systems and Open Educational Resources.

  3. Tools for Distributed Systems Monitoring

    Directory of Open Access Journals (Sweden)

    Kufel Łukasz

    2016-11-01

    The management of distributed systems infrastructure requires a dedicated set of tools. The tool that visualizes the current operational state of all systems and notifies when a failure occurs is the monitoring solution. This paper provides an overview of monitoring approaches for gathering data from distributed systems and of the major factors to consider when choosing a monitoring solution. Finally, we discuss the tools currently available on the market.

  4. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
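    The general technique the abstract describes (peak normalized cross-correlation between waveform time-series, then grouping events whose correlation exceeds a threshold) can be sketched as follows. This is a hedged illustration of the technique only, not the GNEMRE implementation; the function names, the greedy grouping strategy, and the 0.8 threshold are all assumptions.

```python
import numpy as np

def max_xcorr(a, b):
    """Peak normalized cross-correlation between two equal-length waveforms;
    identical signals score 1.0 at zero lag."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full").max()

def cluster_events(waveforms, threshold=0.8):
    """Greedy grouping: an event joins the first cluster whose representative
    waveform correlates above the threshold, otherwise it founds a new
    cluster (a new 'unique source')."""
    clusters = []
    for w in waveforms:
        for c in clusters:
            if max_xcorr(w, c[0]) >= threshold:
                c.append(w)
                break
        else:
            clusters.append([w])
    return clusters
```

A new event is characterized the same way: correlate it against the representatives of the historical clusters and report the best match, which is the "quickly identified" comparison the abstract mentions.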

  5. Intelligent virtual reality in the setting of fuzzy sets

    Science.gov (United States)

    Dockery, John; Littman, David

    1992-01-01

    The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing theory in the domain of applications to the disabled.

  6. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the Space Shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  7. Rough Set Theory over Fuzzy Lattices

    Institute of Scientific and Technical Information of China (English)

    Guilong Liu

    2006-01-01

    Rough set theory, proposed by Pawlak in 1982, is a tool for dealing with the uncertainty and vagueness aspects of knowledge models. The main idea of rough sets corresponds to the lower and upper approximations based on equivalence relations. This paper studies the rough set and its extensions. We present a linear algebra approach to the rough set and its extension, give an equivalent definition of the lower and upper approximations of a rough set based on the characteristic function of sets, and explain the lower and upper approximations as the colinear map and linear map of sets, respectively. Finally, we define rough sets over fuzzy lattices, which cover both rough sets and fuzzy rough sets, and independent axiomatic systems are constructed to characterize the lower and upper approximations of rough sets over fuzzy lattices, based on inner and outer products. The axiomatic systems unify the axiomatization of Pawlak's rough sets and fuzzy rough sets.
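    Pawlak's lower and upper approximations mentioned above are easy to state concretely: given an equivalence partition of the universe, the lower approximation of a set X is the union of classes wholly contained in X, and the upper approximation is the union of classes that intersect X. A minimal sketch (the function name is illustrative):

```python
def approximations(partition, X):
    """Pawlak rough-set approximations of X with respect to an equivalence
    partition of the universe: lower = union of classes fully inside X,
    upper = union of classes meeting X."""
    X = set(X)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= X:      # class entirely within X
            lower |= block
        if block & X:       # class overlapping X
            upper |= block
    return lower, upper

# X = {1, 2, 3} under the partition {{1, 2}, {3, 4}, {5}}
lo, up = approximations([{1, 2}, {3, 4}, {5}], {1, 2, 3})
# lower = {1, 2}, upper = {1, 2, 3, 4}
```

The gap between the two approximations (here {3, 4}) is the boundary region, which captures exactly the vagueness the theory is designed to model.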

  8. An application of two MIFs-based tools (Volsurf+ and Pentacle) to binary QSAR: the case of a palinurin-related data set of non-ATP competitive glycogen synthase kinase 3β (GSK-3β) inhibitors.

    Science.gov (United States)

    Ermondi, Giuseppe; Caron, Giulia; Pintos, Isela Garcia; Gerbaldo, Michela; Pérez, Manuel; Pérez, Daniel I; Gándara, Zoila; Martínez, Ana; Gómez, Generosa; Fall, Yagamare

    2011-03-01

    VolSurf+ and GRIND descriptors extract the information present in MIFs calculated by GRID: the former are simpler to interpret and generally applied to ADME-Tox topics, whereas the latter are more sophisticated and thus better suited to pharmacodynamic events. Here we present a study comparing binary QSAR models obtained with VolSurf+ descriptors and GRIND for a data set of non-ATP-competitive GSK-3β inhibitors chemically related to palinurin, for which the biological activity is expressed in binary format. The results suggest not only that the simpler VolSurf+ descriptors are good enough to predict and chemically interpret the investigated phenomenon, but also a bioactive conformation of palinurin that may guide the future design of non-ATP-competitive GSK-3 inhibitors.

  9. $K^0$-$\\Sigma^+$ Photoproduction with SAPHIR

    CERN Document Server

    Bennhold, C

    1998-01-01

    Preliminary results of the analysis of the reaction p(gamma,K0)Sigma+ are presented. We show the first measurement of the differential cross section and much-improved data for the total cross section compared with previous measurements. The data are compared with model predictions from different isobar and quark models that give a good description of p(gamma,K+)Lambda and p(gamma,K+)Sigma0 data in the same energy range. None of the models yields an adequate description of the data at all energies.

  10. Four FACTs Spiritual Assessment Tool.

    Science.gov (United States)

    LaRocca-Pitts, Mark

    2015-01-01

    The Four FACTs Spiritual Assessment Tool combines the Four Fs and the FACT Spiritual Assessment Tool of LaRocca-Pitts into a single tool. The Four FACTs Tool is specifically designed for beginning students, but can also meet the needs of professional chaplains. Though designed for use in an acute care setting, it can be easily adapted for other settings. The Four FACTs Tool is easy to learn and to use and it gathers and evaluates relevant clinical information that can then be used to develop a plan of care. In its shortened form, as ACT, it informs how the chaplain can be fully present with patients and their families, especially in a time of crisis.

  11. On Characterization of Rough Type-2 Fuzzy Sets

    OpenAIRE

    Tao Zhao; Zhenbo Wei

    2016-01-01

    Rough sets theory and fuzzy sets theory are important mathematical tools to deal with uncertainties. Rough fuzzy sets and fuzzy rough sets as generalizations of rough sets have been introduced. Type-2 fuzzy set provides additional degree of freedom, which makes it possible to directly handle high uncertainties. In this paper, the rough type-2 fuzzy set model is proposed by combining the rough set theory with the type-2 fuzzy set theory. The rough type-2 fuzzy approximation operators induced f...

  12. Handbook of Open Source Tools

    CERN Document Server

    Koranne, Sandeep

    2011-01-01

    Handbook of Open Source Tools introduces a comprehensive collection of advanced open source tools useful in developing software applications. The book contains information on more than 200 open-source tools, including software construction utilities for compilers, virtual machines, databases, graphics, high-performance computing, OpenGL, geometry, algebra, graph theory, GUIs and more. Special highlights for software construction utilities and application libraries are included. Each tool is covered in the context of a real-life application development setting. This unique handbook presents

  13. Mathematical tools

    Science.gov (United States)

    Capozziello, Salvatore; Faraoni, Valerio

    In this chapter we discuss certain mathematical tools which are used extensively in the following chapters. Some of these concepts and methods are part of the standard baggage taught in undergraduate and graduate courses, while others enter the tool-box of more advanced researchers. These mathematical methods are very useful in formulating ETGs and in finding analytical solutions. We begin by studying conformal transformations, which allow for different representations of scalar-tensor and f(R) theories of gravity, in addition to being useful in GR. We continue by discussing variational principles in GR, which are the basis for presenting ETGs in the following chapters. We close the chapter with a discussion of Noether symmetries, which are used elsewhere in this book to obtain analytical solutions.

  14. New speckle-tracking algorithm for right ventricular volume analysis from three-dimensional echocardiographic data sets: validation with cardiac magnetic resonance and comparison with the previous analysis tool.

    Science.gov (United States)

    Muraru, Denisa; Spadotto, Veronica; Cecchetto, Antonella; Romeo, Gabriella; Aruta, Patrizia; Ermacora, Davide; Jenei, Csaba; Cucchini, Umberto; Iliceto, Sabino; Badano, Luigi P

    2016-11-01

    (i) To validate a new software package for right ventricular (RV) analysis by 3D echocardiography (3DE) against cardiac magnetic resonance (CMR); (ii) to assess the accuracy of different measurement approaches; and (iii) to explore any benefits vs. the previous software. We prospectively studied with 3DE and CMR 47 patients (14-82 years, 28 men) having a wide range of RV end-diastolic volumes (EDV 82-354 mL at CMR) and ejection fractions (EF 34-81%). Multi-beat RV 3DE data sets were independently analysed with the new software using both the automated and the manual editing options, as well as with the previous software. RV volume reproducibility was tested in 15 random patients. RV volume and EF measurements by the new software had excellent accuracy (bias ± SD: -15 ± 24 mL for EDV; 1.4 ± 4.9% for EF) and reproducibility compared with CMR, provided that the RV borders automatically tracked by the software were systematically edited by the operator. The automated analysis option underestimated the EDV, overestimated the ESV, and largely underestimated the EF (bias ± SD: -17 ± 10%). RV volumes measured with the new software using manual editing showed similar accuracy, but lower inter-observer variability and shorter analysis time (3-5 min) in comparison with the previous software. The novel vendor-independent 3DE software enables accurate, reproducible and faster quantitation of RV volumes and ejection fraction. Rather than optional, systematic verification of border-tracking quality and manual editing are mandatory to ensure accurate 3DE measurements. These findings are relevant for echocardiography laboratories aiming to implement 3DE for RV analysis for both research and clinical purposes.

  15. Dataflow Design Tool: User's Manual

    Science.gov (United States)

    Jones, Robert L., III

    1996-01-01

    The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
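    One of the performance bounds such a dataflow analysis determines is the critical-path bound: the longest path through the acyclic dataflow graph, weighted by task execution times, which no schedule can beat regardless of processor count. The sketch below illustrates that bound only; the function name and graph representation are assumptions, not the Dataflow Design Tool's actual interface.

```python
from functools import lru_cache

def critical_path_bound(graph, cost):
    """Lower bound on iteration latency for an acyclic dataflow graph.

    `graph` maps each task to the tasks that consume its output;
    `cost` maps each task to its execution time. The bound is the
    longest cost-weighted path from any source to any sink.
    """
    @lru_cache(maxsize=None)
    def longest_from(task):
        succs = graph.get(task, ())
        tail = max((longest_from(s) for s in succs), default=0)
        return cost[task] + tail

    return max(longest_from(t) for t in graph)

# Diamond-shaped graph: A feeds B and C, which both feed D
g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
c = {"A": 2, "B": 3, "C": 1, "D": 2}
bound = critical_path_bound(g, c)  # longest path A -> B -> D = 7
```

A complementary bound on throughput is total work divided by processor count; a scheduler like the one described compares both bounds to judge how close a candidate schedule is to optimal.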

  16. Rough Sets in Approximate Solution Space

    Institute of Scientific and Technical Information of China (English)

    Hui Sun; Wei Tian; Qing Liu

    2006-01-01

    As a new mathematical theory, rough sets have been applied to processing imprecise, uncertain and incomplete data, and have been fruitful for finite, non-empty sets. So far, however, rough sets have served only as a theoretical tool for discretizing real functions, and research defining rough sets on real functions is infrequent. In this paper, we exploit a new method to extend rough sets to normed linear spaces: we establish a rough set, put forward definitions of its upper and lower approximations, and make a preliminary study of the properties of this rough set. A new tool is thereby provided for studying approximate solutions of differential equations and functional variation in normed linear spaces. This research is significant in that it extends the application of rough sets to a new field.

  17. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  18. Long Term Care Minimum Data Set (MDS)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Long-Term Care Minimum Data Set (MDS) is a standardized, primary screening and assessment tool of health status that forms the foundation of the comprehensive...

  20. Center for Corporate Climate Leadership Goal Setting

    Science.gov (United States)

    EPA provides tools and recognition for companies setting aggressive GHG reduction goals, which can galvanize reduction efforts at a company and often leads to the identification of many additional reduction opportunities.

  1. Accurate Topological Measures for Rough Sets

    OpenAIRE

    2015-01-01

    Data granulation is considered a good tool for decision making in various types of real-life applications. The basic ideas of data granulation have appeared in many fields, such as interval analysis, quantization, rough set theory, the Dempster-Shafer theory of belief functions, divide and conquer, cluster analysis, machine learning, databases, information retrieval, and many others. Some new topological tools for data granulation using rough set approximations are initiated. Moreover, some topolo...

  2. Tool Gear: Infrastructure for Parallel Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  3. Automatic Calibration Of Manual Machine Tools

    Science.gov (United States)

    Gurney, Rex D.

    1990-01-01

    Modified scheme uses data from multiple positions and eliminates tedious positioning. A modification of a computer program adapts the calibration system, originally developed for computer-controlled tools, for convenient use with manually controlled machine tools. An option added to the calibration program allows data on random tool-axis positions to be entered manually into the computer for reduction. Instead of setting the axis to predetermined positions, the operator merely sets it at a variety of arbitrary positions.

  4. SITE DESIGN SETTING IN SHAREPOINT

    Directory of Open Access Journals (Sweden)

    Oleksii V. Tebenko

    2010-10-01

    Creating and promoting a site is one way to implement ICT in education. To build modern sites and large portals, and to simplify content creation and management when new content is added, a dynamic page-building model is used. The article deals with the means and methods of dynamic page templates in SharePoint. The purpose of the article is to analyze the key SharePoint components for dynamic pages, namely: setting and changing master pages, the standard types of placeholders in master pages, configuring content aggregation, and the standard types of SharePoint placeholders.

  5. INPHO project. Task 1: the setting of electron lines - beam dynamics; Projet INPHO. Tache 1: mise en place des lignes - faisceaulogie. Bilan de la phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Anselmetti, S.; Baze, J.M.; Brasseur, A.; Cazanou, M.; Cazaux, S.; Coadou, B.; Congretel, G.; Contrepois, P.; Curtoni, A.; Denis, J.F.; Desmons, M.; Dorlot, M.; Fontaine, M.; Jablonka, M.; Jannin, J.L.; Joly, J.M.; Launay, J.C.; Lotode, A.; Luong, M.; Mattei, P.; Nardin, P.; Perrin, J.L.; Saudemont, J.C.; Veyssiere, C. [CEA Saclay, Dept. d' Astrophysique, de Physique des Particules, de Physique Nucleaire et de l' Instrumentation Associee, 91- Gif sur Yvette (France); Laine, F. [CEA Saclay, Dept. des Technologies du Capteur et du Signal (DRT/LIST/DETECS/SSTM/L2MA), 91 - Gif-sur-Yvette (France)

    2005-07-01

    The INPHO project aims at upgrading and optimizing the SAPHIR installation, which is dedicated to the measurement (through the detection of photofission reactions) of radioactive wastes containing transuranium elements. Several modifications were made during phase I of the upgrade: -) the power supply between the 2 parts of the accelerator was modified, so that setting the beam energy no longer requires compensating for a phase shift; -) the vacuum level of the accelerator was improved, from 10{sup -6} torr to 7.10{sup -8} torr; -) current monitors were installed on the electron line (there were no direct diagnostics previously). Other modifications are planned for phase II of the upgrade, concerning: -) the power supply of the electron gun; -) the control system; and -) the power supply of the klystron. In parallel with phase II, feasibility studies have been carried out for the design of an electron line that will bring the electron-photon converter target as near as possible to the waste package to be probed. (A.C.)

  6. Cyber Security Evaluation Tool

    Energy Technology Data Exchange (ETDEWEB)

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  7. System Service Test Operations Procedure ’Tool Sets’.

    Science.gov (United States)

    compatibility with related equipment, transportability, maintainability, and reliability. It provides guidance for evaluation of maintenance of the test items with an analysis method for evaluating maintenance literature. (Author)

  8. An Evaluation of CET-SET

    Institute of Scientific and Technical Information of China (English)

    张雨兰

    2011-01-01

    CET-SET refers to the spoken English test affiliated with the College English Test, Bands 4 and 6. It has become a tool for assessing college students' English-speaking ability. This essay mainly evaluates CET-SET in terms of test form, types, assessment and backwash e

  9. Ready, Set, Respect! GLSEN's Elementary School Toolkit

    Science.gov (United States)

    Gay, Lesbian and Straight Education Network (GLSEN), 2012

    2012-01-01

    "Ready, Set, Respect!" provides a set of tools to help elementary school educators ensure that all students feel safe and respected and develop respectful attitudes and behaviors. It is not a program to be followed but instead is designed to help educators prepare themselves for teaching about and modeling respect. The toolkit responds to…

  10. Hierarchical Sets: Analyzing Pangenome Structure through Scalable Set Visualizations

    DEFF Research Database (Denmark)

    Pedersen, Thomas Lin

    2017-01-01

    The increase in available microbial genome sequences has resulted in an increase in the size of the pangenomes being analyzed. Current pangenome visualizations are not intended for the pangenome sizes possible today, and new approaches are necessary in order to convert the increase in available information into an increase in knowledge. As the pangenome data structure is essentially a collection of sets, we explore the potential of scalable set visualization as a tool for pangenome analysis. We present a new hierarchical clustering algorithm based on set arithmetic that optimizes the intersection sizes along the branches. The intersection and union sizes along the hierarchy are visualized using a composite dendrogram and icicle plot, which, in the pangenome context, show the evolution of pangenome and core size along the evolutionary hierarchy. Outlying elements, i.e. elements whose presence pattern do...
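    The core idea of clustering by set arithmetic can be illustrated with a greedy sketch: repeatedly merge the two clusters whose element intersection is largest, taking the union along the branch so that intersection (core) and union (pangenome) sizes are available at every node. This is an assumption-laden toy, not the published Hierarchical Sets algorithm; the function name and merge strategy are illustrative.

```python
def hierarchical_sets(named_sets):
    """Greedy agglomeration of named sets: at each step, merge the pair of
    clusters sharing the most elements; the merged cluster carries the
    union of its children. Returns a nested-tuple dendrogram of names."""
    clusters = [(name, set(s)) for name, s in named_sets.items()]
    while len(clusters) > 1:
        # Pick the pair with the largest intersection
        i, j = max(
            ((a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))),
            key=lambda ij: len(clusters[ij[0]][1] & clusters[ij[1]][1]),
        )
        (na, sa), (nb, sb) = clusters[i], clusters[j]
        merged = ((na, nb), sa | sb)  # union propagates up the branch
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    return clusters[0][0]

# Three toy "genomes" as gene sets: g1 and g2 share two genes, g3 shares none
tree = hierarchical_sets({"g1": {1, 2, 3}, "g2": {1, 2, 4}, "g3": {7, 8}})
# -> ('g3', ('g1', 'g2'))
```

Recording the intersection and union sizes at each merge is what would feed the composite dendrogram and icicle plot the abstract describes.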

  11. Fuzzy sets, rough sets, multisets and clustering

    CERN Document Server

    Dahlbom, Anders; Narukawa, Yasuo

    2017-01-01

    This book is dedicated to Prof. Sadaaki Miyamoto and presents cutting-edge papers in some of the areas in which he contributed. Bringing together contributions by leading researchers in the field, it concretely addresses clustering, multisets, rough sets and fuzzy sets, as well as their applications in areas such as decision-making. The book is divided in four parts, the first of which focuses on clustering and classification. The second part puts the spotlight on multisets, bags, fuzzy bags and other fuzzy extensions, while the third deals with rough sets. Rounding out the coverage, the last part explores fuzzy sets and decision-making.

  12. The Partial Fuzzy Set

    OpenAIRE

    Dr.Pranita Goswami

    2011-01-01

    The Partial Fuzzy Set is a portion of a Fuzzy Set which is again a Fuzzy Set. In the Partial Fuzzy Set the baseline is shifted from 0 to 1 to any of its α-cuts. In this paper we fuzzify a portion of the Fuzzy Set by transformation.
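
Since the abstract leans on α-cuts, a minimal sketch of the notion may help (the fuzzy set `A` and its membership grades below are invented for illustration, not taken from the paper):

```python
def alpha_cut(membership, alpha):
    # The alpha-cut of a fuzzy set: the crisp set of elements
    # whose membership grade is at least alpha.
    return {x for x, mu in membership.items() if mu >= alpha}

# A toy fuzzy set given by its membership function.
A = {"a": 0.2, "b": 0.5, "c": 0.9}
print(sorted(alpha_cut(A, 0.5)))  # ['b', 'c']
```

Raising α shrinks the cut; the family of all α-cuts fully determines the fuzzy set.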

  13. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
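
As a rough illustration of the MW-per-0.1 Hz arithmetic behind frequency response measures (this is not FRAT code; the function name, inputs, and sign handling are assumptions for illustration):

```python
def response_mw_per_0_1hz(delta_interchange_mw, delta_freq_hz):
    # Frequency response is conventionally quoted in MW per 0.1 Hz:
    # the interchange (or generation) change divided by ten times
    # the frequency change. Sign conventions vary between reports.
    return delta_interchange_mw / (delta_freq_hz * 10.0)

# A generation trip drags frequency down 0.05 Hz while the balancing
# authority's net interchange changes by -100 MW.
print(response_mw_per_0_1hz(-100.0, -0.05))  # 200.0
```

A tool like FRAT would compute such values per under-frequency event and track them against a baseline; the actual FRM calculation in BAL-003-1 uses defined pre-event and settled measurement points.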

  15. Downhole tool with replaceable tool sleeve sections

    Energy Technology Data Exchange (ETDEWEB)

    Case, W. A.

    1985-10-29

    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  16. Biomimetics: process, tools and practice.

    Science.gov (United States)

    Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A

    2017-01-23

    Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With a huge potential for innovation, biomimetics could evolve into a key process in businesses. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. Therefore, we briefly summarize the terminology of biomimetics and bioinspiration. The implementation of biomimetics requires a stated process. Therefore, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate the theoretical and practical value of these tools. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present the attempt of a utility tree which, once finalized, could be used to guide users through the process by choosing appropriate tools according to their own expertise. The aim of this paper is to foster dialogue and facilitate closer collaboration within the field of biomimetics.

  17. Performative Tools and Collaborative Learning

    DEFF Research Database (Denmark)

    Minder, Bettina; Lassen, Astrid Heidemann

    The use of performative tools can support collaborative learning across knowledge domains (i.e. science and practice), because they create new spaces for dialog. However, so far innovation literature provides few answers to the important discussion of how to describe the effects and requirements of performative tools used in transdisciplinary events for collaborative learning. The results of this single case study add to extant knowledge and learning literature by providing the reader with a rich description of the characteristics and learning functions of performative tools in transdisciplinary events, and a description of how they interrelate with the specific setting of such an event. Furthermore, they complement previous findings by relating performative tools to collaborative learning for knowledge-intensive ideas.

  18. Multiaspect Soft Sets

    Directory of Open Access Journals (Sweden)

    Nor Hashimah Sulaiman

    2013-01-01

    We introduce a novel concept of multiaspect soft set which is an extension of the ordinary soft set by Molodtsov. Some basic concepts, operations, and properties of the multiaspect soft sets are studied. We also define a mapping on multiaspect soft classes and investigate several properties related to the images and preimages of multiaspect soft sets.

  19. GridTool: A surface modeling and grid generation tool

    Science.gov (United States)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool is stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics is based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. The memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted colors as defined by the resource file, which will be discussed later.

  20. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    …translating the vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning of a new sustainable settlement. The use of design tools is discussed in relation to innovation and stakeholder participation, and it is stressed that the usefulness of design tools is context dependent.

  1. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  2. Hypergraphs combinatorics of finite sets

    CERN Document Server

    Berge, C

    1989-01-01

    Graph Theory has proved to be an extremely useful tool for solving combinatorial problems in such diverse areas as Geometry, Algebra, Number Theory, Topology, Operations Research and Optimization. It is natural to attempt to generalise the concept of a graph, in order to attack additional combinatorial problems. The idea of looking at a family of sets from this standpoint took shape around 1960. In regarding each set as a "generalised edge" and in calling the family itself a "hypergraph", the initial idea was to try to extend certain classical results of Graph Theory such as the theorems of Turán and König. It was noticed that this generalisation often led to simplification; moreover, one single statement, sometimes remarkably simple, could unify several theorems on graphs. This book presents what seems to be the most significant work on hypergraphs.
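
The "family of sets" view of a hypergraph is easy to make concrete; a small sketch (the vertices and generalised edges below are invented for illustration):

```python
# A hypergraph is a family of "generalised edges": arbitrary subsets
# of a vertex set, rather than two-element subsets as in a graph.
edges = [frozenset(e) for e in ({1, 2, 3}, {2, 4}, {1, 4, 5})]
vertices = set().union(*edges)

# The degree of a vertex is the number of edges containing it.
degree = {v: sum(v in e for e in edges) for v in sorted(vertices)}
print(degree)  # {1: 2, 2: 2, 3: 1, 4: 2, 5: 1}
```

A graph is recovered as the special case where every edge has exactly two vertices.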

  3. Sets in Coq, Coq in Sets

    Directory of Open Access Journals (Sweden)

    Bruno Barras

    2010-01-01

    This work is about formalizing models of various type theories of the Calculus of Constructions family. Here we focus on set-theoretical models. The long-term goal is to build a formal set-theoretical model of the Calculus of Inductive Constructions, so we can be sure that Coq is consistent with the language used by most mathematicians. One aspect of this work is to axiomatize several set theories: ZF, possibly with inaccessible cardinals, and HF, the theory of hereditarily finite sets. On top of these theories we have developed a piece of the usual set-theoretical construction of functions, ordinals and fixpoint theory. We then proved sound several models of the Calculus of Constructions, its extension with an infinite hierarchy of universes, and its extension with the inductive type of natural numbers where recursion follows the type-based termination approach. The other aspect is to try and discharge (most of) these assumptions. The goal here is rather to compare the theoretical strengths of all these formalisms. As already noticed by Werner, the replacement axiom of ZF in its general form seems to require a type-theoretical axiom of choice (TTAC).

  4. Compilation Tool Chains and Intermediate Representations

    DEFF Research Database (Denmark)

    Mottin, Julien; Pacull, François; Keryell, Ronan;

    2014-01-01

    In SMECY, we believe that an efficient tool chain can only be defined when the type of parallelism required by an application domain and the hardware architecture is fixed. Furthermore, we believe that once a set of tools is available, it is possible with reasonable effort to change hardware architectures…

  5. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel;

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is to both provide an industrial-strength tool set for VDM and also to provide an environment that…

  6. Sets resilient to erosion

    CERN Document Server

    Pegden, Wesley

    2011-01-01

    The erosion of a set X in Euclidean space by a radius r>0 is the subset of X consisting of points at distance at least r from the complement of X. A set is resilient to erosion if it is similar to its erosion by some positive radius. We give a somewhat surprising characterization of resilient sets, consisting in one part of simple geometric constraints on convex resilient sets, and, in another, a correspondence between nonconvex resilient sets and scale-invariant (e.g., 'exact fractal') sets.
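
In one dimension the definition is easy to picture: eroding an interval by r trims r from each end. A hypothetical helper (not from the paper) makes this concrete:

```python
def erode_interval(a, b, r):
    # Erosion of [a, b] by radius r: the points whose distance to the
    # complement of the interval is at least r, i.e. [a + r, b - r].
    lo, hi = a + r, b - r
    return (lo, hi) if lo <= hi else None  # None when the interval is too thin

print(erode_interval(0.0, 1.0, 0.25))  # (0.25, 0.75)
print(erode_interval(0.0, 1.0, 0.6))   # None
```

Any interval is similar (by scaling) to its nonempty erosion, so intervals are among the convex resilient sets in the paper's sense.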

  7. Invariant sets for Windows

    CERN Document Server

    Morozov, Albert D; Dragunov, Timothy N; Malysheva, Olga V

    1999-01-01

    This book deals with the visualization and exploration of invariant sets (fractals, strange attractors, resonance structures, patterns etc.) for various kinds of nonlinear dynamical systems. The authors have created a special Windows 95 application called WInSet, which allows one to visualize the invariant sets. A WInSet installation disk is enclosed with the book. The book consists of two parts. Part I contains a description of WInSet and a list of the built-in invariant sets which can be plotted using the program. This part is intended for a wide audience with interests ranging from dynamical…

  8. Rig Diagnostic Tools

    Science.gov (United States)

    Soileau, Kerry M.; Baicy, John W.

    2008-01-01

    Rig Diagnostic Tools is a suite of applications designed to allow an operator to monitor the status and health of complex networked systems using a unique interface between Java applications and UNIX scripts. The suite consists of Java applications, C scripts, VxWorks applications, UNIX utilities, C programs, and configuration files. The UNIX scripts retrieve data from the system and write them to a certain set of files. The Java side monitors these files and presents the data in user-friendly formats for operators to use in making troubleshooting decisions. This design allows for rapid prototyping and expansion of higher-level displays without affecting the basic data-gathering applications. The suite is designed to be extensible, with the ability to add new system components in building block fashion without affecting existing system applications. This allows for monitoring of complex systems for which unplanned shutdown time comes at a prohibitive cost.

  9. Multiple Kernel Point Set Registration.

    Science.gov (United States)

    Nguyen, Thanh Minh; Wu, Q M Jonathan

    2016-06-01

    The finite Gaussian mixture model with kernel correlation is a flexible tool that has recently received attention for point set registration. While there are many algorithms for point set registration presented in the literature, an important issue arising from these studies concerns the mapping of data with nonlinear relationships and the ability to select a suitable kernel. Kernel selection is crucial for effective point set registration. We focus here on multiple kernel point set registration. We make several contributions in this paper. First, each observation is modeled using the Student's t-distribution, which is heavily tailed and more robust than the Gaussian distribution. Second, by automatically adjusting the kernel weights, the proposed method allows us to prune the ineffective kernels: after parameter learning, the kernel saliencies of the irrelevant kernels go to zero. Thus, the choice of kernels is less crucial, and it is easy to include other kinds of kernels. Finally, we show empirically that our model outperforms state-of-the-art methods recently proposed in the literature.

  10. Value Set Authority Center

    Data.gov (United States)

    U.S. Department of Health & Human Services — The VSAC provides downloadable access to all official versions of vocabulary value sets contained in the 2014 Clinical Quality Measures (CQMs). Each value set...

  11. Landslides, Set 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This set expands the topics included in Set 1 and includes (in addition to landslides) rockfalls, rock avalanches, mud flows, debris flows, slumps, creep, and...

  12. Altimeter Setting Indicator

    Data.gov (United States)

    Department of Transportation — The Altimeter Setting Indicator (ASI) is an aneroid system used at airports to provide an altimeter setting for aircraft altimeters. This indicator may be an analog...

  13. Settings for Suicide Prevention

    Science.gov (United States)

    Find out more about suicide prevention in particular settings: American Indian/Alaska Native settings; behavioral health care (inpatient mental health, outpatient mental health, substance abuse treatment); colleges and universities; communities; crisis centers…

  14. Function S-rough sets and law identification

    Institute of Scientific and Technical Information of China (English)

    SHI KaiQuan; YAO BingXue

    2008-01-01

    By introducing element equivalence classes with dynamic characteristics into Pawlak's rough set theory, the first author of this paper improved the theory and put forward S-rough sets (singular rough sets). S-rough sets are defined by element equivalence classes and have dynamic characteristics. By introducing function equivalence classes (law equivalence classes) with dynamic characteristics into S-rough sets, the first author further put forward function S-rough sets (function singular rough sets). Function S-rough sets have both dynamic and law characteristics, and a function is a law. Using function S-rough sets, this paper presents law identification, a law identification theorem, a law identification criterion, and applications. Function S-rough sets are a new research direction in rough set theory, and also a new tool for the study of system law identification.

  15. Tool Changer For Robot

    Science.gov (United States)

    Voellmer, George M.

    1992-01-01

    Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.

  16. Axiomatic set theory

    CERN Document Server

    Suppes, Patrick

    1972-01-01

    This clear and well-developed approach to axiomatic set theory is geared toward upper-level undergraduates and graduate students. It examines the basic paradoxes and history of set theory and advanced topics such as relations and functions, equipollence, finite sets and cardinal numbers, rational and real numbers, and other subjects. 1960 edition.

  17. Paired fuzzy sets

    DEFF Research Database (Denmark)

    Rodríguez, J. Tinguaro; Franco de los Ríos, Camilo; Gómez, Daniel

    2015-01-01

    In this paper we want to stress the relevance of paired fuzzy sets, as already proposed in previous works of the authors, as a family of fuzzy sets that offers a unifying view for different models based upon the opposition of two fuzzy sets, simply allowing the existence of different types...

  18. Sets, Planets, and Comets

    Science.gov (United States)

    Baker, Mark; Beltran, Jane; Buell, Jason; Conrey, Brian; Davis, Tom; Donaldson, Brianna; Detorre-Ozeki, Jeanne; Dibble, Leila; Freeman, Tom; Hammie, Robert; Montgomery, Julie; Pickford, Avery; Wong, Justine

    2013-01-01

    Sets in the game "Set" are lines in a certain four-dimensional space. Here we introduce planes into the game, leading to interesting mathematical questions, some of which we solve, and to a wonderful variation on the game "Set," in which every tableau of nine cards must contain at least one configuration for a player to pick up.
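
The article's starting point, that Sets are lines in a four-dimensional space over the three-element field, can be checked directly; a small sketch (not from the article):

```python
from itertools import combinations, product

def is_set(a, b, c):
    # Three cards form a Set exactly when each of the four attributes
    # (coordinates over GF(3)) is all-equal or all-different,
    # equivalently when each coordinate sums to 0 mod 3.
    return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))

# Each card is a point of the affine space AG(4, 3); Sets are its lines.
deck = list(product(range(3), repeat=4))
lines = sum(1 for trio in combinations(deck, 3) if is_set(*trio))
print(len(deck), lines)  # 81 1080
```

Any two cards determine a unique third card completing a Set, so the count is C(81, 2) / 3 = 1080 lines.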

  20. Route Availability Planning Tool

    Data.gov (United States)

    Department of Transportation — The Route Availability Planning Tool (RAPT) is a weather-assimilated decision support tool (DST) that supports the development and execution of departure management...

  1. Poster Abstract: Towards NILM for Industrial Settings

    DEFF Research Database (Denmark)

    Holmegaard, Emil; Kjærgaard, Mikkel Baun

    2015-01-01

    Industry consumes a large share of the worldwide electricity consumption. Disaggregated information about electricity consumption enables better decision-making and feedback tools to optimize electricity consumption. In industrial settings electricity loads consist of a variety of equipment, which… consumption for six months, at an industrial site. In this poster abstract we provide initial results for how industrial equipment challenges NILM algorithms. These results thereby open up for evaluating the use of NILM in industrial settings.

  2. Elements of set theory

    CERN Document Server

    Enderton, Herbert B

    1977-01-01

    This is an introductory undergraduate textbook in set theory. In mathematics these days, essentially everything is a set. Some knowledge of set theory is a necessary part of the background everyone needs for further study of mathematics. It is also possible to study set theory for its own interest; it is a subject with intriguing results about simple objects. This book starts with material that nobody can do without. There is no end to what can be learned of set theory, but here is a beginning.

  3. Sets avoiding integral distances

    CERN Document Server

    Kurz, Sascha

    2012-01-01

    We study open point sets in Euclidean spaces $\mathbb{R}^d$ without a pair of points an integral distance apart. By a result of Furstenberg, Katznelson, and Weiss such sets must be of Lebesgue upper density zero. We are interested in how large such sets can be in $d$-dimensional volume. We determine lower and upper bounds for the volumes of the sets in terms of the number of their connected components and dimension, and also give some exact values. Our problem can be viewed as a kind of inverse to known problems on sets with pairwise rational or integral distances.

  4. Generalization Rough Set Theory

    Institute of Scientific and Technical Information of China (English)

    XIAO Di; ZHANG Jun-feng; HU Shou-song

    2008-01-01

    In order to avoid the discretization in the classical rough set theory, a generalization of rough set theory is proposed. First, the degree of general importance of an attribute and of attribute subsets is presented. Then, depending on the degree of general importance of an attribute, the space distance can be measured with a weighted method. Finally, a generalized rough set theory based on the general near-neighborhood relation is proposed. The proposed theory partitions the universe into tolerant modules, and forms the lower approximation and upper approximation of a set under the general near-neighborhood relationship, which avoids the discretization in Pawlak's rough set theory.
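
For context, the classical Pawlak lower and upper approximations that this generalization builds on can be sketched in a few lines (the partition and target set below are invented for illustration):

```python
def pawlak_approximations(partition, target):
    # Lower approximation: union of equivalence classes contained in `target`.
    # Upper approximation: union of equivalence classes meeting `target`.
    target = set(target)
    lower, upper = set(), set()
    for cls in partition:
        cls = set(cls)
        if cls <= target:
            lower |= cls
        if cls & target:
            upper |= cls
    return lower, upper

# A toy universe {1..5} partitioned by an equivalence relation.
partition = [{1, 2}, {3, 4}, {5}]
lo, up = pawlak_approximations(partition, {1, 2, 3})
print(sorted(lo), sorted(up))  # [1, 2] [1, 2, 3, 4]
```

The target set {1, 2, 3} is "rough" here because it splits the class {3, 4}, which is exactly the situation the approximations quantify.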

  5. Tools for Understanding Identity

    Energy Technology Data Exchange (ETDEWEB)

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael; Hodges, Duncan; Kim, Dee DH; Love, Oriana J.; Nurse, Jason R.; Pike, William A.; Scholtz, Jean

    2013-12-28

    Identity attribution and enrichment is critical to many aspects of law enforcement and intelligence gathering; this identity typically spans a number of domains in the natural world, such as biographic information (factual information, e.g. names and addresses), biometric information (e.g. fingerprints) and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber-world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface exposed. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use-case would be the inference of real names from usernames; the 'path' created by inferring new elements of identity is highlighted in the 'critical information' panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively, the entire realizable 'capability' can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper: the first is a working prototype; the second is expected to reach prototype by July 2013. Understanding the Model: the tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take…

  6. Noncomputable Spectral Sets

    CERN Document Server

    Teutsch, J

    2007-01-01

    It is possible to enumerate all computer programs. In particular, for every partial computable function, there is a shortest program which computes that function. f-MIN is the set of indices for shortest programs. In 1972, Meyer showed that f-MIN is Turing equivalent to 0'', the halting set with halting set oracle. This paper generalizes the notion of shortest programs, and we use various measures from computability theory to describe the complexity of the resulting "spectral sets." We show that under certain Gödel numberings, the spectral sets are exactly the canonical sets 0', 0'', 0''', ... up to Turing equivalence. This is probably not true in general; however, we show that spectral sets always contain some useful information. We show that immunity, or "thinness," is a useful characteristic for distinguishing between spectral sets. In the final chapter, we construct a set which neither contains nor is disjoint from any infinite arithmetic set, yet it is 0-majorized and contains a natural spectral set. Thus…

  7. The tools of mathematical reasoning

    CERN Document Server

    Lakins, Tamara J

    2016-01-01

    This accessible textbook gives beginning undergraduate mathematics students a first exposure to introductory logic, proofs, sets, functions, number theory, relations, finite and infinite sets, and the foundations of analysis. The book provides students with a quick path to writing proofs and a practical collection of tools that they can use in later mathematics courses such as abstract algebra and analysis. The importance of the logical structure of a mathematical statement as a framework for finding a proof of that statement, and the proper use of variables, is an early and consistent theme used throughout the book.

  8. Workflow Tools for Digital Curation

    Directory of Open Access Journals (Sweden)

    Andrew James Weidner

    2013-04-01

    Maintaining usable and sustainable digital collections requires a complex set of actions that address the many challenges at various stages of the digital object lifecycle. Digital curation activities enhance access and retrieval, maintain quality, add value, and facilitate use and re-use over time. Digital resource lifecycle management is becoming an increasingly important topic as digital curators actively explore software tools that perform metadata curation and file management tasks. Accordingly, the University of North Texas (UNT) Libraries develop tools and workflows that streamline production and quality assurance activities. This article demonstrates two open source software tools, AutoHotkey and Selenium IDE, which the UNT Digital Libraries Division has adopted for use during the pre-ingest and post-ingest stages of the digital resource lifecycle.

  9. Navigating Towards Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2006-01-01

    …like opposites, the term tectonics deals with creating a meaningful relationship between the two. The aim of this paper is to investigate what a digital tectonic tool could be and what relationship with technology it should represent. An understanding of this relationship can help us not only to understand the conflicts in architecture and the building industry but also bring us further into a discussion of how architecture can use digital tools. The investigation is carried out firstly by approaching the subject theoretically through the term tectonics and by setting up a model of the values a tectonic tool should encompass. Secondly, the ability and validity of the model are shown by applying it to a case study of Jørn Utzon's work on the Minor Hall in the Sydney Opera House; for the sake of exemplification, the technical field focused on in this paper is room acoustics. Thirdly, the relationship between…

  10. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments and analysis of engineering design and design problem solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  11. Quality Measurement in Early Childhood Settings

    Science.gov (United States)

    Zaslow, Martha, Ed.; Martinez-Beck, Ivelisse, Ed.; Tout, Kathryn, Ed.; Halle, Tamara, Ed.

    2011-01-01

    What constitutes quality in early childhood settings, and how can it best be measured with today's widely used tools and promising new approaches? Find authoritative answers in this book, a must-have for high-level administrators and policymakers as more and more states adopt early childhood Quality Rating and Improvement Systems. The most…

  12. User manual for storage simulation construction set

    Energy Technology Data Exchange (ETDEWEB)

    Sehgal, Anil; Volz, Richard A.

    1999-04-01

    The Storage Simulation Construction Set (SSCS) is a tool for composing storage system models using Telegrip. It is an application written in C++ and Motif. With this system, models of a storage system can be composed rapidly and accurately. The aspects of the SSCS are described within this report.

  13. A novel tool-use mode in animals: New Caledonian crows insert tools to transport objects.

    Science.gov (United States)

    Jacobs, Ivo F; von Bayern, Auguste; Osvath, Mathias

    2016-11-01

    New Caledonian crows (Corvus moneduloides) rely heavily on a range of tools to extract prey. They manufacture novel tools, save tools for later use, and have morphological features that facilitate tool use. We report six observations, in two individuals, of a novel tool-use mode not previously reported in non-human animals. Insert-and-transport tool use involves inserting a stick into an object and then moving away, thereby transporting both object and tool. All transported objects were non-food objects. One subject used a stick to transport an object that was too large to be handled by beak, which suggests the tool facilitated object control. The function in the other cases is unclear but seems to be an expression of play or exploration. Further studies should investigate whether it is adaptive in the wild and to what extent crows can flexibly apply the behaviour in experimental settings when purposive transportation of objects is advantageous.

  14. A course on Borel sets

    CERN Document Server

    Srivastava, S M

    1998-01-01

    The roots of Borel sets go back to the work of Baire [8]. He was trying to come to grips with the abstract notion of a function introduced by Dirichlet and Riemann. According to them, a function was to be an arbitrary correspondence between objects without giving any method or procedure by which the correspondence could be established. Since all the specific functions that one studied were determined by simple analytic expressions, Baire delineated those functions that can be constructed starting from continuous functions and iterating the operation of pointwise limit on a sequence of functions. These functions are now known as Baire functions. Lebesgue [65] and Borel [19] continued this work. In [19], Borel sets were defined for the first time. In his paper, Lebesgue made a systematic study of Baire functions and introduced many tools and techniques that are used even today. Among other results, he showed that Borel functions coincide with Baire functions. The study of Borel sets got an impetus from...

  15. Rough set models of Physarum machines

    Science.gov (United States)

    Pancerz, Krzysztof; Schumann, Andrew

    2015-04-01

    In this paper, we consider transition system models of behaviour of Physarum machines in terms of rough set theory. A Physarum machine, a biological computing device implemented in the plasmodium of Physarum polycephalum (true slime mould), is a natural transition system. In the behaviour of Physarum machines, one can notice some ambiguity in Physarum motions that influences exact anticipation of states of machines in time. To model this ambiguity, we propose to use rough set models created over transition systems. Rough sets are an appropriate tool to deal with rough (ambiguous, imprecise) concepts in the universe of discourse.
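
The lower and upper approximations at the heart of such rough set models are simple to state. A minimal sketch (illustrative only, with hypothetical state names; not code from the paper):

```python
# Sketch: lower and upper rough-set approximations of a target set of
# machine states, given an indiscernibility partition of the universe.

def rough_approximations(partition, target):
    """Return (lower, upper) approximations of `target`.

    `partition` is a list of disjoint blocks (sets) covering the universe;
    states in the same block are indistinguishable from each other."""
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        if block <= target:   # block certainly inside the target
            lower |= block
        if block & target:    # block possibly inside the target
            upper |= block
    return lower, upper

# Hypothetical Physarum machine states, grouped into blocks of states
# that ambiguous plasmodium motions cannot tell apart
blocks = [{"s1"}, {"s2", "s3"}, {"s4", "s5"}]
lo, up = rough_approximations(blocks, {"s1", "s2"})
# lo is the certain region, up the possible region of the target set
```

The gap between `up` and `lo` (the boundary region) quantifies exactly the ambiguity in anticipating the machine's states.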

  16. China Sets Resources Strategy

    Institute of Scientific and Technical Information of China (English)

    LiMin

    2003-01-01

    All mineral mining in China now has a set road to follow, and straying off its path will attract severe penalties. The country's first-round programs for provincial mineral resources exploitation took effect in mid-January, setting output goals and designating mining regions.

  17. The Model Confidence Set

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.

    The paper introduces the model confidence set (MCS) and applies it to the selection of models. A MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The M...

  18. Descriptive set theory

    CERN Document Server

    Moschovakis, YN

    1987-01-01

    Now available in paperback, this monograph is a self-contained exposition of the main results and methods of descriptive set theory. It develops all the necessary background material from logic and recursion theory, and treats both classical descriptive set theory and the effective theory developed by logicians.

  19. Economic communication model set

    Science.gov (United States)

    Zvereva, Olga M.; Berg, Dmitry B.

    2017-06-01

    This paper details findings from research targeted at investigating economic communications with agent-based models. The agent-based model set was engineered to simulate economic communications. Money in the form of internal and external currencies was introduced into the models to support exchanges in communications. Every model, being based on the general concept, has its own peculiarities in algorithm and input data set, since each was engineered to solve a specific problem. Several data sets of different origins were used in the experiments: theoretic sets were estimated on the basis of the static Leontief equilibrium equation, and the real set was constructed on the basis of statistical data. During the simulation experiments, the communication process was observed in dynamics and system macroparameters were estimated. This research confirmed that the combination of an agent-based and a mathematical model can cause a synergetic effect.
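
The static Leontief equilibrium mentioned above, x = Ax + d, can be solved directly once a technical-coefficients matrix A and a final-demand vector d are fixed. A minimal numpy sketch with hypothetical two-sector data (not the paper's model):

```python
import numpy as np

# Static Leontief equilibrium: gross output x satisfies x = A x + d,
# so x = (I - A)^{-1} d.  A and d below are invented for illustration.
A = np.array([[0.2, 0.3],      # hypothetical technical coefficients
              [0.1, 0.4]])     # for a two-sector economy
d = np.array([100.0, 200.0])   # hypothetical final demand

x = np.linalg.solve(np.eye(2) - A, d)

# Equilibrium check: output covers intermediate use plus final demand
assert np.allclose(A @ x + d, x)
```

Data sets of this kind, estimated from an equilibrium of this form, can then seed the exchanges in an agent-based simulation.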

  20. Rotary fast tool servo system and methods

    Science.gov (United States)

    Montesanti, Richard C.; Trumper, David L.

    2007-10-02

    A high bandwidth rotary fast tool servo provides tool motion in a direction nominally parallel to the surface-normal of a workpiece at the point of contact between the cutting tool and workpiece. Three or more flexure blades having all ends fixed are used to form an axis of rotation for a swing arm that carries a cutting tool at a set radius from the axis of rotation. An actuator rotates a swing arm assembly such that a cutting tool is moved in and away from the lathe-mounted, rotating workpiece in a rapid and controlled manner in order to machine the workpiece. A pair of position sensors provides rotation and position information for a swing arm to a control system. A control system commands and coordinates motion of the fast tool servo with the motion of a spindle, rotating table, cross-feed slide, and in-feed slide of a precision lathe.

  1. The Vicinity of Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    Program documentation plays a vital role in almost all programming processes.  Program documentation flows between separate tools of a modularized environment, and in between the components of an integrated development environment as well.  In this paper we discuss the flow of program documentation...... between program development tools.  In the central part of the paper we introduce a mapping of documentation flow between program development tools.  In addition we discuss a set of locally developed tools which is related to program documentation.  The use of test cases as examples in an interface...... documentation tool is a noteworthy and valuable contribution to the documentation flow.  As an additional contribution we identify several circular relationships which illustrate feedback of documentation to the program editor from other tools in the development environment....

  2. LensTools: Weak Lensing computing tools

    Science.gov (United States)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
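
As an illustration of the kind of statistic LensTools measures, an azimuthally averaged power spectrum of a square map can be sketched in plain numpy (this is not the LensTools API; the function and variable names are invented for the example):

```python
import numpy as np

# Sketch: azimuthally averaged power spectrum of a square map, the kind
# of statistic a weak-lensing package computes for convergence maps.

def radial_power_spectrum(kappa, n_bins=16):
    n = kappa.shape[0]
    fk = np.fft.fftshift(np.fft.fft2(kappa))
    power2d = np.abs(fk) ** 2 / n**2
    # radial wavenumber of each Fourier pixel
    freq = np.fft.fftshift(np.fft.fftfreq(n)) * n
    kx, ky = np.meshgrid(freq, freq)
    k = np.hypot(kx, ky)
    # average the 2-D power in annular wavenumber bins
    edges = np.linspace(0.0, k.max(), n_bins + 1)
    which = np.digitize(k.ravel(), edges) - 1
    pvals = power2d.ravel()
    spectrum = np.array([pvals[which == i].mean() if np.any(which == i) else 0.0
                         for i in range(n_bins)])
    return edges, spectrum

rng = np.random.default_rng(0)
edges, spec = radial_power_spectrum(rng.standard_normal((64, 64)))
# white-noise input gives a roughly flat spectrum across the k bins
```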

  3. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  4. Combinatorics of set partitions

    CERN Document Server

    Mansour, Toufik

    2012-01-01

    Focusing on a very active area of mathematical research in the last decade, Combinatorics of Set Partitions presents methods used in the combinatorics of pattern avoidance and pattern enumeration in set partitions. Designed for students and researchers in discrete mathematics, the book is a one-stop reference on the results and research activities of set partitions from 1500 A.D. to today. Each chapter gives historical perspectives and contrasts different approaches, including generating functions, kernel method, block decomposition method, generating tree, and Wilf equivalences. Methods and d

  5. Set theory essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Set Theory includes elementary logic, sets, relations, functions, denumerable and non-denumerable sets, cardinal numbers, Cantor's theorem, axiom of choice, and order relations.

  6. A Tool for the Development of Robot Control Strategies

    Directory of Open Access Journals (Sweden)

    Luiz Carlos Figueiredo

    2007-12-01

    Full Text Available In this paper we report on the development of a tool to develop and set control strategies in a fast and easy way. Additionally, a tricycle robot with two traction motors was built to test the strategies produced with the tool. Experimental tests have shown the advantage of using such a tool.

  7. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey classifying them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tools list were one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a level of agreement high enough for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Landslides, Set 1

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This set of slides graphically illustrates the potential danger that major earthquakes pose to school structures and to the children and adults who happen to be...

  9. Lebesgue Sets Immeasurable Existence

    Directory of Open Access Journals (Sweden)

    Diana Marginean Petrovai

    2012-12-01

    Full Text Available It is well known that the notions of measure and integral arose early enough, in close connection with practical problems of measuring geometric figures. The notion of measure was outlined in the early 20th century through the research of H. Lebesgue, founder of the modern theory of measure and integral. A technique of integration of functions was developed concurrently. Gradually a specific area was formed, today called the theory of measure and integral. Essential contributions to building this theory were made by a large number of mathematicians: C. Carathéodory, J. Radon, O. Nikodym, S. Bochner, J. Pettis, P. Halmos and many others. In the following we present several abstract sets and classes of sets. There exist sets which are not Lebesgue measurable and sets which are Lebesgue measurable but are not Borel measurable. Hence B ⊂ L ⊂ P(X).
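
The strict inclusions B ⊂ L ⊂ P(X) can be justified by a standard cardinality argument (a sketch added here for context, not taken from the article):

```latex
% There are only continuum-many Borel sets, while the Cantor set C has
% Lebesgue measure zero, so by completeness of the Lebesgue measure every
% subset of C is Lebesgue measurable:
\[
  |\mathcal{B}| = 2^{\aleph_0}
  \;<\; 2^{2^{\aleph_0}} = |\mathcal{P}(C)| \le |\mathcal{L}|,
\]
% hence some Lebesgue measurable set is not Borel.  Finally, a Vitali set
% (constructed via the axiom of choice) is not Lebesgue measurable, so
\[
  \mathcal{B} \subsetneq \mathcal{L} \subsetneq \mathcal{P}(\mathbb{R}).
\]
```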

  10. Plaster core washout tool

    Science.gov (United States)

    Heisman, R. M.; Keir, A. R.; Teramura, K.

    1977-01-01

    Tool powered by pressurized water or air removes water soluble plaster lining from Kevlar/epoxy duct. Rotating plastic cutterhead with sealed end fitting connects flexible shaft that allows tool to be used with curved ducts.

  11. Automated Experiments on Ad Privacy Settings

    Directory of Open Access Journals (Sweden)

    Datta Amit

    2015-04-01

    Full Text Available To partly address people’s concerns over web tracking, Google has created the Ad Settings webpage to provide information about, and some choice over, the profiles Google creates on users. We present AdFisher, an automated tool that explores how user behaviors, Google’s ads, and Ad Settings interact. AdFisher can run browser-based experiments and analyze data using machine learning and significance tests. Our tool uses a rigorous experimental design and statistical analysis to ensure the statistical soundness of our results. We use AdFisher to find that the Ad Settings page was opaque about some features of a user’s profile, that it does provide some choice over ads, and that these choices can lead to seemingly discriminatory ads. In particular, we found that visiting webpages associated with substance abuse changed the ads shown but not the settings page. We also found that setting the gender to female resulted in getting fewer instances of an ad related to high-paying jobs than setting it to male. We cannot determine who caused these findings due to our limited visibility into the ad ecosystem, which includes Google, advertisers, websites, and users. Nevertheless, these results can form the starting point for deeper investigations by either the companies themselves or by regulatory bodies.
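
The significance testing AdFisher relies on can be illustrated with a simple permutation test (a generic sketch with hypothetical ad counts, not AdFisher's actual implementation):

```python
import random

# Sketch: permutation test for whether an ad appears at different rates
# for two experimental groups of browser agents.

def permutation_test(group_a, group_b, n_perm=10_000, seed=0):
    """p-value for the observed absolute difference in mean ad counts."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                  # random relabelling of agents
        a = pooled[:len(group_a)]
        b = pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical counts of a job-related ad shown to each simulated agent
male_agents   = [5, 6, 7, 5, 6, 8, 7, 6]
female_agents = [1, 2, 1, 0, 2, 1, 2, 1]
p = permutation_test(male_agents, female_agents)
# a small p-value indicates the groups differ more than chance shuffles
```

Because the test only relabels the observed data, it makes no distributional assumptions about how often ads are served.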

  12. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cǎtǎlin NAE

    2010-03-01

    Full Text Available This paper is dedicated to a very specific type of analysis tool (VLM - Vortex Lattice Method) to be integrated in an IDS - Integrated Design System, tailored for the usage of the small aircraft industry. The major interest is to have the possibility to simulate, at very low computational cost, a preliminary set of aerodynamic characteristics for basic global aerodynamic characteristics (Lift, Drag, Pitching Moment) and aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This work enables fast investigations of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM method, the reliability of the analysis may be increased, so that a first (iteration zero) aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is the basic state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility to integrate it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (IP, EU FP6).

  13. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  14. Multicriteria identification sets method

    Science.gov (United States)

    Kamenev, G. K.

    2016-11-01

    A multicriteria identification and prediction method for mathematical models of simulation type in the case of several identification criteria (error functions) is proposed. The necessity of the multicriteria formulation arises, for example, when one needs to take into account errors of completely different origins (not reducible to a single characteristic) or when there is no information on the class of noise in the data to be analyzed. An identification sets method is described based on the approximation and visualization of the multidimensional graph of the identification error function and sets of suboptimal parameters. This method allows for additional advantages of the multicriteria approach, namely, the construction and visual analysis of the frontier and the effective identification set (frontier and the Pareto set for identification criteria), various representations of the sets of Pareto effective and subeffective parameter combinations, and the corresponding predictive trajectory tubes. The approximation is based on the deep holes method, which yields metric ɛ-coverings with nearly optimal properties, and on multiphase approximation methods for the Edgeworth-Pareto hull. The visualization relies on the approach of interactive decision maps. With the use of the multicriteria method, multiple-choice solutions of identification and prediction problems can be produced and justified by analyzing the stability of the optimal solution not only with respect to the parameters (robustness with respect to data) but also with respect to the chosen set of identification criteria (robustness with respect to the given collection of functionals).
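
The Pareto-effective set of parameter combinations described above can be sketched for the two-criteria case (a generic illustration, not the paper's algorithm):

```python
# Sketch: select the Pareto-effective subset of candidate parameter
# vectors under two identification criteria (error functions),
# where lower error is better on both criteria.

def pareto_front(points):
    """Return the points not dominated by any other point.

    A point p is dominated if some other point q is no worse on both
    coordinates (and differs from p)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (error_1, error_2) pairs for candidate parameter sets
errors = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (5.0, 5.0)]
front = pareto_front(errors)
# (3.0, 4.0) is dominated by (2.0, 3.0); (5.0, 5.0) by several points
```

Visualizing this frontier, rather than a single optimum, is what allows the stability of the identification to be judged with respect to the chosen set of criteria.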

  15. Set theory and physics

    Energy Technology Data Exchange (ETDEWEB)

    Svozil, K. [Univ. of Technology, Vienna (Austria)

    1995-11-01

    Inasmuch as physical theories are formalizable, set theory provides a framework for theoretical physics. Four speculations about the relevance of set theoretical modeling for physics are presented: the role of transcendental set theory (i) in chaos theory, (ii) for paradoxical decompositions of solid three-dimensional objects, (iii) in the theory of effective computability (Church-Turing thesis) related to the possible "solution of supertasks," and (iv) for weak solutions. Several approaches to set theory and their advantages and disadvantages for physical applications are discussed: Cantorian "naive" (i.e., nonaxiomatic) set theory, constructivism, and operationalism. In the author's opinion, an attitude of "suspended attention" (a term borrowed from psychoanalysis) seems most promising for progress. Physical and set theoretical entities must be operationalized wherever possible. At the same time, physicists should be open to "bizarre" or "mindboggling" new formalisms, which need not be operationalizable or testable at the time of their creation, but which may successfully lead to novel fields of phenomenology and technology.

  16. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstation.

  17. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    are separate and intended for different documentation purposes they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  18. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  19. A Components Library System Model and the Support Tool

    Institute of Scientific and Technical Information of China (English)

    MIAO Huai-kou; LIU Hui; LIU Jing; LI Xiao-bo

    2004-01-01

    Component-based development needs a well-designed components library and a set of support tools. This paper presents the design and implementation of a components library system model and its support tool UMLCASE. A set of practical CASE tools is constructed. UMLCASE can use UML to design Use Case Diagrams, Class Diagrams, etc., and it integrates with the components library system.

  20. MOD Tool (Microwave Optics Design Tool)

    Science.gov (United States)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is on the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. The MOD Tool client is being developed using Tcl

  2. High Interactivity Visualization Software for Large Computational Data Sets Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Existing scientific visualization tools have specific limitations for large scale scientific data sets. Of these, four limitations can be seen as paramount: (i)...

  3. High Interactivity Visualization Software for Large Computational Data Sets Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a collection of computer tools and libraries called SciViz that enable researchers to visualize large scale data sets on HPC resources remotely...

  4. Dynamic Hurricane Data Analysis Tool

    Science.gov (United States)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  5. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  6. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools...... are separate and intended for different documentation purposes they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  7. Similar dissection of sets

    CERN Document Server

    Akiyama, Shigeki; Okazaki, Ryotaro; Steiner, Wolfgang; Thuswaldner, Jörg

    2010-01-01

    In 1994, Martin Gardner stated a set of questions concerning the dissection of a square or an equilateral triangle in three similar parts. Meanwhile, Gardner's questions have been generalized and some of them are already solved. In the present paper, we solve more of his questions and treat them in a much more general context. Let $D\subset \mathbb{R}^d$ be a given set and let $f_1,...,f_k$ be injective continuous mappings. Does there exist a set $X$ such that $D = X \cup f_1(X) \cup ... \cup f_k(X)$ is satisfied with a non-overlapping union? We prove that such a set $X$ exists for certain choices of $D$ and $\{f_1,...,f_k\}$. The solutions $X$ often turn out to be attractors of iterated function systems with condensation in the sense of Barnsley. Coming back to Gardner's setting, we use our theory to prove that an equilateral triangle can be dissected in three similar copies whose areas have ratio $1:1:a$ for $a \ge (3+\sqrt{5})/2$.

  8. Formal Assurance Certifiable Tooling Strategy Final Report

    Science.gov (United States)

    Bush, Eric; Oglesby, David; Bhatt, Devesh; Murugesan, Anitha; Engstrom, Eric; Mueller, Joe; Pelican, Michael

    2017-01-01

    This is the Final Report of a research project to investigate issues and provide guidance for the qualification of formal methods tools under the DO-330 qualification process. It consisted of three major subtasks spread over two years: 1) an assessment of theoretical soundness issues that may affect qualification for three categories of formal methods tools, 2) a case study simulating the DO-330 qualification of two actual tool sets, and 3) an investigation of risk mitigation strategies that might be applied to chains of such formal methods tools in order to increase confidence in their certification of airborne software.

  9. Enlargements of positive sets

    CERN Document Server

    Bot, Radu Ioan

    2008-01-01

    In this paper we introduce the notion of enlargement of a positive set in SSD spaces. To a maximally positive set $A$ we associate a family of enlargements $\E(A)$ and characterize the smallest and biggest element in this family with respect to the inclusion relation. We also emphasize the existence of a bijection between the subfamily of closed enlargements of $\E(A)$ and the family of so-called representative functions of $A$. We show that the extremal elements of the latter family are two functions recently introduced and studied by Stephen Simons. In this way we extend to SSD spaces some former results given for monotone and maximally monotone sets in Banach spaces.

  10. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    Current analytical approaches in computational social science can be characterized by four dominant paradigms: text analysis (information extraction and classification), social network analysis (graph theory), social complexity analysis (complex systems science), and social simulations (cellular automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this limitation, based on the sociology of associations and the mathematics of set theory, this paper presents a new approach to big data analytics called social set analysis. Social set analysis consists of a generative framework for the philosophies of computational social science, theory of social data...

  11. Social Set Visualizer

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi; Mukkamala, Raghava Rao

    2015-01-01

    Current state-of-the-art in big social data analytics is largely limited to graph theoretical approaches such as social network analysis (SNA) informed by the social philosophical approach of relational sociology. This paper proposes and illustrates an alternate holistic approach to big social data ... in relation to the garment factory accidents in Bangladesh, and analyze the results. The enterprise application domain for the dashboard is corporate social responsibility (CSR) and the targeted end-users are CSR researchers and practitioners. The design of the dashboard was based on the social set analysis ... consisted of technical testing, usability testing, and domain-specific testing with CSR students, and yielded positive results. In conclusion, we discuss the new analytical approach of social set analysis and conclude with a discussion of the benefits of set theoretical approaches based on the social...

  12. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels

    2015-01-01

    This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics ... of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA). Based on the sociology of associations and the mathematics of classical, fuzzy and rough set theories, this paper proposes a research program, the function of which is to design, develop and evaluate social set analytics in terms of fundamentally novel formal models, predictive methods and visual...

  13. Revitalizing the setting approach

    DEFF Research Database (Denmark)

    Bloch, Paul; Toft, Ulla; Reinbach, Helene Christine

    2014-01-01

    Background: The concept of health promotion rests on aspirations aiming at enabling people to increase control over and improve their health. Health promotion action is facilitated in settings such as schools, homes and work places. As a contribution to the promotion of healthy lifestyles, we have ... Discussion: The supersetting approach is a further development of the setting approach in which the significance of integrated and coordinated actions together with a participatory approach are emphasised and important principles are specified, all of which contribute to the attainment of synergistic effects and sustainable ... Summary: The supersetting approach is a relevant and useful conceptual framework for developing intervention-based initiatives for sustainable impact in community health promotion. It strives to attain synergistic effects from activities that are carried out in multiple settings in a coordinated manner. The supersetting...

  14. Distances and similarities in intuitionistic fuzzy sets

    CERN Document Server

    Szmidt, Eulalia

    2014-01-01

    This book presents the state-of-the-art in theory and practice regarding similarity and distance measures for intuitionistic fuzzy sets. Quantifying similarity and distances is crucial for many applications, e.g. data mining, machine learning, decision making, and control. The work provides readers with a comprehensive set of theoretical concepts and practical tools for both defining and determining similarity between intuitionistic fuzzy sets. It describes an automatic algorithm for deriving intuitionistic fuzzy sets from data, which can aid in the analysis of information in large databases. The book also discusses other important applications, e.g. the use of similarity measures to evaluate the extent of agreement between experts in the context of decision making.
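One widely used measure of this kind, the normalized Hamming distance with the hesitation margin included, can be sketched as follows (the example sets and values are illustrative, not taken from the book):

```python
def ifs_hamming(a, b):
    """Normalized Hamming distance between two intuitionistic fuzzy sets.

    Each set is a list of (mu, nu) pairs over the same finite universe,
    where mu is membership and nu is non-membership; the hesitation
    margin pi = 1 - mu - nu is included as a third term, following the
    three-parameter representation the book builds on.
    """
    n = len(a)
    total = 0.0
    for (mu_a, nu_a), (mu_b, nu_b) in zip(a, b):
        pi_a = 1.0 - mu_a - nu_a
        pi_b = 1.0 - mu_b - nu_b
        total += abs(mu_a - mu_b) + abs(nu_a - nu_b) + abs(pi_a - pi_b)
    return total / (2.0 * n)

# Two illustrative intuitionistic fuzzy sets over a two-element universe.
A = [(0.6, 0.2), (0.5, 0.4)]
B = [(0.7, 0.1), (0.5, 0.4)]
distance = ifs_hamming(A, B)
```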

  15. Combinatorics of finite sets

    CERN Document Server

    Anderson, Ian

    2011-01-01

    Coherent treatment provides comprehensive view of basic methods and results of the combinatorial study of finite set systems. The Clements-Lindstrom extension of the Kruskal-Katona theorem to multisets is explored, as is the Greene-Kleitman result concerning k-saturated chain partitions of general partially ordered sets. Connections with Dilworth's theorem, the marriage problem, and probability are also discussed. Each chapter ends with a helpful series of exercises and outline solutions appear at the end. "An excellent text for a topics course in discrete mathematics." - Bulletin of the Ame

  16. Set theory and logic

    CERN Document Server

    Stoll, Robert R

    1979-01-01

    Set Theory and Logic is the result of a course of lectures for advanced undergraduates, developed at Oberlin College for the purpose of introducing students to the conceptual foundations of mathematics. Mathematics, specifically the real number system, is approached as a unity whose operations can be logically ordered through axioms. One of the most complex and essential of modern mathematical innovations, the theory of sets (crucial to quantum mechanics and other sciences), is introduced in a most careful manner, aiming for the maximum in clarity and stimulation for further study in

  17. Why quasi-sets?

    Directory of Open Access Journals (Sweden)

    Décio Krause

    2002-11-01

    Quasi-set theory was developed to deal with collections of indistinguishable objects. In standard mathematics, there are no such entities, for indistinguishability (agreement with respect to all properties) entails numerical identity. The main motivation underlying such a theory is of course quantum physics, for collections of indistinguishable ('identical' in the physicists' jargon) particles cannot be regarded as 'sets' of standard set theories, which are collections of distinguishable objects. In this paper, a rationale for the development of such a theory is presented, motivated by Heinz Post's claim that indistinguishability of quantum entities should be attributed 'right at the start'.

  18. Nonmeasurable sets and functions

    CERN Document Server

    Kharazishvili, Alexander

    2004-01-01

    The book is devoted to various constructions of sets which are nonmeasurable with respect to invariant (more generally, quasi-invariant) measures. Our starting point is the classical Vitali theorem stating the existence of subsets of the real line which are not measurable in the Lebesgue sense. This theorem stimulated the development of the following interesting topics in mathematics: 1. Paradoxical decompositions of sets in finite-dimensional Euclidean spaces; 2. The theory of non-real-valued-measurable cardinals; 3. The theory of invariant (quasi-invariant) extensions of invariant (quasi-invaria

  19. On Random Rough Sets

    Institute of Scientific and Technical Information of China (English)

    Weizhi Wu

    2006-01-01

    In this paper, the concept of a random rough set which includes the mechanisms of numeric and non-numeric aspects of uncertain knowledge is introduced. It is proved that for any belief structure and its inducing belief and plausibility measures there exists a random approximation space such that the associated lower and upper probabilities are respectively the given belief and plausibility measures, and vice versa. And for a random approximation space generated from a totally random set, its inducing lower and upper probabilities are respectively a pair of necessity and possibility measures.
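The lower and upper approximations that the random construction generalizes can be sketched for the classical case; the universe and equivalence partition below are invented purely for illustration:

```python
from collections import defaultdict

def approximations(universe, equiv, target):
    """Classical rough-set lower/upper approximations of `target`.

    `equiv` maps each element to its equivalence-class label. The lower
    approximation collects classes fully contained in `target`; the
    upper approximation collects classes that merely intersect it.
    """
    classes = defaultdict(set)
    for x in universe:
        classes[equiv[x]].add(x)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:       # class certainly inside the target set
            lower |= cls
        if cls & target:        # class possibly inside the target set
            upper |= cls
    return lower, upper

# Illustrative universe partitioned into classes {1,2}, {3,4}, {5,6}.
U = {1, 2, 3, 4, 5, 6}
eq = {1: "a", 2: "a", 3: "b", 4: "b", 5: "c", 6: "c"}
lo, up = approximations(U, eq, {1, 2, 3})
```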

  20. Tools for Manipulation and Characterisation of Nanostructures

    DEFF Research Database (Denmark)

    Mølhave, Kristian; Bøggild, Peter

    For construction and characterization of prototype devices with nanometer-scale parts, we have developed an in-situ scanning electron microscope (SEM) laboratory with a set of novel tools for three-dimensional nanomanipulation. We have designed, fabricated, and characterized microfabricated...... was developed as a method for soldering nanotubes in electrical circuits and constructing highly conductive three-dimensional nanostructures with solid gold cores. Together the developed set of tools comprises a nanolaboratory which in many ways can accomplish the same tasks as an electronic workshop - but using...

  1. [Tool use of objects emerges continuously].

    Science.gov (United States)

    Kahrs, Björn Alexander; Lockman, Jeffrey J

    2012-01-01

    The developmental origins of humans' ability to use objects flexibly as tools remain controversial. Although the dominant approach for conceptualizing tool use development focuses on a qualitative shift in cognition near the end of the first year, we suggest that perception-action theory offers important clues for how infants' earlier exploratory behaviors set the stage for the emergence of tool use. In particular, we consider how infants' attempts to relate objects and surfaces enable them to learn how objects function as extensions of the hand and provide opportunities for practicing the actions that will be recruited for tool use later in development. In this connection, we discuss behavioral and kinematic studies on object manipulation, which show that infants relate objects to surfaces in a discriminative manner and gain greater motor control of banging over the course of the first year. In conclusion, a perception-action perspective suggests that tool use emerges more continuously over developmental time than has traditionally been maintained.

  2. A Software Tool for Legal Drafting

    CERN Document Server

    Gorín, Daniel; Schapachnik, Fernando; 10.4204/EPTCS.68.7

    2011-01-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find out normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.

  3. A Software Tool for Legal Drafting

    Directory of Open Access Journals (Sweden)

    Daniel Gorín

    2011-09-01

    Although many attempts at automated aids for legal drafting have been made, they were based on the construction of a new tool, completely from scratch. This is at least curious, considering that a strong parallelism can be established between a normative document and a software specification: both describe what an entity should or should not do, can or cannot do. In this article we compare normative documents and software specifications to find out their similarities and differences. The comparison shows that there are distinctive particularities, but they are restricted to a very specific subclass of normative propositions. The rest, we postulate, can be dealt with by software tools. For such an enterprise the FormaLex tool set was devised: an LTL-based language and companion tools that utilize model checking to find out normative incoherences in regulations, contracts and other legal documents. A feature-rich case study is analyzed with the presented tools.
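A toy reduction of the idea (not FormaLex's actual LTL input language): when obligations and prohibitions are flattened to plain sets of actions, one elementary normative incoherence is an action that is simultaneously obliged and forbidden. The action names are hypothetical:

```python
# Hypothetical normative propositions extracted from a contract,
# flattened to sets of action identifiers for illustration only.
obligations = {"file_report", "retain_records"}
prohibitions = {"retain_records", "share_data"}

# An action both obliged and forbidden is one simple kind of
# incoherence that model checking over the full LTL encoding
# would surface, among temporally richer ones.
incoherent = obligations & prohibitions
```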

  4. Software management tools: Lessons learned from use

    Science.gov (United States)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  5. Pneumonia severity scores in resource poor settings

    Directory of Open Access Journals (Sweden)

    Jamie Rylance

    2014-06-01

    Clinical prognostic scores are increasingly used to streamline care in well-resourced settings. The potential benefits of identifying patients at risk of clinical deterioration and poor outcome, delivering appropriate higher level clinical care, and increasing efficiency are clear. In this focused review, we examine the use and applicability of severity scores applied to patients with community acquired pneumonia in resource poor settings. We challenge clinical researchers working in such systems to consider the generalisability of existing severity scores in their populations, and where performance of scores is suboptimal, to promote efforts to develop and validate new tools for the benefit of patients and healthcare systems.

  6. Enterprise integration: A tool's perspective

    Energy Technology Data Exchange (ETDEWEB)

    Polito, J. [Sandia National Labs., Albuquerque, NM (United States); Jones, A. [National Inst. of Standards and Technology, Gaithersburg, MD (United States); Grant, H. [National Science Foundation, Washington, DC (United States)

    1993-06-01

    The advent of sophisticated automation equipment and computer hardware and software is changing the way manufacturing is carried out. To compete in the global marketplace, manufacturing companies must integrate these new technologies into their factories. In addition, they must integrate the planning, control, and data management methodologies needed to make effective use of these technologies. This paper provides an overview of recent approaches to achieving this enterprise integration. It then describes, using simulation as a particular example, a new tool's perspective of enterprise integration.

  7. Setting Environmental Standards

    Science.gov (United States)

    Fishbein, Gershon

    1975-01-01

    Recent court decisions have pointed out the complexities involved in setting environmental standards. Environmental health is composed of multiple causative agents, most of which work over long periods of time. This makes the cause-and-effect relationship between health statistics and environmental contaminant exposures difficult to prove in…

  8. English At Academic Setting

    Institute of Scientific and Technical Information of China (English)

    曹悦

    2008-01-01

    This article aims to help students notice that academic writing is an essential part of university study, and that setting, audience, purpose, and the discourse community with its expectations are all among its concerns. Through academic writing, students may begin to learn how to make sense in their particular field of study.

  9. SET-Routes programme

    CERN Document Server

    CERN audiovisual service

    2009-01-01

    The SET-Routes programme, launched in 2007 with the goal of attracting girls and young women to careers in science, came to an end in April this year. The result of a collaboration between EMBL, EMBO and CERN, the programme established a network of "ambassadors", women scientists who went out to talk about their careers in science at schools and universities across Europe.

  10. The Crystal Set

    Science.gov (United States)

    Greenslade, Thomas B., Jr.

    2014-01-01

    In past issues of this journal, the late H. R. Crane wrote a long series of articles under the running title of "How Things Work." In them, Dick dealt with many questions that physics teachers asked themselves, but did not have the time to answer. This article is my attempt to work through the physics of the crystal set, which I thought…

  11. SET-Routes programme

    CERN Multimedia

    Marietta Schupp, EMBL Photolab

    2008-01-01

    Dr Sabine Hentze, specialist in human genetics, giving an Insight Lecture entitled "Human Genetics – Diagnostics, Indications and Ethical Issues" on 23 September 2008 at EMBL Heidelberg. Activities in a school in Budapest during a visit by Angela Bekesi, ambassador for the SET-Routes programme.

  12. Therapists in Oncology Settings

    Science.gov (United States)

    Hendrick, Susan S.

    2013-01-01

    This article describes the author's experiences of working with cancer patients/survivors both individually and in support groups for many years, across several settings. It also documents current best-practice guidelines for the psychosocial treatment of cancer patients/survivors and their families. The author's view of the important qualities…

  13. Prices and Price Setting

    NARCIS (Netherlands)

    R.P. Faber (Riemer)

    2010-01-01

    textabstractThis thesis studies price data and tries to unravel the underlying economic processes of why firms have chosen these prices. It focuses on three aspects of price setting. First, it studies whether the existence of a suggested price has a coordinating effect on the prices of firms. Second

  14. Goal Setting and Hope

    Science.gov (United States)

    Curran, Katie; Reivich, Karen

    2011-01-01

    The science behind the mechanisms and mediators that lead to successful goal accomplishment has been a focus of research since the 1970s. When an individual desires to make a change or accomplish an outcome, research shows that he or she will be more successful if he or she attends to a number of variables that are key in goal setting.…

  15. Invariant Set Theory

    CERN Document Server

    Palmer, T N

    2016-01-01

    Invariant Set Theory (IST) is a realistic, locally causal theory of fundamental physics which assumes a much stronger synergy between cosmology and quantum physics than exists in contemporary theory. In IST the (quasi-cyclic) universe $U$ is treated as a deterministic dynamical system evolving precisely on a measure-zero fractal invariant subset $I_U$ of its state space. In this approach, the geometry of $I_U$, and not a set of differential evolution equations in space-time $\mathcal M_U$, provides the most primitive description of the laws of physics. As such, IST is non-classical. The geometry of $I_U$ is based on Cantor sets of space-time trajectories in state space, homeomorphic to the algebraic set of $p$-adic integers, for large but finite $p$. In IST, the non-commutativity of position and momentum observables arises from number theory - in particular the non-commensurateness of $\phi$ and $\cos \phi$. The complex Hilbert Space and the relativistic Dirac Equation respectively are shown to describe $I_U$...

  16. The Crystal Set

    Science.gov (United States)

    Greenslade, Thomas B., Jr.

    2014-01-01

    In past issues of this journal, the late H. R. Crane wrote a long series of articles under the running title of "How Things Work." In them, Dick dealt with many questions that physics teachers asked themselves, but did not have the time to answer. This article is my attempt to work through the physics of the crystal set, which I thought…

  17. setsApp for Cytoscape: Set operations for Cytoscape Nodes and Edges [v2; ref status: indexed, http://f1000r.es/5lz]

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2015-08-01

    setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. Automatic set partitioning and layout functions are also provided. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
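The standard operations the app exposes correspond directly to set algebra on node identifiers; a minimal sketch with hypothetical gene-node sets (the gene names are illustrative, not from the paper):

```python
# Hypothetical Cytoscape node-ID sets, mirroring the union/difference/
# intersection operations setsApp performs on sets of nodes.
pathway_a = {"TP53", "MDM2", "CDKN1A"}
pathway_b = {"MDM2", "AKT1", "CDKN1A"}

union = pathway_a | pathway_b          # nodes in either set
intersection = pathway_a & pathway_b   # nodes shared by both sets
difference = pathway_a - pathway_b     # nodes unique to pathway_a
```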

  18. Lunar hand tools

    Science.gov (United States)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  19. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact to these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  20. Authoring tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.

    1994-09-15

    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  1. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    Report NAWCADPAX/TR-2012/194: Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, Maryland, 26 June 2012.

  2. Speed-Selector Guard For Machine Tool

    Science.gov (United States)

    Shakhshir, Roda J.; Valentine, Richard L.

    1992-01-01

    Simple guardplate prevents accidental reversal of direction of rotation or sudden change of speed of lathe, milling machine, or other machine tool. Custom-made for specific machine and control settings. Allows control lever to be placed at only one setting. Operator uses handle to slide guard to engage or disengage control lever. Protects personnel from injury and equipment from damage occurring if speed- or direction-control lever inadvertently placed in wrong position.

  3. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  4. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  5. Qlikview Audit Tool (QLIKVIEW) -

    Data.gov (United States)

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  6. Double diameter boring tool

    Science.gov (United States)

    Ashbaugh, Fred N.; Murry, Kenneth R.

    1988-12-27

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
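The dimensional relationships stated in the abstract reduce to two formulas, sketched below with illustrative tip distances (units arbitrary):

```python
def boring_tool_geometry(r1, r2):
    """Dimensions from the double-diameter boring tool description.

    r1, r2: distances from the first and second cutting-edge tips to the
    axis of rotation (r2 > r1). Per the abstract, the tool-body diameter
    equals r1 + r2, and the tool centerline is offset from the axis of
    rotation by r2/2 - r1/2.
    """
    body_diameter = r1 + r2
    centerline_offset = r2 / 2.0 - r1 / 2.0
    return body_diameter, centerline_offset

# Illustrative tip distances for the two concentric bores.
diameter, offset = boring_tool_geometry(10.0, 15.0)
```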

  7. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  8. Social Set Visualizer

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Hussain, Abid; Vatrapu, Ravi

    2015-01-01

    This paper presents a state-of-the-art visual analytics dashboard, Social Set Visualizer (SoSeVi), of approximately 90 million Facebook actions from 11 different companies that have been mentioned in the traditional media in relation to garment factory accidents in Bangladesh. The enterprise application domain for the dashboard is Corporate Social Responsibility (CSR) and the targeted end-users are CSR researchers and practitioners. The design of the dashboard was based on the "social set analytics" approach to computational social science. The development of the dashboard involved cutting-edge open source visual analytics libraries from D3.js and creation of new visualizations (actor mobility across time, conversational comets, etc.). Evaluation of the dashboard, consisting of technical testing, usability testing, and domain-specific testing with CSR students, yielded positive results.

  9. Revitalizing the setting approach

    DEFF Research Database (Denmark)

    Bloch, Paul; Toft, Ulla; Reinbach, Helene Christine

    2014-01-01

    Background: The concept of health promotion rests on aspirations aiming at enabling people to increase control over and improve their health. Health promotion action is facilitated in settings such as schools, homes and work places. As a contribution to the promotion of healthy lifestyles, we have … everyday life. The supersetting approach argues for optimised effectiveness of health promotion action through integrated efforts and long-lasting partnerships involving a diverse range of actors in public institutions, private enterprises, non-governmental organisations and civil society. … Discussion: The supersetting approach is a further development of the setting approach in which the significance of integrated and coordinated actions together with a participatory approach are emphasised and important principles are specified, all of which contribute to the attainment of synergistic effects and sustainable …

  10. Energetic Causal Sets

    CERN Document Server

    Cortês, Marina

    2013-01-01

    We propose an approach to quantum theory based on the energetic causal sets, introduced in Cortês and Smolin (2013). Fundamental processes are causal sets whose events carry momentum and energy, which are transmitted along causal links and conserved at each event. Fundamentally there are amplitudes for such causal processes, but no space-time. An embedding of the causal processes in an emergent space-time arises only at the semiclassical level. Hence, fundamentally there are no commutation relations, no uncertainty principle and, indeed, no hbar. All that remains of quantum theory is the relationship between the absolute value squared of complex amplitudes and probabilities. Consequently, we find that neither locality nor non-locality is a primary concept; only causality exists at the fundamental level.

  11. Hesitant fuzzy sets theory

    CERN Document Server

    Xu, Zeshui

    2014-01-01

    This book provides the readers with a thorough and systematic introduction to hesitant fuzzy theory. It presents the most recent research results and advanced methods in the field. These include: hesitant fuzzy aggregation techniques, hesitant fuzzy preference relations, hesitant fuzzy measures, hesitant fuzzy clustering algorithms and hesitant fuzzy multi-attribute decision making methods. Since its introduction by Torra and Narukawa in 2009, hesitant fuzzy sets have become more and more popular and have been used for a wide range of applications, from decision-making problems to cluster analysis, from medical diagnosis to personnel appraisal and information retrieval. This book offers a comprehensive report on the state-of-the-art in hesitant fuzzy sets theory and applications, aiming at becoming a reference guide for both researchers and practitioners in the area of fuzzy mathematics and other applied research fields (e.g. operations research, information science, management science and engineering) chara...

  12. Setting goals in psychotherapy

    DEFF Research Database (Denmark)

    Emiliussen, Jakob; Wagoner, Brady

    2013-01-01

    The present study is concerned with the ethical dilemmas of setting goals in therapy. The main questions that it aims to answer are: who is to set the goals for therapy, and who is to decide when they have been reached? The study is based on four semi-structured, phenomenological interviews with psychologists, which were analyzed using the framework of the Interpretative Phenomenological Analysis (IPA), with minor changes to the procedure of categorization. Using Harré’s (2002, 2012) Positioning Theory, it is shown that determining goals and deciding if they have been reached are processes that are based on asymmetric collaboration between the therapist and the client. Determining goals and deciding when they are reached are not “sterile” procedures, as both the client and the therapist might have different agendas when working therapeutically. The psychologists that participated in this study …

  13. Consistent sets contradict

    CERN Document Server

    Kent, A

    1996-01-01

    In the consistent histories formulation of quantum theory, the probabilistic predictions and retrodictions made from observed data depend on the choice of a consistent set. We show that this freedom allows the formalism to retrodict several contradictory propositions which correspond to orthogonal commuting projections and which all have probability one. We also show that the formalism makes contradictory probability one predictions when applied to generalised time-symmetric quantum mechanics.

  14. The Moon has set

    Science.gov (United States)

    Herschberg, I. S.; Mebius, J. E.

    1989-08-01

    The Sappho epigram mentioned in the title is shown to contain implicit astronomical information, which must have contributed to the expressiveness of Sappho's short poem to contemporary audiences. Astronomical computations are given to discover the earliest and the latest time of year for which the Pleiads set at midnight while being visible earlier in the evening, taking into account the atmospheric refraction. The time of year for which Sappho's poem is valid is concluded to run from 17 Jan. to 29 Mar.

  15. Triage in military settings.

    Science.gov (United States)

    Falzone, E; Pasquier, P; Hoffmann, C; Barbier, O; Boutonnet, M; Salvadori, A; Jarrassier, A; Renner, J; Malgras, B; Mérat, S

    2017-02-01

    Triage, a medical term derived from the French word "trier", is the practical process of sorting casualties to rationally allocate limited resources. In combat settings with limited medical resources and long transportation times, triage is challenging since the objectives are to avoid overcrowding medical treatment facilities while saving a maximum of soldiers and to get as many of them back into action as possible. The new face of modern warfare, asymmetric and non-conventional, has led to the integrative evolution of triage into the theatre of operations. This article defines different triage scores and algorithms currently implemented in military settings. The discrepancies associated with these military triage systems are highlighted. The assessment of combat casualty severity requires several scores and each nation adopts different systems for triage on the battlefield with the same aim of quickly identifying those combat casualties requiring lifesaving and damage control resuscitation procedures. Other areas of interest for triage in military settings are discussed, including predicting the need for massive transfusion, haemodynamic parameters and ultrasound exploration.

  16. Expert System for Test Program Set Fault Candidate Selection

    Science.gov (United States)

    1989-09-01

    This report describes an application of expert system technology to test program set (TPS) verification and validation. The goals of this project are… Keywords: Expert system, Artificial intelligence, Automatic test equipment (ATE), Test program set (TPS), Automatic test program generation (ATPG), Fault inspection, Verification and validation, TPS acceptance tool.

  17. Installation of Ceramic Tile: Residential Thin-Set Methods.

    Science.gov (United States)

    Short, Sam

    This curriculum guide contains materials for use in teaching a course on residential thin-set methods of tile installation. Covered in the individual units are the following topics: the tile industry; basic math; tools; measurement; safety in tile setting; installation materials and guidelines for their use; floors; counter tops and backsplashes;…

  18. Setting Goals for Achievement in Physical Education Settings

    Science.gov (United States)

    Baghurst, Timothy; Tapps, Tyler; Kensinger, Weston

    2015-01-01

    Goal setting has been shown to improve student performance, motivation, and task completion in academic settings. Although goal setting is utilized by many education professionals to help students set realistic and proper goals, physical educators may not be using goal setting effectively. Without incorporating all three types of goals and…

  20. A note on rough set and non-measurable set

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    It is proved that a rough set is equivalent to a non-measurable set in measure theory. Hence, the rough set is, in some sense, not a new concept. At the same time, we define the measurable degree of a set by its inner measure and outer measure. Its special case is the accuracy measure of a rough set.
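
The accuracy measure mentioned here can be made concrete. A minimal Python sketch (the partition and target set below are toy data, not from the paper): a rough set is described by its lower approximation (classes entirely inside X) and upper approximation (classes overlapping X), and accuracy is the ratio of their sizes.

```python
def approximations(classes, X):
    """Lower/upper approximations of X under the partition `classes`
    induced by an indiscernibility relation."""
    lower, upper = set(), set()
    for c in classes:
        if c <= X:          # class entirely contained in X
            lower |= c
        if c & X:           # class overlaps X
            upper |= c
    return lower, upper

def accuracy(classes, X):
    # accuracy measure of the rough set (assumes X meets some class,
    # so the upper approximation is non-empty)
    lower, upper = approximations(classes, X)
    return len(lower) / len(upper)

# hypothetical universe {1..6} partitioned into three equivalence classes
classes = [{1, 2}, {3, 4}, {5, 6}]
X = {1, 2, 3}
lower, upper = approximations(classes, X)   # {1, 2} and {1, 2, 3, 4}
print(accuracy(classes, X))                 # 0.5
```

X is "rough" precisely when the two approximations differ, i.e. when accuracy is below 1.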

  1. Goal setting: an integral component of effective diabetes care.

    Science.gov (United States)

    Miller, Carla K; Bauman, Jennifer

    2014-08-01

    Goal setting is a widely used behavior change tool in diabetes education and training. Prior research found specific relatively difficult but attainable goals set within a specific timeframe improved performance in sports and at the workplace. However, the impact of goal setting in diabetes self-care has not received extensive attention. This review examined the mechanisms underlying behavioral change according to goal setting theory and evaluated the impact of goal setting in diabetes intervention studies. Eight studies were identified, which incorporated goal setting as the primary strategy to promote behavioral change in individual, group-based, and primary care settings among patients with type 2 diabetes. Improvements in diabetes-related self-efficacy, dietary intake, physical activity, and A1c were observed in some but not all studies. More systematic research is needed to determine the conditions and behaviors for which goal setting is most effective. Initial recommendations for using goal setting in diabetes patient encounters are offered.

  2. The effects of environment and ownership on children's innovation of tools and tool material selection.

    Science.gov (United States)

    Sheridan, Kimberly M; Konopasky, Abigail W; Kirkwood, Sophie; Defeyter, Margaret A

    2016-03-19

    Research indicates that in experimental settings, young children of 3-7 years old are unlikely to devise a simple tool to solve a problem. This series of exploratory studies done in museums in the US and UK explores how environment and ownership of materials may improve children's ability and inclination for (i) tool material selection and (ii) innovation. The first study takes place in a children's museum, an environment where children can use tools and materials freely. We replicated a tool innovation task in this environment and found that while 3-4 year olds showed the predicted low levels of innovation rates, 4-7 year olds showed higher rates of innovation than the younger children and than reported in prior studies. The second study explores the effect of whether the experimental materials are owned by the experimenter or the child on tool selection and innovation. Results showed that 5-6 year olds and 6-7 year olds were more likely to select tool material they owned compared to tool material owned by the experimenter, although ownership had no effect on tool innovation. We argue that learning environments supporting tool exploration and invention and conveying ownership over materials may encourage successful tool innovation at earlier ages.

  3. Pneumatically actuated hand tool

    NARCIS (Netherlands)

    Cool, J.C.; Rijnsaardt, K.A.

    1996-01-01

    Abstract of NL 9401195 (A) Pneumatically actuated hand tool for carrying out a mechanical operation, provided with an exchangeable gas cartridge in which the gas which is required for pneumatic actuation is stored. More particularly, the hand tool is provided with at least one pneumatic motor, at

  4. Coring Sample Acquisition Tool

    Science.gov (United States)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock cores. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  5. WATERS Expert Query Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Expert Query Tool is a web-based reporting tool using the EPA’s WATERS database. There are just three steps to using Expert Query: 1. View Selection – Choose what…

  6. Study of Tools Interoperability

    NARCIS (Netherlands)

    Krilavičius, T.

    2007-01-01

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools to work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out relation

  7. Maailma suurim tool

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu Näitused, the Tartu Art School (Tartu Kunstikool) and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in pavilion I of the Tartu fair centre on 9-11 March. 2000 chairs will be exhibited, from which a TOP 12 will be selected. The world's largest chair is planned to be erected on the grounds of the fair centre. At the same time, pavilion II hosts the twin fairs 'Sisustus 2000' ('Furnishing 2000') and 'Büroo 2000' ('Office 2000').

  8. Volume Sculpting Using the Level-Set Method

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2002-01-01

    In this paper, we propose the use of the Level-Set Method as the underlying technology of a volume sculpting system. The main motivation is that this leads to a very generic technique for deformation of volumetric solids. In addition, our method preserves a distance field volume representation. A scaling window is used to adapt the Level-Set Method to local deformations and to allow the user to control the intensity of the tool. Level-Set based tools have been implemented in an interactive sculpting system, and we show sculptures created using the system.
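
As a rough illustration of the idea (not the authors' implementation, which evolves a full 3D level-set PDE with proper upwind gradients and reinitialization), the 2D sketch below deforms a signed distance field only inside a falloff window around a hypothetical tool position, so the tool's effect is local and its intensity fades toward the window edge:

```python
import math

# 64x64 signed distance field of a circle of radius 20 (negative inside).
N = 64
cx = cy = N / 2
phi = [[math.hypot(x - cx, y - cy) - 20.0 for x in range(N)] for y in range(N)]

def sculpt(phi, tool_x, tool_y, radius, speed, dt=1.0):
    """Move the implicit surface with the given speed inside a scaling
    window centred on the tool; positive speed inflates, negative carves.
    (Simplified: omits the |grad phi| factor and reinitialization.)"""
    for y in range(N):
        for x in range(N):
            d = math.hypot(x - tool_x, y - tool_y)
            if d < radius:
                window = 1.0 - d / radius       # falloff toward window edge
                phi[y][x] -= dt * speed * window
    return phi

phi = sculpt(phi, 52, 32, 8, 2.0)   # inflate a bump on the right side
inside = sum(v < 0 for row in phi for v in row)  # voxels now inside the solid
```

Keeping the volume as a distance field (rather than a binary mask) is what makes such smooth, tool-intensity-controlled deformations possible.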

  9. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language …

  10. Software Tool Issues

    Science.gov (United States)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  11. A Cost Estimation Tool for Charter Schools

    Science.gov (United States)

    Hayes, Cheryl D.; Keller, Eric

    2009-01-01

    To align their financing strategies and fundraising efforts with their fiscal needs, charter school leaders need to know how much funding they need and what that funding will support. This cost estimation tool offers a simple set of worksheets to help start-up charter school operators identify and estimate the range of costs and timing of…

  12. Tool sharing in parallel part production

    NARCIS (Netherlands)

    Gaalman, G.J.C.; Nawijn, W.M.

    1996-01-01

    A group of identical NC machines, laid out in a line, manufactures relatively few part types in large batch size. Parts of the same type are processed by the machines simultaneously. The operations on a part are performed by one machine only, using a part type specific tool set. During batch product

  13. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  14. Ordered sets and lattices

    CERN Document Server

    Drashkovicheva, Kh; Igoshin, V I; Katrinyak, T; Kolibiar, M

    1989-01-01

    This book is another publication in the recent surveys of ordered sets and lattices. The papers, which might be characterized as "reviews of reviews," are based on articles reviewed in the Referativnyĭ Zhurnal: Matematika from 1978 to 1982. For the sake of completeness, the authors also attempted to integrate information from other relevant articles from that period. The bibliography of each paper provides references to the reviews in RZhMat and Mathematical Reviews where one can seek more detailed information. Specifically excluded from consideration in this volume were such topics as al...

  15. Symmetry Adapted Basis Sets

    DEFF Research Database (Denmark)

    Avery, John Scales; Rettrup, Sten; Avery, James Emil

    In theoretical physics, theoretical chemistry and engineering, one often wishes to solve partial differential equations subject to a set of boundary conditions. This gives rise to eigenvalue problems of which some solutions may be very difficult to find. For example, the problem of finding eigenfunctions and eigenvalues for the Hamiltonian of a many-particle system is usually so difficult that it requires approximate methods, the most common of which is expansion of the eigenfunctions in terms of basis functions that obey the boundary conditions of the problem. The computational effort needed...

  16. More on neutrosophic soft rough sets and its modification

    Directory of Open Access Journals (Sweden)

    Emad Marei

    2015-12-01

    Full Text Available This paper aims to introduce and discuss a new mathematical tool for dealing with uncertainties, which is a combination of neutrosophic sets, soft sets and rough sets, namely the neutrosophic soft rough set model. Also, its modification is introduced. Some of their properties are studied and supported with proved propositions and many counterexamples. Some rough relations are redefined as neutrosophic soft rough relations. Comparisons among the traditional rough model, the suggested neutrosophic soft rough model and its modification, using their properties and accuracy measures, are introduced. Finally, we illustrate that the classical rough set model can be viewed as a special case of the models suggested in this paper.

  17. Applied Approaches of Rough Set Theory to Web Mining

    Institute of Scientific and Technical Information of China (English)

    SUN Tie-li; JIAO Wei-wei

    2006-01-01

    Rough set theory is a new soft computing tool and has received much attention from researchers around the world. It can deal with incomplete and uncertain information. Now, it has been applied in many areas successfully. This paper introduces the basic concepts of rough set theory and discusses its applications in Web mining. In particular, some applications of rough set theory to intelligent information processing are emphasized.

  18. setsApp: Set operations for Cytoscape Nodes and Edges [v1; ref status: indexed, http://f1000r.es/3ml]

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2014-07-01

    Full Text Available setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
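
In plain Python terms, the three standard set operations that setsApp applies to groups of Cytoscape nodes behave as follows (the node/gene names here are hypothetical, purely for illustration):

```python
# Plain-Python analogue of the set operations setsApp performs on groups
# of Cytoscape nodes; the node (gene) names are hypothetical.
upregulated = {"TP53", "BRCA1", "MYC"}
tf_targets = {"MYC", "EGFR", "TP53"}

union = upregulated | tf_targets          # nodes in either set
intersection = upregulated & tf_targets   # nodes in both sets
difference = upregulated - tf_targets     # nodes only in the first set

print(sorted(intersection))  # ['MYC', 'TP53']
```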

  19. Program Development Tools and Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, M

    2012-03-12

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  20. Axiomatic set theory

    CERN Document Server

    Takeuti, Gaisi

    1973-01-01

    This text deals with three basic techniques for constructing models of Zermelo-Fraenkel set theory: relative constructibility, Cohen's forcing, and Scott-Solovay's method of Boolean valued models. Our main concern will be the development of a unified theory that encompasses these techniques in one comprehensive framework. Consequently we will focus on certain fundamental and intrinsic relations between these methods of model construction. Extensive applications will not be treated here. This text is a continuation of our book, "Introduction to Axiomatic Set Theory," Springer-Verlag, 1971; indeed the two texts were originally planned as a single volume. The content of this volume is essentially that of a course taught by the first author at the University of Illinois in the spring of 1969. From the first author's lectures, a first draft was prepared by Klaus Gloede with the assistance of Donald Pelletier and the second author. This draft was then revised by the first author assisted by Hisao Tanaka. The in...