WorldWideScience

Sample records for general data-reduction tool

  1. p3d: a general data-reduction tool for fiber-fed integral-field spectrographs

    CERN Document Server

    Sandin, C; Roth, M M; Gerssen, J; Monreal-Ibero, A; Böhm, P; Weilbacher, P

    2010-01-01

    The reduction of integral-field spectrograph (IFS) data is demanding work. Many repetitive operations are required in order to convert raw data into a (typically large) number of spectra. This effort can be markedly simplified through the use of a tool or pipeline that is designed to complete many of the repetitive operations without human interaction. Here we present our semi-automatic data-reduction tool p3d, which is designed to be used with fiber-fed IFSs. Important components of p3d include a novel algorithm for the automatic finding and tracing of spectra on the detector, and two methods of optimal spectrum extraction in addition to standard aperture extraction. p3d also provides tools to combine several images, perform wavelength calibration, and flat-field the data. p3d is currently configured for four IFSs. In order to evaluate its performance we have tested the different components of the tool, using both simulated and observational data. We demonstrate that for three of the IFSs a corr...
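
    A rough illustration of the difference between the standard aperture extraction and the optimal (profile-weighted) extraction mentioned above, for a single traced fiber, is sketched below. This is not p3d code; the Gaussian cross-dispersion profile, the array layout, and the parameter values are assumptions made for the example.

        import numpy as np

        def extract_fiber(frame, trace, var, profile_sigma=1.5, half_width=4):
            """Extract one fiber spectrum from a 2-D detector frame.

            frame : 2-D array (rows = cross-dispersion, cols = dispersion)
            trace : 1-D array, fiber centre row at each column
            var   : 2-D array of pixel variances (same shape as frame)
            Returns (aperture_spectrum, optimal_spectrum).
            """
            ncols = frame.shape[1]
            aperture = np.zeros(ncols)
            optimal = np.zeros(ncols)
            rows = np.arange(frame.shape[0])
            for x in range(ncols):
                sel = np.abs(rows - trace[x]) <= half_width
                data = frame[sel, x]
                v = var[sel, x]
                # assumed Gaussian cross-dispersion profile, normalised to unit sum
                p = np.exp(-0.5 * ((rows[sel] - trace[x]) / profile_sigma) ** 2)
                p /= p.sum()
                aperture[x] = data.sum()                        # plain aperture sum
                w = p / v                                       # inverse-variance profile weights
                optimal[x] = (w * data).sum() / (w * p).sum()   # Horne (1986)-style estimate
            return aperture, optimal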

  2. New Swift UVOT data reduction tools and AGN variability studies

    Science.gov (United States)

    Gelbord, Jonathan; Edelson, Rick

    2017-08-01

    The efficient slewing and flexible scheduling of the Swift observatory have made it possible to conduct monitoring campaigns that are both intensive and prolonged, with multiple visits per day sustained over weeks and months. Recent Swift monitoring campaigns of a handful of AGN provide simultaneous optical, UV and X-ray light curves that can be used to measure variability and interband correlations on timescales from hours to months, providing new constraints for the structures within AGN and the relationships between them. However, the first of these campaigns, thrice-per-day observations of NGC 5548 through four months, revealed anomalous dropouts in the UVOT light curves (Edelson, Gelbord, et al. 2015). We identified the cause as localized regions of reduced detector sensitivity that are not corrected by standard processing. Properly interpreting the light curves required identifying and screening out the affected measurements. We are now using archival Swift data to better characterize these low sensitivity regions. Our immediate goal is to produce a more complete mapping of their locations so that affected measurements can be identified and screened before further analysis. Our longer-term goal is to build a more quantitative model of the effect in order to define a correction for measured fluxes, if possible, or at least to put limits on the impact upon any observation. We will combine data from numerous background stars in well-monitored fields in order to quantify the strength of the effect as a function of filter as well as location on the detector, and to test for other dependencies such as evolution over time or sensitivity to the count rate of the target. Our UVOT sensitivity maps and any correction tools will be provided to the community of Swift users.

  3. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to line databases and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.

  4. Unique ion filter: a data reduction tool for GC/MS data preprocessing prior to chemometric analysis.

    Science.gov (United States)

    Adutwum, L A; Harynuk, J J

    2014-08-01

    Using raw GC/MS data as the X-block for chemometric modeling has the potential to provide better classification models for complex samples when compared to using the total ion current (TIC), extracted ion chromatograms/profiles (EIC/EIP), or integrated peak tables. However, the abundance of raw GC/MS data necessitates some form of data reduction/feature selection to remove the variables containing primarily noise from the data set. Several algorithms for feature selection exist; however, due to the extreme number of variables (10^6-10^8 variables per chromatogram), the feature selection time can be prolonged and computationally expensive. Herein, we present a new prefilter for automated data reduction of GC/MS data prior to feature selection. This tool, termed unique ion filter (UIF), is a module that can be added after chromatographic alignment and prior to any subsequent feature selection algorithm. The UIF objectively reduces the number of irrelevant or redundant variables in raw GC/MS data, while preserving potentially relevant analytical information. In the m/z dimension, data are reduced from a full spectrum to a handful of unique ions for each chromatographic peak. In the time dimension, data are reduced to only a handful of scans around each peak apex. UIF was applied to a data set of GC/MS data for a variety of gasoline samples to be classified using partial least-squares discriminant analysis (PLS-DA) according to octane rating. It was also applied to a series of chromatograms from casework fire debris analysis to be classified on the basis of whether or not signatures of gasoline were detected. By reducing the overall population of candidate variables subjected to subsequent variable selection, the UIF reduced the total feature selection time for which a perfect classification of all validation data was achieved from 373 to 9 min (98% reduction in computing time). Additionally, the significant reduction in included variables resulted in a concomitant
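
    The two reductions described above (a few characteristic m/z channels per chromatographic peak, and a few scans around each peak apex) can be sketched as follows. This is not the published UIF implementation; it simply keeps the most intense ions at each apex rather than strictly "unique" ones, and the peak list, window sizes, and array layout are assumptions.

        import numpy as np

        def unique_ion_filter(data, peak_apexes, n_ions=5, scan_window=3):
            """Reduce a GC/MS data matrix before feature selection.

            data        : 2-D array, shape (n_scans, n_mz) of ion abundances
            peak_apexes : list of scan indices of chromatographic peak apexes
            n_ions      : number of most intense m/z channels kept per peak
            scan_window : scans retained on each side of every apex
            Returns (reduced 1-D array of kept abundances, list of (scan, mz) kept).
            """
            keep = []
            for apex in peak_apexes:
                spectrum = data[apex]                        # mass spectrum at the apex
                top_mz = np.argsort(spectrum)[-n_ions:]      # most intense ions for this peak
                lo = max(apex - scan_window, 0)
                hi = min(apex + scan_window + 1, data.shape[0])
                for scan in range(lo, hi):
                    for mz in top_mz:
                        keep.append((scan, int(mz)))
            keep = sorted(set(keep))
            reduced = np.array([data[s, m] for s, m in keep])
            return reduced, keep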

  5. General purpose MDE tools

    Directory of Open Access Journals (Sweden)

    Juan Manuel Cueva Lovelle

    2008-12-01

    Full Text Available The MDE paradigm promises to release developers from writing code. The basis of this paradigm consists in working at a level of abstraction that makes it easier for analysts to detail the project to be undertaken. Using the model described by the analysts, software tools will do the rest of the task, generating software that complies with the customer's defined requirements. The purpose of this study is to compare currently available general-purpose tools that make it possible to put the principles of this paradigm into practice and that are aimed at generating a wide variety of applications composed of interactive multimedia and artificial intelligence components.

  6. GumTree: Data reduction

    Science.gov (United States)

    Rayner, Hugh; Hathaway, Paul; Hauser, Nick; Fei, Yang; Franceschini, Ferdi; Lam, Tony

    2006-11-01

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation.

  7. The Panchromatic High-Resolution Spectroscopic Survey of Local Group Star Clusters - I. General Data Reduction Procedures for the VLT/X-shooter UVB and VIS arm

    CERN Document Server

    Schönebeck, Frederik; Pasquali, Anna; Grebel, Eva K; Kissler-Patig, Markus; Kuntschner, Harald; Lyubenova, Mariya; Perina, Sibilla

    2014-01-01

    Our dataset contains spectroscopic observations of 29 globular clusters in the Magellanic Clouds and the Milky Way performed with VLT/X-shooter. Here we present detailed data reduction procedures for the VLT/X-shooter UVB and VIS arm. These are not restricted to our particular dataset, but are generally applicable to different kinds of X-shooter data without major limitation on the astronomical object of interest. The packaged pipeline provided by ESO (v1.5.0) performs well and reliably for the wavelength calibration and the associated rectification procedure, yet we find several weaknesses in the reduction cascade that are addressed with additional calibration steps, such as bad pixel interpolation, flat fielding, and slit illumination corrections. Furthermore, the instrumental PSF is analytically modeled and used to reconstruct flux losses at slit transit and for optimally extracting point sources. Regular observations of spectrophotometric standard stars allow us to detect instrumental variability, which n...

  8. General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  9. XRT -- ROSAT XRT Data Reduction

    Science.gov (United States)

    Davenhall, A. C.; Platon, R. T.

    XRT is a package for reducing data acquired with the ROSAT XRT instruments. The XRT (X-Ray Telescope) was the principal scientific payload of the ROSAT X-ray astronomy satellite. The XRT had two instruments: the PSPC (Position Sensitive Proportional Counter) and the HRI (High Resolution Imager). The XRT package operates on data produced by these instruments and can be used to transform them into calibrated images, spectra, time-series etc. XRT was created by taking the ROSAT XRT-specific functions in the ASTERIX general X-ray astronomy data reduction system and re-packaging them as stand-alone applications.

  10. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dablo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts on any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  11. General Mission Analysis Tool (GMAT) Mathematical Specifications

    Science.gov (United States)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  12. The MUSE Data Reduction Software Pipeline

    Science.gov (United States)

    Weilbacher, P. M.; Roth, M. M.; Pécontal-Rousset, A.; Bacon, R.; Muse Team

    2006-07-01

    After giving a short overview of the instrument characteristics of the second generation VLT instrument MUSE, we discuss what the data will look like and present the challenges and goals of its data reduction software. It is conceived as a number of pipeline recipes to be run in an automated way within the ESO data flow system. These recipes are based on a data reduction library that is being written in the C language using ESO's CPL API. We give a short overview of the steps needed for reduction and post-processing of science data, discuss the requirements of a future visualization tool for integral field spectroscopy, and close with the timeline for MUSE and its data reduction pipeline.

  13. A business intelligence approach using web search tools and online data reduction techniques to examine the value of product-enabled services

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Liotta, Giacomo; Kleismantas, Andrius

    2015-01-01

    in Canada and Europe. It adopts an innovative methodology based on online textual data that could be implemented in advanced business intelligence tools aiming at the facilitation of innovation, marketing and business decision making. Combinations of keywords referring to different aspects of service value...... were designed and used in a web search resulting in the frequency of their use on companies’ websites. Principal component analysis was applied to identify distinctive groups of keyword combinations that were interpreted in terms of specific service value attributes. Finally, the firms were classified...... by means of K-means cluster analysis in order to identify the firms with a high degree of articulation of their service value attributes. The results show that the main service value attributes of the Canadian firms are: better service effectiveness, higher market share, higher service quality...
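
    A toy reconstruction of the analysis chain described above (keyword-frequency matrix, principal component analysis, then K-means clustering of firms) is sketched below. The keyword matrix is synthetic; the paper's own keyword combinations, data, and cluster counts are not reproduced here.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # rows = firms, columns = frequency of each keyword combination on a firm's website
        keyword_freq = rng.poisson(lam=3.0, size=(60, 12)).astype(float)

        X = StandardScaler().fit_transform(keyword_freq)

        # principal components: groups of keyword combinations -> service value attributes
        pca = PCA(n_components=3)
        scores = pca.fit_transform(X)
        print("explained variance ratio:", pca.explained_variance_ratio_)

        # K-means on the component scores: firms with a similar articulation of value attributes
        clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
        print("cluster sizes:", np.bincount(clusters))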

  14. Robust methods for data reduction

    CERN Document Server

    Farcomeni, Alessio

    2015-01-01

    Robust Methods for Data Reduction gives a non-technical overview of robust data reduction techniques, encouraging the use of these important and useful methods in practical applications. The main areas covered include principal components analysis, sparse principal component analysis, canonical correlation analysis, factor analysis, clustering, double clustering, and discriminant analysis. The first part of the book illustrates how dimension reduction techniques synthesize available information by reducing the dimensionality of the data. The second part focuses on cluster and discriminant analy
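
    One simple way to make the principal components analysis mentioned above robust is to compute the components from a robust covariance estimate rather than the classical sample covariance. The sketch below illustrates that general idea using scikit-learn's Minimum Covariance Determinant estimator; it is an illustration only, not code or a specific method taken from the book.

        import numpy as np
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 5))
        X[:10] += 15.0                       # a few gross outliers

        # classical PCA would use the sample covariance, which the outliers inflate
        classical_cov = np.cov(X, rowvar=False)

        # a robust alternative: covariance from the Minimum Covariance Determinant fit
        robust_cov = MinCovDet(random_state=0).fit(X).covariance_

        for name, cov in [("classical", classical_cov), ("robust", robust_cov)]:
            eigvals = np.linalg.eigvalsh(cov)            # ascending order
            # the leading eigenvalue sets the variance of the first principal component
            print(name, "largest eigenvalue:", round(float(eigvals[-1]), 2))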

  15. 2dfdr: Data reduction software

    Science.gov (United States)

    AAO software Team

    2015-05-01

    2dfdr is an automatic data reduction pipeline dedicated to reducing multi-fibre spectroscopy data, with current implementations for AAOmega (fed by the 2dF, KOALA-IFU, SAMI Multi-IFU or older SPIRAL front-ends), HERMES, 2dF (spectrograph), 6dF, and FMOS. A graphical user interface is provided to control data reduction and allow inspection of the reduced spectra.

  16. General Analysis Tool Box for Controlled Perturbation

    CERN Document Server

    Osbild, Ralf

    2012-01-01

    The implementation of reliable and efficient geometric algorithms is a challenging task. The reason is the following conflict: On the one hand, computing with rounded arithmetic may question the reliability of programs while, on the other hand, computing with exact arithmetic may be too expensive and hence inefficient. One solution is the implementation of controlled perturbation algorithms which combine the speed of floating-point arithmetic with a protection mechanism that guarantees reliability, nonetheless. This paper is concerned with the question of how the performance of controlled perturbation algorithms can be analysed in theory. We answer this question by presenting a general analysis tool box. This tool box is separated into independent components which are presented individually with their interfaces. This way, the tool box supports alternative approaches for the derivation of the most crucial bounds. We present three approaches for this task. Furthermore, we have thoroughly reworked the concept of controlled per...

  17. Data Reduction with the MIKE Spectrometer

    CERN Document Server

    Bernstein, Rebecca A; Prochaska, J Xavier

    2015-01-01

    This manuscript describes the design, usage, and data-reduction pipeline developed for the Magellan Inamori Kyocera Echelle (MIKE) spectrometer used with the Magellan telescope at the Las Campanas Observatory. We summarize the basic characteristics of the instrument and discuss observational procedures recommended for calibrating the standard data products. We detail the design and implementation of an IDL-based data-reduction pipeline for MIKE data (since generalized to other echelle spectrometers, e.g. Keck/HIRES, VLT/UVES). This includes novel techniques for flat-fielding, wavelength calibration, and the extraction of echelle spectra. Sufficient detail is provided in this manuscript to enable inexperienced observers to understand the strengths and weaknesses of the instrument and software package, and to assess the related systematics.

  18. The CARMA Data Reduction Pipeline

    Science.gov (United States)

    Friedel, D. N.

    2013-10-01

    The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It applies passband, gain and flux calibration to the data sets and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for a year and this poster will discuss the current capabilities and planned improvements.

  19. CCD data reductions at ESO

    Science.gov (United States)

    Grosbol, Preben

    The image-processing and data-reduction functions of the IHAP and MIDAS software packages developed at ESO for CCD astronomy are briefly reviewed. IHAP and MIDAS perform the same basic operations on HP 1000 and VAX computers, respectively, and MIDAS is currently being modified to run in the UNIX operating system as well as in VAX VMS. Consideration is given to the special properties of CCD data, the removal of gross errors (due to bad pixels and cosmic-ray events), photometric correction for dark current and sensitivity variations, digital filtering and Fourier transforms, detection and classification algorithms for direct imaging, surface photometry of extended objects, function fitting, and image deconvolution.
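
    The photometric corrections listed above (bias removal, dark-current subtraction, flat-field sensitivity correction) reduce to simple per-pixel arithmetic. A generic sketch of that step, independent of IHAP or MIDAS and with assumed master-frame names and exposure scaling, is:

        import numpy as np

        def calibrate_ccd(raw, master_bias, master_dark, master_flat, exptime, dark_exptime):
            """Apply the standard CCD corrections to a raw frame.

            raw, master_bias, master_dark, master_flat : 2-D arrays of equal shape
            exptime, dark_exptime : exposure times in seconds
            (master_dark is assumed to still contain the bias level)
            """
            # remove the electronic zero level
            debiased = raw - master_bias
            # subtract dark current scaled to the science exposure time
            dark_scaled = (master_dark - master_bias) * (exptime / dark_exptime)
            dark_subtracted = debiased - dark_scaled
            # divide by the normalised flat field to correct pixel-to-pixel sensitivity
            flat = master_flat / np.median(master_flat)
            return dark_subtracted / flat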

  20. Using the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel

    2017-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material: the system user guide and technical documentation, a GMAT basics overview, and technical presentations from the TESS project on its application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  1. Data reduction for the MATISSE instrument

    CERN Document Server

    Millour, Florentin; Heininger, M; Hofmann, K -H; Schertl, D; Weigelt, G; Guitton, F; Jaffe, W; Beckmann, U; Petrov, R; Allouche, F; Robbe-Dubois, S; Lagarde, S; Soulain, A; Meilland, A; Matter, A; Cruzalèbes, P; Lopez, B

    2016-01-01

    We present in this paper the general formalism and data processing steps used in the MATISSE data reduction software, as it has been developed by the MATISSE consortium. The MATISSE instrument is the new-generation mid-infrared interferometric instrument of the Very Large Telescope Interferometer (VLTI). It is a 2-in-1 instrument with 2 cryostats and 2 detectors: one 2k x 2k Rockwell Hawaii 2RG detector for the L and M bands, and one 1k x 1k Raytheon Aquarius detector for the N band, both read at high frame rates, up to 30 frames per second. MATISSE is currently undergoing its first tests in the laboratory.

  2. Mercury and frame-dragging in light of the MESSENGER flybys: conflict with general relativity, poor knowledge of the physical properties of the Sun, data reduction artifact, or still insufficient observations?

    CERN Document Server

    Iorio, Lorenzo

    2011-01-01

    The Lense-Thirring precession of the longitude of perihelion of Mercury, as predicted by general relativity by using the value of the Sun's angular momentum S = 190 x 10^39 kg m^2 s^-1 from helioseismology, is -2.0 milliarcseconds per century, computed in a celestial equatorial reference frame. It disagrees at the 4σ level with the correction 0.4 +/- 0.6 milliarcseconds per century to the standard Newtonian/Einsteinian precession. It was recently determined in a global fit with the INPOP10a ephemerides to a long planetary data record (1914-2010) including also 3 data points collected in 2008-2009 from the MESSENGER spacecraft. The INPOP10a models did not include the solar gravitomagnetic field at all, so that its signature might have partly been removed in the data reduction process. On the other hand, the Lense-Thirring precession may have been canceled to a certain extent by the competing precession caused by a small mismodeling in the quadrupole mass moment of the Sun, actually modeled, of the order of...
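
    For context, the standard secular Lense-Thirring precessions of the node and of the argument of pericentre for a test particle orbiting a body of angular momentum S are quoted below (the -2.0 milliarcseconds per century value above additionally folds in the orientation of the Sun's spin axis with respect to the celestial equator):

        \dot{\Omega}_{\rm LT} = \frac{2 G S}{c^{2} a^{3} (1 - e^{2})^{3/2}},
        \qquad
        \dot{\omega}_{\rm LT} = -\frac{6 G S \cos i}{c^{2} a^{3} (1 - e^{2})^{3/2}},

    where a, e, and i are the semi-major axis, eccentricity, and inclination of the orbit with respect to the central body's equatorial plane.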

  3. General model for boring tool optimization

    Science.gov (United States)

    Moraru, G. M.; Zerbes, M. V.; Popescu, L. G.

    2016-08-01

    Optimizing a tool (and therefore a boring tool) consists in improving its performance by maximizing the objective functions chosen by the designer and/or by the user. Numerous features and performance requirements demanded by tool users contribute to defining and implementing the proposed objective functions. Incorporating new features makes the cutting tool competitive in the market and able to meet user requirements.

  4. Development of an expert data reduction assistant

    Science.gov (United States)

    Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.

    1993-01-01

    We propose the development of an expert system tool for the management and reduction of complex datasets. The proposed work is an extension of a successful prototype system for the calibration of CCD (charge coupled device) images developed by Dr. Johnston in 1987 (ref.: Proceedings of the Goddard Conference on Space Applications of Artificial Intelligence). The reduction of complex multi-parameter data sets presents severe challenges to a scientist. Not only must a particular data analysis system (e.g. IRAF/SDAS/MIDAS) be mastered; large amounts of data can require many days of tedious work and supervision by the scientist for even the most straightforward reductions. The proposed Expert Data Reduction Assistant will help the scientist overcome these obstacles by developing a reduction plan based on the data at hand and producing a script for the reduction of the data in a target common language.
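
    To make the idea of a reduction plan derived from the data at hand concrete, here is a toy sketch of a rule-based planner. It is purely illustrative: the frame types, rules, and output format are assumptions, not the design of the prototype described above.

        def build_reduction_plan(frames):
            """frames: list of dicts like {"file": "img001.fits", "type": "BIAS"}.
            Returns an ordered list of reduction steps based on what is available."""
            types = {f["type"] for f in frames}
            plan = []
            if "BIAS" in types:
                plan.append("combine bias frames -> master_bias")
            if "DARK" in types:
                plan.append("combine dark frames, subtract master_bias -> master_dark")
            if "FLAT" in types:
                plan.append("combine flats, subtract master_bias -> master_flat")
            for f in frames:
                if f["type"] == "OBJECT":
                    steps = ["subtract master_bias"]
                    if "DARK" in types:
                        steps.append("subtract scaled master_dark")
                    if "FLAT" in types:
                        steps.append("divide by master_flat")
                    plan.append(f["file"] + ": " + ", ".join(steps))
            return plan

        print("\n".join(build_reduction_plan([
            {"file": "bias1.fits", "type": "BIAS"},
            {"file": "flat1.fits", "type": "FLAT"},
            {"file": "sci1.fits", "type": "OBJECT"},
        ])))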

  5. Data Reduction Pipeline for GTC/FRIDA

    Science.gov (United States)

    Eliche-Moral, M. C.; Cardiel, N.; Pascual, S.; Gallego, J.

    2009-07-01

    FRIDA (inFRared Imager and Dissector for the Adaptive optics system of the GTC) will be a NIR (1-2.5 μm) imager and Integral Field Unit spectrograph to operate with the Adaptive Optics system of the 10.4 m GTC telescope. FRIDA will offer broad and narrow band diffraction-limited imaging and integral field spectroscopy at low, intermediate and high spectral resolution. The Extragalactic Astrophysics and Astronomical Instrumentation group of the Universidad Complutense de Madrid (GUAIX) is developing the Data Reduction Pipeline for FRIDA. Specific tools for converting output, reduced datacubes to the standard Euro3D FITS format will be developed, in order to allow users to exploit existing VO applications for analysis. FRIDA is to be commissioned on the telescope in 2011.

  6. Parallel Enhancements of the General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state of the art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)....

  7. Peak Wind Tool for General Forecasting

    Science.gov (United States)

    Barrett, Joe H., III

    2010-01-01

    The expected peak wind speed of the day is an important forecast element in the 45th Weather Squadron's (45 WS) daily 24-Hour and Weekly Planning Forecasts. The forecasts are used for ground and space launch operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45 WS also issues wind advisories for KSC/CCAFS when they expect wind gusts to meet or exceed 25 kt, 35 kt and 50 kt thresholds at any level from the surface to 300 ft. The 45 WS forecasters have indicated peak wind speeds are challenging to forecast, particularly in the cool season months of October - April. In Phase I of this task, the Applied Meteorology Unit (AMU) developed a tool to help the 45 WS forecast non-convective winds at KSC/CCAFS for the 24-hour period of 0800 to 0800 local time. The tool was delivered as a Microsoft Excel graphical user interface (GUI). The GUI displayed the forecast of peak wind speed, the 5-minute average wind speed at the time of the peak wind, the timing of the peak wind, and the probability that the peak speed would meet or exceed 25 kt, 35 kt and 50 kt. For the current task (Phase II), the 45 WS requested additional observations be used for the creation of the forecast equations by expanding the period of record (POR). Additional parameters were evaluated as predictors, including wind speeds between 500 ft and 3000 ft, static stability classification, Bulk Richardson Number, mixing depth, vertical wind shear, temperature inversion strength and depth, and wind direction. Using a verification data set, the AMU compared the performance of the Phase I and II prediction methods. Just as in Phase I, the tool was delivered as a Microsoft Excel GUI. The 45 WS requested the tool also be available in the Meteorological Interactive Data Display System (MIDDS). The AMU first expanded the POR by two years by adding tower observations, surface observations and CCAFS (XMR) soundings for the cool season months of March 2007 to April 2009. The POR was expanded
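
    The threshold probabilities mentioned above (peak speed meeting or exceeding 25, 35 or 50 kt) suggest a categorical model built on the candidate predictors. A hedged illustration of fitting one such threshold probability is sketched below; the data are synthetic and the predictor names are stand-ins, not the tool's actual equations or training set.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n = 500
        # synthetic stand-ins for two of the candidate predictors named above
        tower_wind = rng.gamma(shape=4.0, scale=5.0, size=n)     # kt
        mixing_depth = rng.uniform(500.0, 3000.0, size=n)        # ft
        peak_gust = 0.6 * tower_wind + 0.004 * mixing_depth + rng.normal(0, 3, n)

        X = np.column_stack([tower_wind, mixing_depth])
        y = (peak_gust >= 25.0).astype(int)                       # did the peak meet/exceed 25 kt?

        model = LogisticRegression().fit(X, y)
        # probability that the peak gust meets or exceeds 25 kt for one forecast case
        print(model.predict_proba([[30.0, 1500.0]])[0, 1])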

  8. General concepts and tools of political marketing

    Directory of Open Access Journals (Sweden)

    O.S. Teletov

    2013-06-01

    Full Text Available The aim of the article. The aim of the article is to consider political systems using the marketing concept, which has recently seen significant development in social life. The general laws of economic and social systems development are confirmed. The results of the analysis. It is known that systemic approaches to the solution of certain scientific problems make it possible to transfer properties of studied patterns to other systems. At the beginning of the twenty-first century, economy and politics intertwine with each other, so it is natural to consider the political events of recent years using marketing methodology, principles of market segmentation, research methods, and properties of certain elements of the marketing mix: the product life cycle graph, financial policy, instruments of marketing communications, and more. Nowadays the marketing approach is moving from the market of goods and services to the nonprofit sector, because many processes in this sphere are based on the use of market approaches and mechanisms. Presidential, parliamentary and local government elections are the most fitting objects for verifying the effectiveness of the marketing concept in politics. Political marketing is defined as a system of means and measures to create an image of a party or its leaders. This image has to meet the expectations of the potential electorate and stress the differences between one's own party and its competitors. The purpose of political marketing is to coordinate the steps and programme of the party or its leader with electoral expectations, to attract voters, and to form assessment criteria. Such criteria can be: a win or a significant share of the election results, the current rating of the party, and so on. The subject of research in political marketing is election campaigns, mass political and educational events, propaganda work, the lobbying process, the preparation and implementation of various projects, and work with political parties and public organizations. An applied

  9. Generalized Geophysical Retrieval and Analysis Tool for Planetary Atmospheres Project

    Data.gov (United States)

    National Aeronautics and Space Administration — CPI proposes to develop an innovative, generalized retrieval algorithm and analysis tool (GRANT) that will facilitate analysis of remote sensing data from both...

  10. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  11. General practice ethnicity data: evaluation of a tool

    Directory of Open Access Journals (Sweden)

    Neuwelt P

    2014-03-01

    Full Text Available INTRODUCTION: There is evidence that the collection of ethnicity data in New Zealand primary care is variable and that data recording in practices does not always align with the procedures outlined in the Ethnicity Data Protocols for the Health and Disability Sector. In 2010, the Ministry of Health funded the development of a tool to audit the collection of ethnicity data in primary care. The aim of this study was to pilot the Ethnicity Data Audit Tool (EAT) in general practice. The goal was to evaluate the tool and identify recommendations for its improvement. METHODS: Eight general practices in the Waitemata District Health Board region participated in the EAT pilot. Feedback about the pilot process was gathered by questionnaires and interviews, to gain an understanding of practices’ experiences in using the tool. Questionnaire and interview data were analysed using a simple analytical framework and a general inductive method. FINDINGS: General practice receptionists, practice managers and general practitioners participated in the pilot. Participants found the pilot process challenging but enlightening. The majority felt that the EAT was a useful quality improvement tool for handling patient ethnicity data. Larger practices were the most positive about the tool. CONCLUSION: The findings suggest that, with minor improvements to the toolkit, the EAT has the potential to lead to significant improvements in the quality of ethnicity data collection and recording in New Zealand general practices. Other system-level factors also need to be addressed.

  12. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable for ... In this paper the generalizations are applied to some simulated data sets and to the Canadian lynx data. The generalizations seem to perform well and the measure of the departure from linearity proves to be an important additional tool ...
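
    For reference, the classical sample autocorrelation function being generalized here is the lag-k sample covariance divided by the sample variance. A plain implementation (ours, not the authors') is:

        import numpy as np

        def sample_acf(x, max_lag):
            """Classical sample autocorrelation function for lags 0..max_lag."""
            x = np.asarray(x, dtype=float)
            x = x - x.mean()
            denom = np.sum(x * x)
            return np.array([np.sum(x[k:] * x[:len(x) - k]) / denom
                             for k in range(max_lag + 1)])

        # example: an AR(1)-like series shows geometrically decaying autocorrelation
        rng = np.random.default_rng(3)
        y = np.zeros(500)
        for t in range(1, 500):
            y[t] = 0.7 * y[t - 1] + rng.normal()
        print(sample_acf(y, 5).round(2))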

  13. Delivering data reduction pipelines to science users

    Science.gov (United States)

    Freudling, Wolfram; Romaniello, Martino

    2016-07-01

    The European Southern Observatory has a long history of providing specialized data processing algorithms, called recipes, for most of its instruments. These recipes are used for both operational purposes at the observatory sites, and for data reduction by the scientists at their home institutions. The two applications require substantially different environments for running and controlling the recipes. In this paper, we describe the ESOReflex environment that is used for running recipes on the users' desktops. ESOReflex is a workflow-driven data reduction environment. It allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection of and interaction with the data. It includes fully automatic data organization and visualization, interaction with recipes, and the exploration of the provenance tree of intermediate and final data products. ESOReflex uses a number of innovative concepts that have been described in Ref. 1. In October 2015, the complete system was released to the public. ESOReflex allows highly efficient data reduction, using its internal bookkeeping database to recognize and skip previously completed steps during repeated processing of the same or similar data sets. It has been widely adopted by the science community for the reduction of VLT data.

  14. The pipeline for the GOSSS data reduction

    CERN Document Server

    Sota, Alfredo

    2011-01-01

    The Galactic O-Star Spectroscopic Survey (GOSSS) is an ambitious project that is observing all known Galactic O stars with B < 13 in the blue-violet part of the spectrum at R ~ 2500. It is based on version 2 of the most complete catalog to date of Galactic O stars with accurate spectral types (v1, Maíz Apellániz et al. 2004; v2, Sota et al. 2008). Given the large amount of data that we are getting (more than 150 nights of observations at three different observatories in the last 4 years), we have developed an automatic spectroscopic reduction pipeline. This pipeline has been programmed in IDL and automates the process of data reduction. It can operate in two modes: automatic data reduction (quicklook) or semi-automatic data reduction (full). In "quicklook", we are able to get rectified and calibrated spectra of all stars of a full night just minutes after the observations. The pipeline automatically identifies the type of image and applies the standard reduction procedure (bias subtraction, flat field c...

  15. CADRE: The CArma Data REduction pipeline

    Science.gov (United States)

    Friedel, D. N.

    2013-08-01

    The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the CARMA data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It goes through the typical reduction procedures for radio telescope array data and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for nearly two years and this paper presents the current capabilities and planned development.

  16. CADRE: The CArma Data REduction pipeline

    CERN Document Server

    Friedel, D N

    2013-01-01

    The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the CARMA data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It goes through the typical reduction procedures for radio telescope array data and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for nearly two years and this paper presents the current capabilities and planned development.

  17. Infrared Imaging Data Reduction Software and Techniques

    CERN Document Server

    Sabbey, Chris N.; McMahon, Richard G.; Lewis, James R.; Irwin, Mike J.

    2001-01-01

    We describe the InfraRed Data Reduction (IRDR) software package, a small ANSI C library of fast image processing routines for automated pipeline reduction of infrared (dithered) observations. We developed the software to satisfy certain design requirements not met in existing packages (e.g., full weight map handling) and to optimize the software for large data sets (non-interactive tasks that are CPU and disk efficient). The software includes stand-alone C programs for tasks such as running sky frame subtraction with object masking, image registration and coaddition with weight maps, dither offset measurement using cross-correlation, and object mask dilation. Although we currently use the software to process data taken with CIRSI (a near-IR mosaic imager), the software is modular and concise and should be easy to adapt/reuse for other work. IRDR is available from anonymous ftp to ftp.ast.cam.ac.uk in pub/sabbey.
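
    As a concrete illustration of one of the stand-alone tasks listed above, measuring dither offsets by cross-correlation, the following sketch locates the whole-pixel shift between two frames from the peak of their FFT-based cross-correlation. It is not IRDR's C code; the synthetic images and the integer-pixel assumption are for illustration only.

        import numpy as np

        def dither_offset(ref, img):
            """Integer-pixel (dy, dx) such that np.roll(img, (dy, dx), axis=(0, 1)) matches ref."""
            f_ref = np.fft.fft2(ref - ref.mean())
            f_img = np.fft.fft2(img - img.mean())
            xcorr = np.fft.ifft2(f_ref * np.conj(f_img)).real
            dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
            # map wrapped indices to signed offsets
            if dy > ref.shape[0] // 2:
                dy -= ref.shape[0]
            if dx > ref.shape[1] // 2:
                dx -= ref.shape[1]
            return dy, dx

        rng = np.random.default_rng(4)
        base = rng.normal(size=(64, 64))
        dithered = np.roll(base, shift=(5, -3), axis=(0, 1))   # frame offset by (+5, -3) pixels
        print(dither_offset(dithered, base))                   # -> (5, -3)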

  18. The e-MERLIN Data Reduction Pipeline

    CERN Document Server

    Argo, Megan

    2015-01-01

    Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.

  19. e-MERLIN data reduction pipeline

    Science.gov (United States)

    Argo, Megan

    2014-07-01

    Written in Python and utilizing ParselTongue (ascl:1208.020) to interface with AIPS (ascl:9911.003), the e-MERLIN data reduction pipeline processes, calibrates and images data from the UK's radio interferometric array (Multi-Element Remote-Linked Interferometer Network). Driven by a plain text input file, the pipeline is modular and can be run in stages. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent (ascl:1312.001), carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so data quality can be assessed.

  20. The e-MERLIN Data Reduction Pipeline

    Directory of Open Access Journals (Sweden)

    Megan Kirsty Argo

    2015-01-01

    Full Text Available Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.

  1. The GALAH survey: The data reduction pipeline

    CERN Document Server

    Kos, Janez; Zwitter, Tomaž; Žerjal, Maruška; Sharma, Sanjib; Bland-Hawthorn, Joss; Asplund, Martin; Casey, Andrew R; De Silva, Gayandhi M; Freeman, Ken C; Martell, Sarah L; Simpson, Jeffrey D; Schlesinger, Katharine J; Zucker, Daniel; Anguiano, Borja; Bacigalupo, Carlos; Bedding, Timothy R; Betters, Christopher; Da Costa, Gary; Duong, Ly; Hyde, Elaina; Ireland, Michael; Kafle, Prajwal R; Leon-Saval, Sergio; Lewis, Geraint F; Munari, Ulisse; Nataf, David; Stello, Dennis; Tinney, Chris G; Traven, Gregor; Watson, Fred; Wittenmyer, Robert A

    2016-01-01

    We present the data reduction procedures being used by the GALAH survey, carried out with the HERMES fibre-fed, multi-object spectrograph on the 3.9 m Anglo-Australian Telescope. GALAH is a unique survey, targeting 1 million stars brighter than magnitude V=14 at a resolution of 28,000 with a goal to measure the abundances of 29 elements. Such a large number of high resolution spectra necessitates the development of a reduction pipeline optimized for speed, accuracy, and consistency. We outline the design and structure of the IRAF-based reduction pipeline that we developed, specifically for GALAH, to produce fully calibrated spectra aimed for subsequent stellar atmospheric parameter estimation. The pipeline takes advantage of existing IRAF routines and other readily available software so as to be simple to maintain, testable and reliable. A radial velocity and stellar atmospheric parameter estimator code is also presented, which is used for further data analysis and yields a useful verification of the reductio...

  2. Data Reduction of Multi-wavelength Observations

    CERN Document Server

    Pilia, M; Pellizzoni, A P; Bachetti, M; Piano, G; Poddighe, A; Egron, E; Iacolina, M N; Melis, A; Concu, R; Possenti, A; Perrodin, D

    2015-01-01

    Multi-messenger astronomy is becoming the key to understanding the Universe from a comprehensive perspective. In most cases, the data and the technology are already in place, therefore it is important to provide an easily-accessible package that combines datasets from multiple telescopes at different wavelengths. In order to achieve this, we are working to produce a data analysis pipeline that allows the data reduction from different instruments without needing detailed knowledge of each observation. Ideally, the specifics of each observation are automatically dealt with, while the necessary information on how to handle the data in each case is provided by a tutorial that is included in the program. We first focus our project on the study of pulsars and their wind nebulae (PWNe) at radio and gamma-ray frequencies. In this way, we aim to combine time-domain and imaging datasets at two extremes of the electromagnetic spectrum. In addition, the emission has the same non-thermal origin in pulsars at radio and gam...

  3. The GALAH survey: the data reduction pipeline

    Science.gov (United States)

    Kos, Janez; Lin, Jane; Zwitter, Tomaž; Žerjal, Maruška; Sharma, Sanjib; Bland-Hawthorn, Joss; Asplund, Martin; Casey, Andrew R.; De Silva, Gayandhi M.; Freeman, Ken C.; Martell, Sarah L.; Simpson, Jeffrey D.; Schlesinger, Katharine J.; Zucker, Daniel; Anguiano, Borja; Bacigalupo, Carlos; Bedding, Timothy R.; Betters, Christopher; Da Costa, Gary; Duong, Ly; Hyde, Elaina; Ireland, Michael; Kafle, Prajwal R.; Leon-Saval, Sergio; Lewis, Geraint F.; Munari, Ulisse; Nataf, David; Stello, Dennis; Tinney, C. G.; Traven, Gregor; Watson, Fred; Wittenmyer, Robert A.

    2017-01-01

    We present the data reduction procedures being used by the GALactic Archeology with Hermes (GALAH) survey, carried out with the HERMES fibre-fed, multi-object spectrograph on the 3.9-m Anglo-Australian Telescope. GALAH is a unique survey, targeting 1 million stars brighter than magnitude V = 14 at a resolution of 28 000 with a goal to measure the abundances of 29 elements. Such a large number of high-resolution spectra necessitate the development of a reduction pipeline optimized for speed, accuracy, and consistency. We outline the design and structure of the IRAF-based reduction pipeline that we developed, specifically for GALAH, to produce fully calibrated spectra aimed for subsequent stellar atmospheric parameter estimation. The pipeline takes advantage of existing IRAF routines and other readily available software so as to be simple to maintain, testable, and reliable. A radial velocity and stellar atmospheric parameter estimator code is also presented, which is used for further data analysis and yields a useful verification of the reduction quality. We have used this estimator to quantify the data quality of GALAH for fibre cross-talk level (≲0.5 per cent) and scattered light (˜5 counts in a typical 20 min exposure), resolution across the field, sky spectrum properties, wavelength solution reliability (better than 1 km s-1 accuracy), and radial velocity precision.

  4. The Infrared Imaging Spectrograph (IRIS) for TMT: data reduction system

    Science.gov (United States)

    Walth, Gregory; Wright, Shelley A.; Weiss, Jason; Larkin, James E.; Moore, Anna M.; Chapin, Edward L.; Do, Tuan; Dunn, Jennifer; Ellerbroek, Brent; Gillies, Kim; Hayano, Yutaka; Johnson, Chris; Marshall, Daniel; Riddle, Reed L.; Simard, Luc; Sohn, Ji Man; Suzuki, Ryuji; Wincentsen, James

    2016-08-01

    IRIS (InfraRed Imaging Spectrograph) is the diffraction-limited first light instrument for the Thirty Meter Telescope (TMT) that consists of a near-infrared (0.84 to 2.4 μm) imager and integral field spectrograph (IFS). The IFS makes use of a lenslet array and slicer for spatial sampling, which will be able to operate in hundreds of different modes, including a combination of four plate scales from 4 milliarcseconds (mas) to 50 mas with a large range of filters and gratings. The imager will have a field of view of 34×34 arcsec² with a plate scale of 4 mas with many selectable filters. We present the preliminary design of the data reduction system (DRS) for IRIS that needs to address all of these observing modes. Reduction of IRIS data will have unique challenges since it will provide real-time reduction and analysis of the imaging and spectroscopic data during observational sequences, as well as advanced post-processing algorithms. The DRS will support three basic modes of operation of IRIS: reducing data from the imager, the lenslet IFS, and the slicer IFS. The DRS will be written in Python, making use of available open-source astronomical packages. In addition to real-time data reduction, the DRS will utilize real-time visualization tools, providing astronomers with up-to-date evaluation of the target acquisition and data quality. The quick look suite will include visualization tools for 1D, 2D, and 3D raw and reduced images. We discuss the overall requirements of the DRS and visualization tools, as well as necessary calibration data to achieve optimal data quality in order to exploit science cases across all cosmic distance scales.

  5. Input Range Testing for the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script!! The examples are illustrated using a scripting perspective, because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.

  6. Verification and Validation of the General Mission Analysis Tool (GMAT)

    Science.gov (United States)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build and test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  7. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    Science.gov (United States)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  8. Generalized Analysis Tools for Multi-Spacecraft Missions

    Science.gov (United States)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 90's to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On the one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful, as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights given to spacecraft allow one to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
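
    The least-squares approach mentioned first, which works for any number of spacecraft, amounts to fitting a locally linear model f(r) ≈ f(r0) + g·(r - r0) to the simultaneous measurements. A compact generic sketch of that fit (not the barycentric/reciprocal-vector formalism of this work) is:

        import numpy as np

        def ls_gradient(positions, values):
            """Least-squares estimate of a scalar field's gradient from N >= 4 spacecraft.

            positions : (N, 3) array of spacecraft positions
            values    : (N,) array of the scalar measured at each spacecraft
            Returns the estimated 3-vector gradient (assumes a locally linear field).
            """
            positions = np.asarray(positions, float)
            values = np.asarray(values, float)
            dr = positions - positions.mean(axis=0)       # offsets from the barycentre
            dv = values - values.mean()
            grad, *_ = np.linalg.lstsq(dr, dv, rcond=None)
            return grad

        # four-point "tetrahedron" sampling a field f(r) = 2x - y + 0.5z
        pos = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
        vals = 2 * pos[:, 0] - pos[:, 1] + 0.5 * pos[:, 2]
        print(ls_gradient(pos, vals))   # ~ [2.0, -1.0, 0.5]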

  9. IFSRED: Data Reduction for Integral Field Spectrographs

    Science.gov (United States)

    Rupke, David S. N.

    2014-09-01

    IFSRED is a general-purpose library for reducing data from integral field spectrographs (IFSs). For a general IFS data cube, it contains IDL routines to: (1) find and apply a zero-point shift in a wavelength solution on a spaxel-by-spaxel basis, using sky lines; (2) find the spatial coordinates of a flux peak; (3) empirically correct for differential atmospheric refraction; (4) mosaic dithered exposures; (5) (integer) rebin; and (6) apply a telluric correction. A sky-subtraction routine for data from the Gemini Multi-Object Spectrograph and Imager (GMOS) that can be easily modified for any instrument is also included. IFSRED also contains additional software specific to reducing data from GMOS and the Gemini Near-Infrared Integral Field Spectrograph (NIFS).
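
    As a toy illustration of what the "(integer) rebin" step does on an IFS cube, here is a generic numpy sketch (not the IDL routine in IFSRED; array shapes are assumed to be exact multiples of the binning factors):

```python
import numpy as np

def rebin_cube(cube, fy, fx):
    """Integer-rebin an IFS data cube of shape (nwave, ny, nx) by summing
    fy x fx blocks of spaxels; ny and nx must be multiples of fy and fx."""
    nw, ny, nx = cube.shape
    return cube.reshape(nw, ny // fy, fy, nx // fx, fx).sum(axis=(2, 4))

# Example: bin a (2048, 60, 60)-spaxel cube 2x2 spatially -> (2048, 30, 30)
cube = np.random.default_rng(1).normal(size=(2048, 60, 60))
print(rebin_cube(cube, 2, 2).shape)
```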

  10. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft]

    Science.gov (United States)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  11. Publishing nutrition research: a review of multivariate techniques--part 3: data reduction methods.

    Science.gov (United States)

    Gleason, Philip M; Boushey, Carol J; Harris, Jeffrey E; Zoellner, Jamie

    2015-07-01

    This is the ninth in a series of monographs on research design and analysis, and the third in a set of these monographs devoted to multivariate methods. The purpose of this article is to provide an overview of data reduction methods, including principal components analysis, factor analysis, reduced rank regression, and cluster analysis. In the field of nutrition, data reduction methods can be used for three general purposes: for descriptive analysis in which large sets of variables are efficiently summarized, to create variables to be used in subsequent analysis and hypothesis testing, and in questionnaire development. The article describes the situations in which these data reduction methods can be most useful, briefly describes how the underlying statistical analyses are performed, and summarizes how the results of these data reduction methods should be interpreted.
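
    A minimal sketch of the first use case, summarizing a large set of dietary variables with a few principal components; the data here are simulated and the preprocessing choices are illustrative only:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical intake matrix: 200 participants x 25 food-group variables
rng = np.random.default_rng(0)
intake = rng.gamma(shape=2.0, scale=1.0, size=(200, 25))

# Standardize the items, then keep a few principal components as
# "dietary pattern" scores for use in later models or hypothesis tests
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(intake))
print(scores.shape)   # (200, 3)
```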

  12. Exposure Assessment Tools by Lifestages and Populations - General Population

    Science.gov (United States)

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  13. DNA – A General Energy System Simulation Tool

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to be able to handle models of any kind of energy system based on the control volume approach, usually systems of lumped parameter components. DNA has proven to be a useful tool in the analysis and optimization of several types of thermal systems: steam turbines, gas turbines, fuel cells, gasification, refrigeration and heat pumps for both conventional fossil fuels and different types of biomass. DNA is applicable for models of both steady state and dynamic operation. The program decides at runtime to apply the DAE solver if the system contains differential equations. This makes it easy to extend an existing steady state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also...

  14. A general thermal model of machine tool spindle

    Directory of Open Access Journals (Sweden)

    Yanfang Dong

    2017-01-01

    As the core component of a machine tool, the spindle's thermal characteristics have a significant influence on machine tool running status. The lack of an accurate model of the spindle system, particularly a model of the load–deformation coefficient between the bearing rolling elements and rings, severely limits the thermal error analytic precision of the spindle. In this article, bearing internal loads, especially the functional relationships between the principal curvature difference F(ρ) and the auxiliary parameter nδ, semi-major axis a, and semi-minor axis b, have been determined; furthermore, high-precision heat generation combining the heat sinks in the spindle system is calculated; finally, an accurate thermal model of the spindle is established. Moreover, a conventional spindle with embedded fiber Bragg grating temperature sensors has been developed. Comparison of the experimental results with the simulation indicates that the model has good accuracy, which verifies the reliability of the modeling process.

  15. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable for structural identification of non-linear time series models. A similar generalization of the sample cross correlation function is discussed. Furthermore, a measure of the departure from linearity is suggested. It is shown how bootstrapping can be applied to construct confidence intervals under independence...
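
    For reference, the classical sample autocorrelation function that the paper generalizes can be computed with a plain numpy sketch such as the following (variable names and the AR(1) example are illustrative):

```python
import numpy as np

def sacf(x, max_lag):
    """Sample autocorrelation function of a univariate series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = x @ x
    return np.array([x[:len(x) - k] @ x[k:] / denom for k in range(max_lag + 1)])

# Example: an AR(1)-like series has slowly decaying autocorrelations
rng = np.random.default_rng(0)
e = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + e[t]
print(np.round(sacf(y, 5), 2))
```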

  16. Generalized Aliasing as a Basis for Program Analysis Tools

    Science.gov (United States)

    2000-11-01

    applications are described in the next chapter, in Section 9.2.2.) For example, the Ladybug specification checker tool [44] has a user interface shell...any particular implementation of the interface. At run time, Ladybug uses reflection to load the engine class by name and create an object of that...supplied with Sun’s JDK 1.1.7 Jess Java Expert System Shell version 4.4, from Sandia National Labs [35] Ladybug The Ladybug specification checker, by Craig

  17. F-111C Flight Data Reduction and Analysis Procedures

    Science.gov (United States)

    1990-12-01

    Flight Mechanics Report 187: F-111C Flight Data Reduction and Analysis Procedures, by M. I. Cooper, J. S. Drobik, and C. A. Martin (Research Laboratory, Melbourne, Victoria).

  18. Cure-WISE: HETDEX Data Reduction with Astro-WISE

    Science.gov (United States)

    Snigula, J. M.; Drory, N.; Fabricius, M.; Landriau, M.; Montesano, F.; Hill, G. J.; Gebhardt, K.; Cornell, M. E.

    2014-05-01

    The Hobby-Eberly Telescope Dark Energy Experiment (HETDEX, Hill et al. 2012b) is a blind spectroscopic survey to map the evolution of dark energy using Lyman-alpha emitting galaxies at redshifts 1.9 < z < 3.5. The telescope currently receives a wide-field upgrade (Hill et al. 2012a) to accommodate the spectrographs and to provide the needed field of view. Over the projected five year run of the survey we expect to obtain approximately 170 GB of data each night. For the data reduction we developed the Cure pipeline, to automatically find and calibrate the observed spectra, subtract the sky background, and detect and classify different types of sources. Cure employs rigorous statistical methods and complete pixel-level error propagation throughout the reduction process to ensure Poisson-limited performance and meaningful significance values. To automate the reduction of the whole dataset we implemented the Cure pipeline in the Astro-WISE framework. This integration provides for HETDEX a database backend with complete dependency tracking of the various reduction steps, automated checks, a searchable interface to the detected sources, and user management. It can be used to create various web interfaces for data access and quality control. Astro-WISE allows us to reduce the data from all the IFUs in parallel on a compute cluster. This cluster allows us to reduce the observed data in quasi real time and still have excess capacity for rerunning parts of the reduction. Finally, the Astro-WISE interface will be used to provide access to reduced data products to the general community.

  19. The DEEP-South: Scheduling and Data Reduction Software System

    Science.gov (United States)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective of the DEEP-South is the physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction, and analysis of huge amounts of data with minimal human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using a Database Management System (DBMS). The LDR is designed to detect moving objects in CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on the analysis made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.

  20. Physical activity in adolescents – Accelerometer data reduction criteria

    DEFF Research Database (Denmark)

    Toftager, Mette; Breum, Lars; Kristensen, Peter Lund

    Introduction: Accelerometry is increasingly being recognized as an accurate and reliable method to assess free-living physical activity (PA). However, reporting of accelerometer data reduction and methods remains inconsistent. In this study we investigated the impact of different data reduction criteria: number of valid days (1, 2, 3, 4, 5, 6 and 7 days), daily wear time (6, 8, 9, 10 and 12 h/day) and non-wear time (10, 20, 30, 60 and 90 min of consecutive zeroes). The open source software Propero Actigraph Data Analyzer was used to compare the effects of the selected criteria on participant inclusion...

  1. My Family Health Portrait, A tool from the Surgeon General | NIH MedlinePlus the Magazine

    Science.gov (United States)

    My Family Health Portrait, a tool from the Surgeon General. Why is it important to know my family medical history? Your family medical history is a ...

  2. MUSE: Design and Status of the Data Reduction Pipeline

    Science.gov (United States)

    Weilbacher, P.; Gerssen, J.; Roth, M. M.; Böhm, P.; Muse Team

    We briefly summarize instrument properties of the future second generation VLT instrument MUSE, a giant integral field spectrograph, and describe the layout of the data it will provide. The current design of the data reduction pipeline is presented along with a project timeline.

  3. The Power of Data Reduction : Kernels for Fundamental Graph Problems

    NARCIS (Netherlands)

    Jansen, B.M.P.

    2013-01-01

    The purpose of this thesis is to give a mathematical analysis of the power of data reduction for dealing with fundamental NP-hard graph problems. It has often been observed that the use of heuristic reduction rules in a preprocessing phase gives significant performance gains when solving such problems ...

  4. Constant temperature hot wire anemometry data reduction procedure

    Science.gov (United States)

    Klopfer, G. H.

    1974-01-01

    The theory and data reduction procedure for constant temperature hot wire anemometry are presented. The procedure is valid for all Mach and Prandtl numbers, but limited to Reynolds numbers based on wire diameter between 0.1 and 300. The fluids are limited to gases which approximate ideal gas behavior. Losses due to radiation, free convection and conduction are included.

  5. Intelligent data reduction - A preliminary investigation. [spacecraft subsystem telemetry

    Science.gov (United States)

    Ford, Donnie R.; Weeks, David J.

    1988-01-01

    Research being undertaken to develop expert systems for reducing telemetry data from spacecraft is described. The use of the Hubble Space Telescope Electrical Power System as a testbed is examined. The Nickel Cadmium Battery Expert System is briefly addressed, and the I-DARE (Intelligent Data Reduction) prototype system is discussed.

  6. Individual and social learning processes involved in the acquisition and generalization of tool use in macaques

    Science.gov (United States)

    Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.

    2012-01-01

    Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424

  7. Tool for decision-making regarding general evacuation during a rapid river flood.

    Science.gov (United States)

    Radosavljevic, V; Belojevic, G; Pavlovic, N

    2017-05-01

    To propose a simple and effective tool for decision-making regarding general evacuation during a rapid river flood. Virtual testing of a tool in a real event. A four-component tool was applied to build an alternative scenario of the catastrophic river flood in Obrenovac, Serbia, in May 2014. The components of this tool are: (1) the amount of precipitation above the 95th percentile of all previous measurements; (2) upstream river discharge above the 95th percentile of all previous measurements; (3) upstream river level above the 95th percentile of all previous measurements; and (4) worsening of the hydrometeorological situation in the following 48 h. In the early morning of 16 May 2014, a rapid river wave flooded 80% of the Obrenovac territory. There were 13 deaths due to drowning. Application of the study tool shows that these lives could have been saved, as the score to recommend general evacuation was reached 1 day before the flooding. The application of this tool to two previous great floods in Serbia shows that the score to recommend general evacuation was reached either 1 day before or at the onset of the flash flooding. Due to its simplicity, this tool is universally applicable to facilitate decision-making regarding general evacuation during a rapid river flood, and it should be further tested in future similar catastrophes. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
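
    The four criteria lend themselves to a very small scoring routine. The sketch below is illustrative only: the paper gives no code, and the assumption that all four points are needed to trigger the recommendation is a reading of the abstract, not a quotation from the study.

```python
import numpy as np

def evacuation_score(precip, precip_hist, discharge, discharge_hist,
                     level, level_hist, forecast_worsening):
    """One point per satisfied criterion of the four-component tool.
    The trigger threshold used here (all four points) is an illustrative
    assumption; the paper only states that a score recommending general
    evacuation exists."""
    score = int(precip > np.percentile(precip_hist, 95))
    score += int(discharge > np.percentile(discharge_hist, 95))
    score += int(level > np.percentile(level_hist, 95))
    score += int(bool(forecast_worsening))
    return score, score == 4   # (score, recommend general evacuation?)
```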

  8. Adaptive radial basis function mesh deformation using data reduction

    Science.gov (United States)

    Gillebaart, T.; Blom, D. S.; van Zuijlen, A. H.; Bijl, H.

    2016-09-01

    Radial Basis Function (RBF) mesh deformation is one of the most robust mesh deformation methods available. Using the greedy (data reduction) method in combination with an explicit boundary correction results in an efficient method, as shown in the literature. However, to ensure the method remains robust, two issues are addressed: 1) how to ensure that the set of control points remains an accurate representation of the geometry in time, and 2) how to use/automate the explicit boundary correction, while ensuring a high mesh quality. In this paper, we propose an adaptive RBF mesh deformation method, which ensures the set of control points always represents the geometry/displacement up to a certain (user-specified) criterion, by keeping track of the boundary error throughout the simulation and re-selecting when needed. As opposed to the unit displacement and prescribed displacement selection methods, the adaptive method is more robust, user-independent and efficient for the cases considered. Secondly, the analysis of a single high aspect ratio cell is used to formulate an equation for the correction radius needed, depending on the characteristics of the correction function used, maximum aspect ratio, minimum first cell height and boundary error. Based on this analysis, two new radial basis correction functions are derived and proposed. The proposed automated procedure is verified while varying the correction function, Reynolds number (and thus first cell height and aspect ratio) and boundary error. Finally, the parallel efficiency is studied for the two adaptive methods, unit displacement and prescribed displacement, for both the CPU as well as the memory formulation, with a 2D oscillating and translating airfoil with oscillating flap, a 3D flexible locally deforming tube and a deforming wind turbine blade. Generally, the memory formulation requires less work (due to the large amount of work required for evaluating RBF's), but the parallel efficiency reduces due to the limited
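
    A stripped-down sketch of the greedy (data-reduction) control-point selection that the adaptive method builds on; the Gaussian kernel, the seeding rule and the stopping test are illustrative choices, not the paper's implementation:

```python
import numpy as np

def rbf(xa, xb, r):
    """Gaussian RBF kernel matrix (illustrative choice of basis function)."""
    d = np.linalg.norm(xa[:, None, :] - xb[None, :, :], axis=-1)
    return np.exp(-(d / r) ** 2)

def greedy_control_points(nodes, displ, tol, r=1.0):
    """Keep adding the boundary node with the largest interpolation error until
    the worst error drops below tol; returns selected indices and coefficients."""
    idx = [int(np.argmax(np.linalg.norm(displ, axis=1)))]   # seed: largest motion
    while True:
        coeffs = np.linalg.solve(rbf(nodes[idx], nodes[idx], r), displ[idx])
        err = np.linalg.norm(rbf(nodes, nodes[idx], r) @ coeffs - displ, axis=1)
        worst = int(np.argmax(err))
        if err[worst] < tol or len(idx) == len(nodes):
            return idx, coeffs
        idx.append(worst)
```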

  9. Data Reduction Algorithm for Optical Wide Field Patrol (OWL)

    Science.gov (United States)

    Park, S.; Park, Y.; Yim, H.; Jo, J.; Moon, H.; Bae, Y.; Lim, Y.; Choi, J.; Choi, Y.; Park, J.; Son, J.

    2014-09-01

    OWL (Optical Wide-field Patrol) has a detector system with a chopper consisting of 4 blades in front of the CCD camera, used to efficiently acquire the position and time information of moving objects such as artificial satellites. Using this system, it is possible to obtain more position data by splitting the streaks of a moving object into many pieces with the fast rotating blades during tracking. At the same time, the time data of the rotating chopper can be acquired by a time tagger connected to a photo diode. In order to derive the orbits of the targets, we need a sequential data reduction procedure that includes the calculation of a WCS (World Coordinate System) solution to transform the positions into the equatorial coordinate system, and the combination of the time data from the time tagger with the position data. We present such a data reduction procedure and the preliminary results after applying this procedure to the observation images.

  10. Biometric data reduction for embedding in small images

    Science.gov (United States)

    Ishaq Qazi, Naseem

    2003-06-01

    Biometric authentication systems require a fast and accurate method of matching biometric data for identification purposes. This paper introduces a data reduction technique based on image processing to better embed biometric data in small images. For the most part, biometric data cannot be directly embedded in small images because of limited embedding capacities and the large amount of data in biometric images. An image processing technique to extract features from biometric data, like fingerprints and retinal scans, has been developed and tested. This new feature extraction technique is based on the Hough transform and has been tested on a large volume of real image data. The data reduction technique was applied to these images and the data reduced to a size that could easily be embedded in small pictures, like those on identity cards. Existing embedding algorithms were utilized.

  11. The data reduction pipeline for the Hi-GAL survey

    CERN Document Server

    Traficante, A; Veneziani, M; Ali, B; de Gasperis, G; Di Giorgio, A M; Ikhenaode, D; Molinari, S; Natoli, P; Pestalozzi, M; Pezzuto, S; Piacentini, F; Piazzo, L; Polenta, G; Schisano, E

    2011-01-01

    We present the data reduction pipeline for the Hi-GAL survey. Hi-GAL is a key project of the Herschel satellite which is mapping the inner part of the Galactic plane (|l| <= 70° and |b| <= 1°), using 2 PACS and 3 SPIRE frequency bands, from 70 μm to 500 μm. Our pipeline relies only partially on the Herschel Interactive Standard Environment (HIPE) and features several newly developed routines to perform data reduction, including accurate data culling, noise estimation and minimum variance map-making, the latter performed with the ROMAGAL algorithm, a deep modification of the ROMA code already tested on cosmological surveys. We discuss in depth the properties of the Hi-GAL Science Demonstration Phase (SDP) data.

  12. Rolling Deck to Repository (R2R): Automated Magnetic and Gravity Quality Assessment and Data Reduction

    Science.gov (United States)

    Morton, J. J.; O'Hara, S.; Ferrini, V.; Arko, R. A.

    2010-12-01

    With its global capability and diverse array of sensors, the academic research fleet is an integral component of ocean exploration. The Rolling Deck to Repository (R2R) Program provides a central shore-side gateway for underway data from the U.S. academic research fleet. In addition to ensuring preservation and documentation of routine underway data, R2R is also developing automated quality assessment (QA) tools for a variety of underway data types. Routine post-cruise QA will enable prompt feedback to shipboard operators and provide the science community with sufficient background information for data analysis. Based on community feedback, R2R will perform data reduction to generate enhanced data products for select data types including gravity and magnetics. In the development of these tools, R2R seeks input from the scientific community, engaging specialists for each data type and requesting feedback from operators and scientists to deliver the most relevant and useful metadata. Development of data acquisition best practices that are being assembled within the community for some data types will also be an important component of R2R QA development. Protocols for gravity and magnetics QA will include the development of guidelines for minimal and optimal metadata for each data type that will enable data reduction and optimize data re-use. Metadata including instrument specifications, navigational offsets, and calibration information will be important inputs for both data reduction and QA. Data reduction will include merging these geophysical data types with high-quality R2R-generated navigation data products, cleaning the data and applying instrument corrections. Automated routines that are being developed will then be used to assess data quality, ultimately producing a Quality Assessment Certificate (QAC) that will provide the science community with quality information in an easily accessible and understandable format. We present progress to date and invite

  13. Towards a Data Reduction for the Minimum Flip Supertree Problem

    CERN Document Server

    Böcker, Sebastian

    2011-01-01

    In computational phylogenetics, the problem of constructing a supertree of a given set of rooted input trees can be formalized in different ways, to cope with contradictory information in the input. We consider the Minimum Flip Supertree problem, where the input trees are transformed into a 0/1/?-matrix, such that each row represents a taxon, and each column represents an inner node of one of the input trees. Our goal is to find a perfect phylogeny for the input matrix requiring a minimum number of 0/1-flips, that is, corrections of 0/1-entries in the matrix. The problem is known to be NP-complete. Here, we present a parameterized data reduction with polynomial running time. The data reduction guarantees that the reduced instance has a solution if and only if the original instance has a solution. We then make our data reduction parameter-independent by using upper bounds. This allows us to preprocess an instance, and to solve the reduced instance with an arbitrary method. Different from an existing data reduc...

  14. Development of the EMAP tool facilitating existential communication between general practitioners and cancer patients

    DEFF Research Database (Denmark)

    Assing Hvidt, Elisabeth; Hansen, Dorte Gilså; Ammentorp, Jette

    2017-01-01

    BACKGROUND: General practice recognizes the existential dimension as an integral part of multidimensional patient care alongside the physical, psychological and social dimensions. However, general practitioners (GPs) report substantial barriers related to communication with patients about existential concerns. OBJECTIVES: To describe the development of the EMAP tool facilitating communication about existential problems and resources between GPs and patients with cancer. METHODS: A mixed-methods design was chosen comprising a literature search, focus group interviews with GPs and patients (n ...) ... dimension. The tool utilized the acronym and mnemonic EMAP (existential communication in general practice) indicating the intention of the tool: to provide a map of possible existential problems and resources that the GP and the patient can discuss to find points of reorientation in the patient's situation...

  15. Final Report for Geometric Analysis for Data Reduction and Structure Discovery DE-FG02-10ER25983, STRIPES award # DE-SC0004096

    Energy Technology Data Exchange (ETDEWEB)

    Vixie, Kevin R. [Washington State Univ., Pullman, WA (United States)

    2014-11-27

    This is the final report for the project "Geometric Analysis for Data Reduction and Structure Discovery", in which insights and tools from geometric analysis were developed and exploited for their potential to address large-scale data challenges.

  16. The MUSE Data Reduction Pipeline: Status after Preliminary Acceptance Europe

    CERN Document Server

    Weilbacher, Peter M; Urrutia, Tanya; Pécontal-Rousset, Arlette; Jarno, Aurélien; Bacon, Roland

    2015-01-01

    MUSE, a giant integral field spectrograph, is about to become the newest facility instrument at the VLT. It will see first light in February 2014. Here, we summarize the properties of the instrument as built and outline functionality of the data reduction system, that transforms the raw data that gets recorded separately in 24 IFUs by 4k CCDs, into a fully calibrated, scientifically usable data cube. We then describe recent work regarding geometrical calibration of the instrument and testing of the processing pipeline, before concluding with results of the Preliminary Acceptance in Europe and an outlook to the on-sky commissioning.

  17. The Data Reduction Pipeline of the Hamburg Robotic Telescope

    Directory of Open Access Journals (Sweden)

    Marco Mittag

    2010-01-01

    The data reduction pipeline for the spectrograph of the Hamburg Robotic Telescope (HRT) is presented. This pipeline is started automatically after finishing the night-time observations and calibrations. The pipeline includes all necessary procedures for a reliable and complete data reduction, that is, bias, dark, and flat field correction. The order definition, wavelength calibration, and data extraction are also included. The final output is written in FITS format and is ready to use for the astronomer. The reduction pipeline is implemented in IDL and based on the IDL reduction package REDUCE written by Piskunov and Valenti (2002).
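
    The basic calibration steps named above (bias, dark and flat-field correction) amount to simple frame arithmetic; the following is a generic numpy sketch of those steps, not the HRT/REDUCE implementation:

```python
import numpy as np

def calibrate_frame(raw, master_bias, master_dark, master_flat,
                    exptime, dark_exptime):
    """Bias subtraction, exposure-scaled dark subtraction and flat-field
    division for a single CCD frame (all inputs are 2-D arrays except the
    two exposure times, given in seconds)."""
    dark = (master_dark - master_bias) * (exptime / dark_exptime)
    flat = master_flat - master_bias
    flat = flat / np.median(flat)            # normalized flat field
    return (raw - master_bias - dark) / flat
```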

  18. Remediation, General Education, and Technical Mathematics. Educational Resources for the Machine Tool Industry.

    Science.gov (United States)

    Texas State Technical Coll. System, Waco.

    This document contains descriptions of adult education courses in remediation, general education, and technical mathematics. They are part of a program developed by the Machine Tool Advanced Skills Technology Educational Resources (MASTER) program to help workers become competent in the skills needed to be productive workers in the machine tools…

  19. Investigation of Phase Congruency for SSA Data Reduction

    Science.gov (United States)

    Schultz, D.; Sydney, P.; Flewelling, B.

    2014-09-01

    The space situational awareness (SSA) data reduction pipeline is well established and formulated from our innate understanding of CCDs, optical distortion, and viewing conditions. Proper photometric calibration includes dark frame collection as well as the time-consuming task of on-sky flats which can vary with sky brightness and availability. Proper astrometric calibration includes synchronization of all timing sources, proper mount modeling, and jitter reduction. Once calibrated, further data analysis can include streak reduction, PSF fitting, and star separation with such goals as aperture photometry or dim object characterization. The ideal SSA feature detection algorithm would separate statistically-distinct spatial regions of an image into background, objects, and stars without need for darks/flats. Computer vision feature detection techniques may offer this one-stop-shop approach to data reduction and analysis using just the raw frames. One such technique uses phase congruency: a measure of the strength of how well the Fourier components of an image are in-phase which correlates with features such as edges. We examine a phase congruency approach based upon contrast invariant multi-scale image decomposition using simulated and nighttime Raven data.

  20. Physical activity in adolescents – Accelerometer data reduction criteria

    DEFF Research Database (Denmark)

    Toftager, Mette; Breum, Lars; Kristensen, Peter Lund

    and PA outcomes (mean cpm). The following parameters in the data reduction analyses were fixed: 30sec epoch, 24h duration, first registration accepted after 4h, maximum value 20,000cpm, and two activity epochs permitted in blocks of non-wear. Results: Accelerometer data were obtained from a total of 1...... 1 valid day of 6h wear time using a 10min non-wear criterion. The corresponding numbers using a 90min non-wear criterion were 20.6% and 99.4%. Lengthening the non-wear period decreases PA level (mean cpm) substantially, e.g. average PA was 641 cpm (5 days of 10h) using the 10min non-wear criterion...... compared to 570 cpm using 90min non-wear. No systematic differences in PA outcomes were found when comparing the range of days and hours. Discussion: We used a systematic approach to illustrate that even small inconsistencies in accelerometer data reduction can have substantial impact on compliance and PA...
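
    The non-wear criterion being varied in the study (a run of consecutive zero counts of a given length) can be illustrated with a short sketch. The tolerance of two activity epochs inside a non-wear block mentioned above is omitted here for brevity, so this is a simplification rather than the study's exact algorithm:

```python
import numpy as np

def wear_mask(counts, epoch_sec=30, nonwear_minutes=60):
    """True where the accelerometer is considered worn; runs of consecutive
    zero-count epochs lasting at least `nonwear_minutes` are flagged as non-wear."""
    counts = np.asarray(counts)
    run_len = int(nonwear_minutes * 60 / epoch_sec)
    wear = np.ones(len(counts), dtype=bool)
    start = None
    for i, c in enumerate(np.append(counts, 1)):   # sentinel closes a trailing run
        if c == 0 and start is None:
            start = i
        elif c != 0:
            if start is not None and i - start >= run_len:
                wear[start:i] = False
            start = None
    return wear

# Daily wear time in hours for one 24 h recording of 30 s epochs:
counts = np.random.default_rng(2).integers(0, 500, size=2880)
hours_worn = wear_mask(counts).sum() * 30 / 3600
```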

  1. The ARCONS Pipeline: Data Reduction for MKID Arrays

    CERN Document Server

    van Eyken, J C; Walter, A B; Meeker, S R; Szypryt, P; Stoughton, C; O'Brien, K; Marsden, D; Rice, N K; Lin, Y; Mazin, B A

    2015-01-01

    The Array Camera for Optical to Near-IR Spectrophotometry, or ARCONS, is a camera based on Microwave Kinetic Inductance Detectors (MKIDs), a new technology that has the potential for broad application in astronomy. Using an array of MKIDs, the instrument is able to produce time-resolved imaging and low-resolution spectroscopy constructed from detections of individual photons. The arrival time and energy of each photon are recorded in a manner similar to X-ray calorimetry, but at higher photon fluxes. The technique works over a very large wavelength range, is free from fundamental read noise and dark-current limitations, and provides microsecond-level timing resolution. Since the instrument reads out all pixels continuously while exposing, there is no loss of active exposure time to readout. The technology requires a different approach to data reduction compared to conventional CCDs. We outline here the prototype data reduction pipeline developed for ARCONS, though many of the principles are also more broadly ...

  2. THE ARCONS PIPELINE: DATA REDUCTION FOR MKID ARRAYS

    Energy Technology Data Exchange (ETDEWEB)

    Eyken, J. C. van; Strader, M. J.; Walter, A. B.; Meeker, S. R.; Szypryt, P.; Marsden, D.; Rice, N. K.; Lin, Y.; Mazin, B. A. [Department of Physics, UC Santa Barbara, Santa Barbara, CA 93106 (United States); Stoughton, C. [Fermilab Center for Particle Astrophysics, Batavia, IL 60510 (United States); O’Brien, K., E-mail: vaneyken@ipac.caltech.edu [Department of Physics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford, OX1 3RH (United Kingdom)

    2015-07-15

    The Array Camera for Optical to Near-IR Spectrophotometry, or ARCONS, is a camera based on Microwave Kinetic Inductance Detectors (MKIDs), a new technology that has the potential for broad application in astronomy. Using an array of MKIDs, the instrument is able to produce time-resolved imaging and low-resolution spectroscopy constructed from detections of individual photons. The arrival time and energy of each photon are recorded in a manner similar to X-ray calorimetry, but at higher photon fluxes. The technique works over a very large wavelength range, is free from fundamental read noise and dark-current limitations, and provides microsecond-level timing resolution. Since the instrument reads out all pixels continuously while exposing, there is no loss of active exposure time to readout. The technology requires a different approach to data reduction compared to conventional CCDs. We outline here the prototype data reduction pipeline developed for ARCONS, though many of the principles are also more broadly applicable to energy-resolved photon counting arrays (e.g., transition edge sensors, superconducting tunnel junctions). We describe the pipeline’s current status, and the algorithms and techniques employed in taking data from the arrival of photons at the MKID array to the production of images, spectra, and time-resolved light curves.
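
    To make the photon-list concept concrete, the sketch below bins a list of (arrival time, energy) pairs into a band-limited light curve; it is a generic illustration, not part of the ARCONS pipeline, and all names and units are assumptions:

```python
import numpy as np

def photon_light_curve(times, energies, t_start, t_stop, bin_sec,
                       e_lo=None, e_hi=None):
    """Bin an energy-resolved photon list into a count-rate light curve,
    optionally restricted to the energy band [e_lo, e_hi]."""
    times = np.asarray(times, dtype=float)
    energies = np.asarray(energies, dtype=float)
    keep = np.ones(times.shape, dtype=bool)
    if e_lo is not None:
        keep &= energies >= e_lo
    if e_hi is not None:
        keep &= energies <= e_hi
    edges = np.arange(t_start, t_stop + bin_sec, bin_sec)
    counts, _ = np.histogram(times[keep], bins=edges)
    return edges[:-1], counts / bin_sec     # bin start times, rate in counts/s
```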

  3. Improved FTIR open-path remote sensing data reduction technique

    Science.gov (United States)

    Phillips, Bill; Moyers, Rick; Lay, Lori T.

    1995-05-01

    Progress on the development of a nonlinear curve fitting computer algorithm for data reduction of optical remote sensing Fourier transform spectrometer (FTS) data is presented. This new algorithm is an adaptation of an existing algorithm employed at the Arnold Engineering Development Center for the analysis of infrared plume signature and optical gas diagnostic data on rocket and turbine engine exhaust. Because it is a nonlinear model, the algorithm can be used to determine parameters not readily determined by linear methods such as classical least squares. Unlike linear methods, this procedure can simultaneously determine atmospheric gas concentrations, spectral resolution, spectral shift, and the background, or I0(ω), spectrum. Additionally, species which possess spectra that are strongly masked by atmospheric absorption features, such as BTX, can also be incorporated into the procedure. The basic theory behind the algorithm is presented, as well as test results on FTS data and synthetic data containing benzene and toluene spectral features.
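
    A schematic example of the kind of simultaneous nonlinear fit described above, using scipy's nonlinear least-squares wrapper; the single-line absorption model, parameter names and numerical values are all invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def transmitted(nu, i0, conc, shift, width):
    """Toy model: background I0 times exponential absorption by one
    Gaussian-shaped line whose position and width are free parameters."""
    line = np.exp(-((nu - 2900.0 - shift) / width) ** 2)
    return i0 * np.exp(-conc * line)

nu = np.linspace(2880.0, 2920.0, 400)
rng = np.random.default_rng(1)
observed = transmitted(nu, 1.0, 0.3, 0.05, 1.5) + rng.normal(0.0, 0.005, nu.size)

# Concentration, spectral shift, line width and the background I0 are
# recovered together, which is the advantage over classical (linear)
# least squares mentioned in the abstract.
popt, pcov = curve_fit(transmitted, nu, observed, p0=[1.0, 0.1, 0.0, 1.0])
print(popt)
```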

  4. Measuring general surgery residents' communication skills from the patient's perspective using the Communication Assessment Tool (CAT).

    Science.gov (United States)

    Stausmire, Julie M; Cashen, Constance P; Myerholtz, Linda; Buderer, Nancy

    2015-01-01

    The Communication Assessment Tool (CAT) has been used and validated to assess Family and Emergency Medicine resident communication skills from the patient's perspective. However, it has not been previously reported as an outcome measure for general surgery residents. The purpose of this study is to establish initial benchmarking data for the use of the CAT as an evaluation tool in an osteopathic general surgery residency program. Results are analyzed quarterly and used by the program director to provide meaningful feedback and targeted goal setting for residents to demonstrate progressive achievement of interpersonal and communication skills with patients. The 14-item paper version of the CAT (developed by Makoul et al. for residency programs) asks patients to anonymously rate surgery residents on discrete communication skills using a 5-point rating scale immediately after the clinical encounter. Results are reported as the percentage of items rated as "excellent" (5) by the patient. The setting is a hospital-affiliated ambulatory urban surgery office staffed by the residency program. Participants are representative of adult patients of both sexes across all ages with diverse ethnic backgrounds. They include preoperative and postoperative patients, as well as those needing diagnostic testing and follow-up. Data have been collected on 17 general surgery residents from a single residency program representing 5 postgraduate year levels and 448 patient encounters since March 2012. The reliability (Cronbach α) of the tool for surgery residents was 0.98. The overall mean percentage of items rated as excellent was 70% (standard deviation = 42%), with a median of 100%. The CAT is a useful tool for measuring one facet of resident communication skills: the patient's perception of the physician-patient encounter. The tool provides a unique and personalized outcome measure for identifying communication strengths and improvement opportunities, allowing residents to receive

  5. SIMULATION SYSTEM FOR FIVE-AXIS NC MACHINING USING GENERAL CUTTING TOOL

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A simulation system for five-axis NC machining using general cutting tools is presented. This system differs from other simulation systems in that it focuses not only on geometric simulation but also on collision detection, which is usually not included in NC machining simulation. In addition, the estimation of cutting forces is also discussed. In order to obtain high efficiency, all algorithms use the swept volume modeling technique, so the simulation system is compact and can be performed efficiently.

  6. New software for neutron data reduction and visualization

    Energy Technology Data Exchange (ETDEWEB)

    Worlton, T.; Chatterjee, A.; Hammonds, J.; Chen, D.; Loong, C.K. [Argonne National Laboratory, Argonne, IL (United States); Mikkelson, D.; Mikkelson, R. [Univ. of Wisconsin-Stout, Menomonie, WI (United States)

    2001-03-01

    Development of advanced neutron sources and instruments has necessitated corresponding advances in software for neutron scattering data reduction and visualization. New sources produce datasets more rapidly, and new instruments produce large numbers of spectra. Because of the shorter collection times, users are able to make more measurements on a given sample. This rapid production of datasets requires that users be able to reduce and analyze data quickly to prevent a data bottleneck. In addition, the new sources and instruments are accommodating more users with less neutron-scattering specific expertise, which requires software that is easy to use and freely available. We have developed an Integrated Spectral Analysis Workbench (ISAW) software package to permit the rapid reduction and visualization of neutron data. It can handle large numbers of spectra and merge data from separate measurements. The data can be sorted according to any attribute and transformed in numerous ways. ISAW provides several views of the data that enable users to compare spectra and observe trends in the data. A command interpreter, which is now part of ISAW, allows scientists to easily set up a series of instrument-specific operations to reduce and visualize data automatically. ISAW is written entirely in Java to permit portability to different computer platforms and easy distribution of the software. The software was constructed using modern computer design methods to allow easy customization and improvement. ISAW currently only reads data from IPNS 'run' files, but work is underway to provide input of NeXus files. (author)

  7. The Effelsberg-Bonn HI Survey: Data reduction

    CERN Document Server

    Winkel, B; Kerp, J; Floeer, L

    2010-01-01

    Starting in winter 2008/2009 an L-band 7-Feed-Array receiver is used for a 21-cm line survey performed with the 100-m telescope, the Effelsberg-Bonn HI survey (EBHIS). The EBHIS will cover the whole northern hemisphere for decl.>-5 deg comprising both the galactic and extragalactic sky out to a distance of about 230 Mpc. Using state-of-the-art FPGA-based digital fast Fourier transform spectrometers, superior in dynamic range and temporal resolution to conventional correlators, allows us to apply sophisticated radio frequency interference (RFI) mitigation schemes. In this paper, the EBHIS data reduction package and first results are presented. The reduction software consists of RFI detection schemes, flux and gain-curve calibration, stray-radiation removal, baseline fitting, and finally the gridding to produce data cubes. The whole software chain is successfully tested using multi-feed data toward many smaller test fields (1--100 square degrees) and recently applied for the first time to data of two large sky ...

  8. CoRoT data reduction by example

    Science.gov (United States)

    Weingrill, J.

    2015-02-01

    Data reduction techniques published so far for the CoRoT N2 data product were targeted primarily at the detection of extrasolar planets. Since the whole dataset has been released, specific algorithms are required to process the lightcurves from CoRoT correctly. Though only unflagged datapoints must be chosen for scientific processing, some flags might be reconsidered. The reduction of data along with an improved signal-to-noise ratio can be achieved by applying a one-dimensional drizzle algorithm. Gaps can be filled with linearly interpolated data without harming the frequency spectrum. Magnitudes derived from the CoRoT color channels can be used to derive additional information about the targets. Depending on the needs, various filters in the frequency domain remove either the red noise background or high frequency noise. The autocorrelation function or the least squares periodogram are appropriate methods to identify periodic signals. The methods described here are not strictly limited to CoRoT data but may also be applied to Kepler data or the upcoming PLATO mission. The CoRoT space mission, launched on 2006 December 27, has been developed and is operated by CNES, with the contribution of Austria, Belgium, Brazil, ESA (RSSD and Science Programme), Germany and Spain.
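
    A compact sketch of two of the steps above: keeping unflagged points, filling gaps by linear interpolation onto a regular cadence, and computing a least-squares (Lomb-Scargle) periodogram with astropy. Variable names and the use of `autopower` defaults are illustrative, and a reasonably recent astropy is assumed:

```python
import numpy as np
from astropy.timeseries import LombScargle

def clean_and_search(time, flux, flags):
    """Drop flagged points, resample onto the median cadence with linear
    interpolation to fill gaps, and return a Lomb-Scargle periodogram."""
    good = flags == 0
    t, f = time[good], flux[good]
    cadence = np.median(np.diff(t))
    regular = np.arange(t[0], t[-1], cadence)
    filled = np.interp(regular, t, f)        # linear gap filling
    frequency, power = LombScargle(regular, filled).autopower()
    return regular, filled, frequency, power
```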

  9. The Infrared Imaging Spectrograph (IRIS) for TMT: Data Reduction System

    CERN Document Server

    Walth, Gregory; Weiss, Jason; Larkin, James E; Moore, Anna M; Chapin, Edward L; Do, Tuan; Dunn, Jennifer; Ellerbroek, Brent; Gillies, Kim; Hayano, Yutaka; Johnson, Chris; Marshall, Daniel; Riddle, Reed L; Simard, Luc; Sohn, Ji Man; Suzuki, Ryuji; Wincensten, James

    2016-01-01

    IRIS (InfraRed Imaging Spectrograph) is the diffraction-limited first light instrument for the Thirty Meter Telescope (TMT) that consists of a near-infrared (0.84 to 2.4 μm) imager and integral field spectrograph (IFS). The IFS makes use of a lenslet array and slicer for spatial sampling, and will be able to operate in hundreds of different modes, including a combination of four plate scales from 4 milliarcseconds (mas) to 50 mas with a large range of filters and gratings. The imager will have a field of view of 34×34 arcsec² with a plate scale of 4 mas and many selectable filters. We present the preliminary design of the data reduction system (DRS) for IRIS that needs to address all of these observing modes. Reduction of IRIS data will have unique challenges since it will provide real-time reduction and analysis of the imaging and spectroscopic data during observational sequences, as well as advanced post-processing algorithms. The DRS will support three basic modes of operation of IRIS; reduc...

  10. Accelerating high-dimensional clustering with lossless data reduction.

    Science.gov (United States)

    Qaqish, Bahjat F; O'Brien, Jonathon J; Hibbard, Jonathan C; Clowers, Katie J

    2017-09-15

    For cluster analysis, high-dimensional data are associated with instability, decreased classification accuracy and a high computational burden. The latter challenge can be eliminated as a serious concern. For applications where dimension reduction techniques are not implemented, we propose a temporary transformation which accelerates computations with no loss of information. The algorithm can be applied for any statistical procedure depending only on Euclidean distances and can be implemented sequentially to enable analyses of data that would otherwise exceed memory limitations. The method is easily implemented in common statistical software as a standard pre-processing step. The benefit of our algorithm grows with the dimensionality of the problem and the complexity of the analysis. Consequently, our simple algorithm not only decreases the computation time for routine analyses, it opens the door to performing calculations that may have otherwise been too burdensome to attempt. R, Matlab and SAS/IML code for implementing lossless data reduction is freely available in the Appendix. Contact: obrienj@hms.harvard.edu.
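
    One standard way to obtain such a lossless (distance-preserving) reduction is to rotate the centered n × p data matrix into its at most n-dimensional row space, for example with an SVD. The sketch below illustrates this idea and checks it numerically; the paper's own transformation may differ in detail:

```python
import numpy as np
from scipy.spatial.distance import pdist

def lossless_reduce(X):
    """Replace an n x p matrix (p >> n) by an n x r matrix (r <= n) with
    identical pairwise Euclidean distances between rows."""
    U, s, _ = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return U * s          # principal-component scores

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10000))       # 50 samples, 10000 features
Z = lossless_reduce(X)                 # 50 x 50 representation

# Pairwise distances are unchanged, so any Euclidean-distance-based
# clustering of Z gives the same result as clustering X, much faster.
assert np.allclose(pdist(X), pdist(Z))
```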

  11. FPGA based algorithms for data reduction at Belle II

    Energy Technology Data Exchange (ETDEWEB)

    Muenchow, David; Gessler, Thomas; Kuehn, Wolfgang; Lange, Jens Soeren; Liu, Ming; Spruck, Bjoern [II. Physikalisches Institut, Universitaet Giessen (Germany)

    2011-07-01

    Belle II, the upgrade of the existing Belle experiment at Super-KEKB in Tsukuba, Japan, is an asymmetric e+e- collider with a design luminosity of 8×10^35 cm^-2 s^-1. At Belle II the estimated event rate is <=30 kHz. The resulting data rate at the Pixel Detector (PXD) will be <=7.2 GB/s. This data rate needs to be reduced to allow the data to be processed and stored. The region of interest (ROI) selection is based upon two mechanisms: a) a tracklet finder using the silicon strip detector, and b) the HLT using all other Belle II subdetectors. These ROIs and the pixel data are forwarded to an FPGA-based Compute Node for processing. Here a VHDL-based algorithm on the FPGA, with the benefit of pipelining and parallelisation, will be implemented. For fast data handling we developed a dedicated memory management system for buffering and storing the data. The status of the implementation and performance tests of the memory manager and data reduction algorithm is presented.

  12. HI data reduction for the Arecibo Pisces-Perseus Supercluster Survey

    Science.gov (United States)

    Davis, Cory; Johnson, Cory; Craig, David W.; Haynes, Martha P.; Jones, Michael G.; Koopmann, Rebecca A.; Hallenbeck, Gregory L.; Undergraduate ALFALFA Team

    2017-01-01

    The Undergraduate ALFALFA team is currently focusing on the analysis of the Pisces-Perseus Supercluster to test current supercluster formation models. The primary goal of our research is to reduce L-band HI data from the Arecibo telescope. To reduce the data we use IDL programs written by our collaborators to find potential sources whose masses can be estimated via the baryonic Tully-Fisher relation, which relates the luminosity to the rotational velocity profile of spiral galaxies. Thus far we have reduced data and estimated HI masses for several galaxies in the supercluster region. We will give examples of data reduction and preliminary results for both the fall 2015 and 2016 observing seasons. We will also describe the data reduction process, the process of learning the associated software, and the use of virtual observatory tools such as the SDSS databases, Aladin, TOPCAT and others. This research was supported by NSF grant AST-1211005.

  13. Simrank: Rapid and sensitive general-purpose k-mer search tool

    Energy Technology Data Exchange (ETDEWEB)

    DeSantis, T.Z.; Keller, K.; Karaoz, U.; Alekseyenko, A.V; Singh, N.N.S.; Brodie, E.L; Pei, Z.; Andersen, G.L; Larsen, N.

    2011-04-01

    Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project (http://nihroadmap.nih.gov/hmp). Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Here we present a stand-alone utility, Simrank, which allows users to rapidly identify the database strings most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-language datasets found Simrank 10× to 928× faster, depending on the dataset. Simrank provides molecular ecologists with a high-throughput, open source choice for comparing large sequence sets to find similarity.
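
    The core operation, scoring database strings by shared k-mers with a query, can be sketched in a few lines; the scoring below is illustrative and not necessarily Simrank's exact measure:

```python
def kmer_set(seq, k=7):
    """All k-mers occurring in a string (DNA, RNA, protein or plain text)."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def kmer_similarity(query, subject, k=7):
    """Fraction of the query's k-mers that also occur in the subject."""
    q = kmer_set(query, k)
    return len(q & kmer_set(subject, k)) / len(q) if q else 0.0

# Rank database sequences by similarity to a query:
db = {"seqA": "ACGTACGTGGCTAGCTA", "seqB": "TTTTCCCCGGGGAAAA"}
query = "ACGTACGTGGCTA"
print(sorted(db, key=lambda name: kmer_similarity(query, db[name]), reverse=True))
```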

  14. Detection of adverse events in general surgery using the "Trigger Tool" methodology.

    Science.gov (United States)

    Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro

    2015-02-01

    Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE that are detected by the «Global Trigger Tool» methodology in surgical patients, their characteristics, and the tool's validity. Retrospective, observational study of patients admitted to a general surgery department who underwent a surgical operation in a third-level hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the «Global Trigger Tool» methodology. Once an AE was identified, a harm category was assigned, including the grade to which the AE could have been avoided and its relation to the surgical procedure. The prevalence of AE was 36.8%. There were 0.5 AE per patient. 56.2% were deemed preventable, and 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. The prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure, and more than half were also preventable. The adapted «Global Trigger Tool» methodology has demonstrated itself to be highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
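
    The validity figures quoted above follow directly from the 2x2 table of the trigger-tool review against the reference review; a minimal helper for computing them (illustrative only):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, positive and negative predictive values
    computed from true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }
```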

  15. Star formation: Submillimeter observations and data reduction techniques

    Science.gov (United States)

    Attard, Michael

    2010-12-01

    The process of star formation is key to astrophysics and its understanding remains a fundamental problem. The following chapters describe recent work on this subject with instrumentation at the Caltech Submillimeter Observatory. Chapter 1 provides an introduction to this thesis. Chapter 2 describes a new data reduction technique for dual-array polarimeters. This technique is meant to address a potential problem with these instruments; artificial polarization signals are introduced into the data when misalignments between the subarrays and pointing drifts are present during the data acquisition process. The correction algorithm presented is meant to treat this problem, and has been tested using simulated and actual data. The results indicate that this approach is effective at removing up to 60% of the artificial polarization. Chapter 3 discusses an analysis of the low-mass star forming region NGC 1333 IRAS 4 involving SHARP 350 μm polarimetry and HCN J=4→3 emission spectra. The polarimetry indicates a uniform magnetic field morphology over a 20" radius from the peak continuum flux of IRAS 4A, in agreement with models of magnetically supported cloud collapse. The magnetic field morphology around IRAS 4B appears to be quite distinct however, with indications of depolarization observed towards the peak flux of this source. Inverse P-Cygni profiles are observed in the HCN J=4→3 line spectra towards IRAS 4A, providing a clear indication of infall gas motions. Taken together, the evidence gathered appears to support the scenario that IRAS 4A is a cloud core in a critical state of support against gravitational collapse. Chapter 4 covers SHARP 450 μm polarimetry obtained over the high-mass star forming region NGC 6334 I(N). The "Method 2" approach described in a recent paper by G. Novak and collaborators is applied here to combine our data with results from the Hertz and SPARO polarimeters. This is done in order to estimate the intrinsic angular dispersion

  16. Scanning probe microscopy beyond imaging: a general tool for quantitative analysis.

    Science.gov (United States)

    Liscio, Andrea

    2013-04-15

    A simple, fast and general approach for quantitative analysis of scanning probe microscopy (SPM) images is reported. As a proof of concept it is used to determine with a high degree of precision the value of observables such as 1) the height, 2) the flowing current and 3) the corresponding surface potential (SP) of flat nanostructures such as gold electrodes, organic semiconductor architectures and graphenic sheets. Despite histogram analysis, or frequency count (Fc), being the most common mathematical tool used to analyse SPM images, the analytical approach is still lacking. By using the mathematical relationship between Fc and the collected data, the proposed method allows quantitative information on observable values close to the noise level to be gained. For instance, the thickness of nanostructures deposited on very rough substrates can be quantified, and this makes it possible to distinguish the contribution of an adsorbed nanostructure from that of the underlying substrate. Being non-numerical, this versatile analytical approach is a useful and general tool for quantitative analysis of the Fc that enables all signals acquired and recorded by an SPM data array to be studied with high precision.

  17. Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models

    Science.gov (United States)

    Starn, J. J.; Belitz, K.

    2014-12-01

    National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional-scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrography Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting included in the standard Python tool stack also are used, making the notebook a comprehensive platform within which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions
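
    As a minimal sketch of the kind of scripted model construction such notebooks automate, the snippet below reads a watershed boundary with GeoPandas and writes a one-layer MODFLOW-2005 model with Flopy. The file name, grid size and parameter values are placeholders, and the classic flopy.modflow interface is assumed; the actual notebooks are considerably more elaborate.

        import geopandas as gpd
        import flopy

        # size a regular grid to a (placeholder) watershed boundary polygon
        basin = gpd.read_file("basin_boundary.shp")
        minx, miny, maxx, maxy = basin.total_bounds
        nrow = ncol = 100
        cell = (maxx - minx) / ncol

        # build a minimal single-layer MODFLOW-2005 model
        mf = flopy.modflow.Modflow(modelname="generalized", exe_name="mf2005")
        flopy.modflow.ModflowDis(mf, nlay=1, nrow=nrow, ncol=ncol,
                                 delr=cell, delc=cell, top=50.0, botm=0.0)
        flopy.modflow.ModflowBas(mf, ibound=1, strt=50.0)   # all cells active
        flopy.modflow.ModflowLpf(mf, hk=10.0)               # uniform hydraulic conductivity
        flopy.modflow.ModflowRch(mf, rech=0.001)            # uniform recharge
        flopy.modflow.ModflowPcg(mf)
        flopy.modflow.ModflowOc(mf)
        mf.write_input()                                     # write the MODFLOW input files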

  18. Gemini IRAF: Data reduction software for the Gemini telescopes

    Science.gov (United States)

    Gemini Observatory; AURA

    2016-08-01

    The Gemini IRAF package processes observational data obtained with the Gemini telescopes. It is an external package layered upon IRAF and supports data from numerous instruments, including FLAMINGOS-2, GMOS-N, GMOS-S, GNIRS, GSAOI, NIFS, and NIRI. The Gemini IRAF package is organized into sub-packages; it contains a generic tools package, "gemtools", along with instrument-specific packages. The raw data from the Gemini facility instruments are stored as Multi-Extension FITS (MEF) files. Therefore, all the tasks in the Gemini IRAF package, intended for processing data from the Gemini facility instruments, are capable of handling MEF files.

  19. Data reduction pipeline for the MMT Magellan Infrared Spectrograph

    CERN Document Server

    Chilingarian, Igor; Fabricant, Daniel; McLeod, Brian; Roll, John; Szentgyorgyi, Andrew

    2012-01-01

    We describe principal components of the new spectroscopic data pipeline for the multi-object MMT/Magellan Infrared Spectrograph (MMIRS). The pipeline is implemented in IDL and C++. The performance of the data processing algorithms is sufficient to reduce a single dataset in 2--3 min on a modern PC workstation so that one can use the pipeline as a quick-look tool during observations. We provide an example of the spectral data processed by our pipeline and demonstrate that the sky subtraction quality gets close to the limits set by the Poisson photon statistics.

  20. A Visual Analytic for High-Dimensional Data Exploitation: The Heterogeneous Data-Reduction Proximity Tool

    Science.gov (United States)

    2013-07-01

    information within time-critical environments (1, 2). Innovative methods are required that allow the efficient and effective transformation of data... Clicking the mouse on a column heading will cause the rows to sort alphabetically (words) or in numerical order (digits) according to the data in that column... void setCriminalRec(String crm) { criminalRec = crm; if (criminalRec.equalsIgnoreCase("Guilty")) criminalRecBinary = 0; else if

  1. Mobile task management tool that improves workflow of an acute general surgical service.

    Science.gov (United States)

    Foo, Elizabeth; McDonald, Rod; Savage, Earle; Floyd, Richard; Butler, Anthony; Rumball-Smith, Alistair; Connor, Saxon

    2015-10-01

    Understanding and being able to measure constraints within a health system is crucial if outcomes are to be improved. Current systems lack the ability to capture decision making with regard to tasks performed within a patient journey. The aim of this study was to assess the impact of a mobile task management tool on clinical workflow within an acute general surgical service by analysing data capture and usability of the application tool. The Cortex iOS application was developed to digitize patient flow and provide real-time visibility over clinical decision making and task performance. Study outcomes measured were workflow data capture for patient and staff events. Usability was assessed using an electronic survey. There were 449 unique patient journeys tracked with a total of 3072 patient events recorded. The results repository was accessed 7792 times. The participants reported that the application sped up decision making, reduced redundancy of work and improved team communication. The mode of the estimated time the application saved participants was 5-9 min/h of work. Of the 14 respondents, nine discarded their analogue methods of tracking tasks by the end of the study period. The introduction of a mobile task management system improved the working efficiency of junior clinical staff. The application allowed capture of data not previously available to hospital systems. In the future, such data will contribute to the accurate mapping of patient journeys through the health system. © 2015 Royal Australasian College of Surgeons.

  2. A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System

    Science.gov (United States)

    Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.

    2010-05-01

    The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.

  3. Discrete derivative estimation in LISA Pathfinder data reduction

    CERN Document Server

    Ferraioli, Luigi; Vitale, Stefano

    2009-01-01

    Data analysis for the LISA Technology Package (LTP) experiment to be flown aboard the LISA Pathfinder mission requires the solution of the system dynamics for the calculation of the force acting on the test masses (TMs) starting from interferometer position data. The need for a solution to this problem has prompted us to implement a discrete time-domain derivative estimator suited to the LTP experiment requirements. We first report on the mathematical procedures for the definition of two methods: the first based on a parabolic fit approximation and the second based on a Taylor series expansion. These two methods are then generalized and incorporated into a more general class of five-point discrete derivative estimators. The same procedure employed for the second derivative can be applied to the estimation of the first derivative and of a data smoother, allowing us to define a class of simple five-point estimators for both. The performances of three particular realizations of the five-point second derivative estimat...
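
    For reference, the two generic five-point second-derivative stencils that correspond to the approaches named above (a least-squares parabolic fit and a Taylor-series central difference) can be written in a few lines of Python; the coefficients below are the standard textbook ones, not necessarily the LTP-tuned estimators.

        import numpy as np

        def second_derivative_parabola(y, h):
            """Five-point estimator from a least-squares parabolic fit:
            y'' ~ (2*y[i-2] - y[i-1] - 2*y[i] - y[i+1] + 2*y[i+2]) / (7*h**2)."""
            return (2*y[:-4] - y[1:-3] - 2*y[2:-2] - y[3:-1] + 2*y[4:]) / (7*h**2)

        def second_derivative_taylor(y, h):
            """Five-point central difference (Taylor expansion), 4th-order accurate:
            y'' ~ (-y[i-2] + 16*y[i-1] - 30*y[i] + 16*y[i+1] - y[i+2]) / (12*h**2)."""
            return (-y[:-4] + 16*y[1:-3] - 30*y[2:-2] + 16*y[3:-1] - y[4:]) / (12*h**2)

        t = np.arange(0.0, 10.0, 0.1)
        x = np.sin(t)                                   # toy "position" time series
        print(second_derivative_parabola(x, 0.1)[:3])   # both track x'' = -sin(t)
        print(second_derivative_taylor(x, 0.1)[:3])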

  4. Technical bases for modern data reduction on photographic plates

    Science.gov (United States)

    Stavinschi, Magda

    In a simplified way, the principles of CCDs and photographic plates as astronomical detectors are compared. Since there are large archives of photographic plates which will remain valuable sources of astronomical data, it is worthwhile to consider modern tools for the digitization of photographic plates. One of the most accurate machines is the PDS microdensitometer, except that its photometric accuracy is limited by the originally employed analogue logarithmic amplifier, which leads to severe distortions at density gradients. A new, accurate amplifier/converter is described. It has successfully worked for several years at the PDS2020GM microdensitometers at Münster University and other machines; since September 2000 it has also been installed at the PDS1010 microdensitometer of the Sofia Sky Archive Data Center (SSADC).

  5. Dynamical generalized Hurst exponent as a tool to monitor unstable periods in financial time series

    Science.gov (United States)

    Morales, Raffaello; Di Matteo, T.; Gramatica, Ruggero; Aste, Tomaso

    2012-06-01

    We investigate the use of the Hurst exponent, dynamically computed over a weighted moving time-window, to evaluate the level of stability/instability of financial firms. Financial firms bailed out as a consequence of the 2007-2008 credit crisis show a clear increase with time of the generalized Hurst exponent in the period preceding the unfolding of the crisis. Conversely, firms belonging to other market sectors, which suffered the least throughout the crisis, show opposite behaviors. We find that the multifractality of the bailed-out firms increases at the crisis, suggesting that the multifractal properties of the time series are changing. These findings suggest the possibility of using the scaling behavior as a tool to track the level of stability of a firm. In this paper, we introduce a method to compute the generalized Hurst exponent which assigns larger weights to more recent events with respect to older ones. In this way large fluctuations in the remote past are less likely to influence the recent past. We also investigate the scaling associated with the tails of the log-return distributions and compare this scaling with the scaling associated with the Hurst exponent, observing that the processes underlying the price dynamics of these firms are truly multiscaling.
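
    The sketch below shows a generic structure-function estimator of the generalized Hurst exponent H(q) with exponentially decaying weights that emphasise recent increments; the weighting scheme and the parameter values are illustrative assumptions, not necessarily the exact definition used by the authors.

        import numpy as np

        def weighted_ghe(x, q=2, tau_max=19, theta=250):
            """Weighted generalized Hurst exponent H(q): fit K_q(tau) ~ tau**(q*H(q)),
            where K_q is a weighted mean of |x(t+tau)-x(t)|**q (recent samples weigh more)."""
            taus = np.arange(1, tau_max + 1)
            kq = []
            for tau in taus:
                inc = np.abs(x[tau:] - x[:-tau]) ** q
                w = np.exp(-(len(inc) - 1 - np.arange(len(inc))) / theta)
                kq.append(np.sum(w * inc) / np.sum(w))
            slope = np.polyfit(np.log(taus), np.log(kq), 1)[0]
            return slope / q

        rng = np.random.default_rng(1)
        walk = np.cumsum(rng.normal(size=5000))   # Brownian motion, expect H ~ 0.5
        print(weighted_ghe(walk))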

  6. ComPASS : a tool for distributed parallel finite volume discretizations on general unstructured polyhedral meshes

    Directory of Open Access Journals (Sweden)

    Dalissier E.

    2013-12-01

    Full Text Available The objective of the ComPASS project is to develop a parallel multiphase Darcy flow simulator adapted to general unstructured polyhedral meshes (in a general sense, with possibly non-planar faces) and to the parallelization of advanced finite volume discretizations with various choices of the degrees of freedom, such as cell centres, vertices, or face centres. The main targeted applications are the simulation of CO2 geological storage, nuclear waste repositories and reservoir simulations. The CEMRACS 2012 summer school devoted to high performance computing has been an ideal framework to start this collaborative project. This paper describes what has been achieved during the four weeks of the CEMRACS project, which focused on the implementation of basic features of the code such as the distributed unstructured polyhedral mesh, the synchronization of the degrees of freedom, and the connection to scientific libraries including the partitioner METIS, the visualization tool PARAVIEW, and the parallel linear solver library PETSc. The parallel efficiency of this first version of the ComPASS code has been validated on a toy parabolic problem using the Vertex Approximate Gradient finite volume spatial discretization with both cell and vertex degrees of freedom, combined with an Euler implicit time integration.

  7. A Global Multi-Objective Optimization Tool for Design of Mechatronic Components using Generalized Differential Evolution

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Nørgård, Christian; Roemer, Daniel Beck

    2016-01-01

    This paper illustrates how the relatively simple constrained multi-objective optimization algorithm Generalized Differential Evolution 3 (GDE3) can assist with the practical sizing of mechatronic components used in e.g. digital displacement fluid power machinery. The studied bi- and tri-objective problems, having 10+ design variables, are highly constrained, nonlinear and non-smooth, but nevertheless the algorithm converges to the Pareto front within hours of computation (20k function evaluations). Additionally, the robustness and convergence speed of the algorithm are investigated using different optimization control parameter settings, and it is concluded that GDE3 is a reliable optimization tool that can assist mechatronic engineers in the design and decision-making process.
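
    GDE3 itself handles multiple objectives and constraints; as background, the sketch below shows only the basic differential evolution step (rand/1/bin mutation, crossover and greedy selection) that it builds on, applied to a toy single-objective problem. All parameter values are generic defaults, not those used in the paper.

        import numpy as np

        def de_rand_1_bin(func, bounds, pop_size=30, f=0.7, cr=0.9, generations=200, seed=0):
            """Plain differential evolution (rand/1/bin), the building block GDE3 extends."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds[:, 0], bounds[:, 1]
            pop = lo + rng.random((pop_size, len(lo))) * (hi - lo)
            fit = np.array([func(x) for x in pop])
            for _ in range(generations):
                for i in range(pop_size):
                    others = [j for j in range(pop_size) if j != i]
                    a, b, c = pop[rng.choice(others, 3, replace=False)]
                    mutant = np.clip(a + f * (b - c), lo, hi)
                    cross = rng.random(len(lo)) < cr
                    cross[rng.integers(len(lo))] = True        # at least one gene from the mutant
                    trial = np.where(cross, mutant, pop[i])
                    tf = func(trial)
                    if tf <= fit[i]:                            # greedy one-to-one selection
                        pop[i], fit[i] = trial, tf
            return pop[np.argmin(fit)], fit.min()

        sphere = lambda x: float(np.sum(x ** 2))
        best_x, best_f = de_rand_1_bin(sphere, np.array([[-5.0, 5.0]] * 4))
        print(best_x, best_f)                                   # converges towards the origin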

  8. Programs for Data Reduction and Optimization of the System Work

    Science.gov (United States)

    Breus, V. V.

    In recent years, several new computer programs have been developed; three of them are described in this article. The "Variable Stars Calculator" was developed for processing photometric observations of variable stars. It helps the observer at each step, from converting brightness estimates into stellar magnitudes to searching for a period of brightness variation, PCA analysis, searching for extrema by polynomial approximation, etc. The program has Ukrainian, Russian and English interface languages, and it is possible to add new ones. "PolarObs" was developed for processing polarimetric observations obtained at the 2.6-m Shain telescope of the Crimean Astrophysical Observatory. It has been used for processing observations both of cataclysmic variable stars and of comets. "TrayDog" is a system tool for Windows with more than 50 functions. It includes an enhanced task manager that can view and edit the properties of processes, windows, libraries, threads, network ports and open files. Other functions include switching between desktops by hot-key, minimizing any window to the system tray area, displaying system information, blocking pop-ups of any kind, viewing and connecting to network shared resources, an alarm clock and other functions. The interface of the current version is only in Russian. These and some other programs can be downloaded from http://uavso.org.ua/breus

  9. Matrix sketching for big data reduction (Conference Presentation)

    Science.gov (United States)

    Ezekiel, Soundararajan; Giansiracusa, Michael

    2017-05-01

    Abstract: In recent years, the concept of Big Data has become a more prominent issue as the volume of data, as well as the velocity at which it is produced, increases exponentially. By 2020 the amount of data being stored is estimated to be 44 Zettabytes, and currently over 31 Terabytes of data are being generated every second. Algorithms and applications must be able to scale effectively to the volume of data being generated. One such application designed to work effectively and efficiently with Big Data is IBM's Skylark. Part of DARPA's XDATA program (an open-source catalog of tools for dealing with Big Data), Skylark, or Sketching-based Matrix Computations for Machine Learning, is a library of functions designed to reduce the complexity of large-scale matrix problems that also implements kernel-based machine learning tasks. Sketching reduces the dimensionality of matrices through randomization and compresses matrices while preserving key properties, speeding up computations. Matrix sketches can be used to find accurate solutions to computations in less time, or can summarize data by identifying important rows and columns. In this paper, we investigate the effectiveness of sketched matrix computations using IBM's Skylark versus non-sketched computations. We judge effectiveness based on several factors: computational complexity and validity of outputs. Initial results from testing with smaller matrices are promising, showing that Skylark has a considerable reduction ratio while still accurately performing matrix computations.
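
    The core idea of sketching can be illustrated independently of Skylark's API: project a tall matrix with a random sketching matrix and solve the much smaller problem. The sketch below uses a plain Gaussian projection for an overdetermined least-squares problem; the sizes and sketch dimension are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(0)
        n, d, k = 20000, 50, 500                     # tall problem; sketch size k << n
        A = rng.normal(size=(n, d))
        b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

        # exact least-squares solution
        x_exact = np.linalg.lstsq(A, b, rcond=None)[0]

        # sketched solution: compress the rows with a k-by-n Gaussian sketching matrix
        S = rng.normal(size=(k, n)) / np.sqrt(k)
        x_sketch = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]

        rel_err = np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact)
        print(rel_err)                               # small: the sketch preserves the solution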

  10. On Efficient Data Reduction for Network Partition Forecasting in WSNs

    Directory of Open Access Journals (Sweden)

    Faisal Karim Shaikh

    2011-04-01

    Full Text Available WSNs (Wireless Sensor Networks) are generally deployed for long-lived missions. However, they rely on finite energy resources, which leads to network partitioning. Network partitioning limits the dependability of a WSN by making relevant spatial regions disconnected, thus requiring maintenance of the network. Network maintenance necessitates early warning, and consequently forecasting, of the network partitioning, so that early action can be taken to mitigate the problem. There exist approaches allowing for the detection of network partitioning, but none for its forecasting. We present ParFor (Partition Forecasting), an efficient approach for proactive network partition forecasting based on energy maps. ParFor implements spatial and temporal suppression mechanisms such that from energy-weak regions only a few nodes report short alarms to the sink. Using these alarms the forecasting is done centrally at the sink. Using simulations we highlight the efficiency and accuracy of ParFor.

  11. The 7-Item Generalized Anxiety Disorder Scale as a Tool for Measuring Generalized Anxiety in Multiple Sclerosis

    OpenAIRE

    Terrill, Alexandra L.; Hartoonian, Narineh; Beier, Meghan; Salem, Rana; Alschuler, Kevin

    2015-01-01

    Background: Generalized anxiety disorder (GAD) is common in multiple sclerosis (MS) but understudied. Reliable and valid measures are needed to advance clinical care and expand research in this area. The objectives of this study were to examine the psychometric properties of the 7-item Generalized Anxiety Disorder Scale (GAD-7) in individuals with MS and to analyze correlates of GAD.

  12. Textbook-Bundled Metacognitive Tools: A Study of LearnSmart's Efficacy in General Chemistry

    Science.gov (United States)

    Thadani, Vandana; Bouvier-Brown, Nicole C.

    2016-01-01

    College textbook publishers increasingly bundle sophisticated technology-based study tools with their texts. These tools appear promising, but empirical work on their efficacy is needed. We examined whether LearnSmart, a study tool bundled with McGraw-Hill's textbook "Chemistry" (Chang & Goldsby, 2013), improved learning in an…

  14. A general tool for evaluating high-contrast coronagraphic telescope performance error budgets

    Science.gov (United States)

    Marchen, Luis F.; Shaklan, Stuart B.

    2009-08-01

    This paper describes a general purpose Coronagraph Performance Error Budget (CPEB) tool that we have developed under the NASA Exoplanet Exploration Program. The CPEB automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. It operates in 3 steps: first, a CodeV or Zemax prescription is converted into a MACOS optical prescription. Second, a MATLAB program calls ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled coarse and fine-steering mirrors. Third, the sensitivity matrices are imported by macros into Excel 2007, where the error budget is created. Once created, the user specifies the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions and combines them with the sensitivity matrices to generate an error budget for the system. The user can easily modify the motion allocations to perform trade studies.
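
    The bookkeeping at the heart of such an error budget (combining allocated motions with linear sensitivity matrices and accumulating the contributions) can be sketched in a few lines of Python. The matrix shapes and values below are placeholders, and the root-sum-square combination is an assumption of this sketch, not necessarily how the CPEB spreadsheet combines terms.

        import numpy as np

        rng = np.random.default_rng(0)
        n_modes, n_terms = 12, 6          # placeholder: optic/pointing motions x budget terms

        # linear sensitivities: contrast contribution per unit motion of each degree of freedom
        sensitivity = np.abs(rng.normal(1e-12, 5e-13, size=(n_terms, n_modes)))

        # allocated rms thermal/jitter motions for each degree of freedom (placeholder units)
        motions = np.full(n_modes, 0.1)

        # root-sum-square the contributions within each term, then across terms
        term_contrast = np.sqrt(np.sum((sensitivity * motions) ** 2, axis=1))
        total_contrast = np.sqrt(np.sum(term_contrast ** 2))
        print(term_contrast, total_contrast)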

  15. Current Events via Electronic Media: An Instructional Tool in a General Education Geology Course

    Science.gov (United States)

    Flood, T. P.

    2008-12-01

    St. Norbert College (SNC) is a liberal arts college in the Green Bay metropolitan area with an enrollment of approximately 2100 students. All students are required to take one science course with a laboratory component as part of the general education program. Approximately 40% of all SNC students take introductory geology. Class size for this course is approximately 35 students. Each faculty member teaches one section per semester in a smart classroom. A synthesis of current events via electronic media is an excellent pedagogical tool for the introductory geology course. An ongoing informal survey of my introductory geology class indicates that between 75 and 85% of all students in the class, mostly freshmen and sophomores, do not follow the news on a regular basis in any format, i.e. print, internet, or television. Consequently, most are unaware of current scientific topics, events, trends, and their relevance. To address this issue, and to develop a positive habit of mind, a technique called In-the-News-Making-News (INMN) is employed. Each class period begins with a scientifically related (mostly geology) online news article displayed on an overhead screen. The articles are drawn from a variety of sources that include international sites such as the BBC and CBC; national sites such as PBS, the New York Times, and CNN; and local sites such as the Milwaukee Journal Sentinel and the Green Bay Press Gazette. After perusing the article, additional information is often acquired via Google to help supplement and clarify the original article. An interactive discussion follows. Topics that are typically covered include global climate change, basic scientific and technological discoveries, paleontology/evolution, natural disasters, mineral/energy/water resources, funding for science, space exploration, and others. Ancillary areas that are often touched on in the conversation include ethics, politics, economics, philosophy, education, geography, and culture. INMN addresses

  16. The general alcoholics anonymous tools of recovery: the adoption of 12-step practices and beliefs.

    Science.gov (United States)

    Greenfield, Brenna L; Tonigan, J Scott

    2013-09-01

    Working the 12 steps is widely prescribed for Alcoholics Anonymous (AA) members, although the relative merits of different methods for measuring step work have received minimal attention and even less is known about how step work predicts later substance use. The current study (1) compared endorsements of step work on a face-valid, or direct, measure, the Alcoholics Anonymous Inventory (AAI), with an indirect measure of step work, the General Alcoholics Anonymous Tools of Recovery (GAATOR); (2) evaluated the underlying factor structure of the GAATOR and changes in step work over time; (3) examined changes in the endorsement of step work over time; and (4) investigated how, if at all, 12-step work predicted later substance use. New AA affiliates (N = 130) completed assessments at intake, 3, 6, and 9 months. Significantly more participants endorsed step work on the GAATOR than on the AAI for nine of the 12 steps. An exploratory factor analysis revealed a two-factor structure for the GAATOR, comprising behavioral step work and spiritual step work. Behavioral step work did not change over time, but was predicted by having a sponsor, while spiritual step work decreased over time and increases were predicted by attending 12-step meetings or treatment. Behavioral step work did not prospectively predict substance use. In contrast, spiritual step work predicted percent days abstinent. Behavioral step work and spiritual step work appear to be conceptually distinct components of step work that have distinct predictors and unique impacts on outcomes. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  17. Micro-Arcsec mission: implications of the monitoring, diagnostic and calibration of the instrument response in the data reduction chain. .

    Science.gov (United States)

    Busonero, D.; Gai, M.

    The goals of 21st century high angular precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contributions to the overall error budget be less than the desired micro-arcsec level precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We refer to the framework of the SIM-Lite and Gaia missions, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We focus our discussion on the Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of the Astrometric Instrument Modelling for the Gaia mission, highlighting their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.

  18. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external...

  19. Mille general practice governance (MilleGPG): an interactive tool to address an effective quality of care through the Italian general practice network.

    Science.gov (United States)

    Cricelli, Iacopo; Lapi, Francesco; Montalbano, Carmelo; Medea, Gerardo; Cricelli, Claudio

    2013-10-01

    The General Practitioner (GP) is the "gate-keeper" in patients' treatment and management. In this context, the use of Electronic Medical Records (EMRs) can provide effective support for GPs. Software capable of managing EMRs is available, and it can support the adoption of treatment guidelines by means of computerized prompt and reminder systems. These tools can also be programmed to include clinical algorithms with which to measure the quality of care, making it possible to identify clinical issues and to take action to address them. Given that similar tools were not available in Italy, we developed MilleGPG, an interactive tool aimed at evaluating, and subsequently improving, the quality of care among patients with comorbidities.

  20. Examination of skin lesions for cancer : Which clinical decision aids and tools are available in general practice?

    NARCIS (Netherlands)

    Koelink, Cecile J. L.; Jonkman, Marcel F.; Van der Meer, Klaas; Van der Heide, Wouter K.

    2014-01-01

    Background While skin cancer incidence is rising throughout Europe, general practitioners (GPs) feel unsure about their ability to diagnose skin malignancies. Objectives To evaluate whether the GP has sufficient validated clinical decision aids and tools for the examination of potentially malignant

  1. The generalized mathematical model of the failure of the cutting tool

    Science.gov (United States)

    Pasko, N. I.; Antsev, A. V.; Antseva, N. V.; Fyodorov, V. P.

    2017-02-01

    We offer a mathematical model which takes into account the following factors: the spread of the cutting properties of the tool, the spread of the gear-blank parameters, and the possibility of fracture of the cutting wedge of the tool. The reliability function taking these factors into account has five parameters, and we propose a method for their assessment based on our experience. A numerical illustration of the method is given in the article. We suggest using the model to optimize preventive maintenance of the cutting tool.

  2. Streamlining an IRAF data reduction process Pythonically with Astropy and NDMapper

    Science.gov (United States)

    Turner, James

    2016-03-01

    In the course of re-writing my typical top-level GMOS-IFU data reduction sequence in Python for a research project, I have developed a small module that helps express the scientific process in a relatively intuitive way as a Pythonic series of operations on NDData collections, mapped to files, with existing IRAF steps integrated almost seamlessly (pending their eventual replacement). For scientific end-user purposes, this experiment aims to obviate the need for pipeline machinery, favouring simple control flow in the main script and retaining a smooth transition from high-level process description to lower-level libraries by encapsulating the necessary bookkeeping within the data representation and simple wrappers. The I/O abstraction should make support for file formats other than FITS (e.g. ASDF) straightforward to add. This work-in-progress can be found at https://github.com/jehturner/ndmapper and I intend to split its functionality involving IRAF or instrument processing into a separate "ndprocess" module as the prototype nears completion, leaving a core "ndmapper" package, without any special dependencies, as a general add-on for nddata.
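
    The flavour of expressing a reduction step as a plain Python operation over NDData collections can be illustrated as follows; this is only a hand-rolled sketch using astropy.nddata, not the ndmapper API, and the bias level and metadata keys are made up.

        import numpy as np
        from astropy.nddata import NDData, StdDevUncertainty

        def subtract_bias(frames, bias_level=100.0):
            """Toy reduction step: subtract a constant bias from each frame,
            carrying the uncertainties and metadata through unchanged."""
            return [NDData(data=f.data - bias_level,
                           uncertainty=f.uncertainty,
                           meta=dict(f.meta, BIASSUB=True))
                    for f in frames]

        rng = np.random.default_rng(0)
        frames = [NDData(data=rng.normal(100.0, 5.0, (64, 64)),
                         uncertainty=StdDevUncertainty(np.full((64, 64), 5.0)),
                         meta={"OBJECT": "test"})
                  for _ in range(3)]
        reduced = subtract_bias(frames)
        print(reduced[0].data.mean())   # ~0 after bias subtraction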

  3. Using Data Reduction Methods To Predict Quality Of Life In Breast ...

    African Journals Online (AJOL)

    However, there are usually many factors that make fitting the models and making predictions difficult. ... A method of data reduction was used to reduce the number of predictors. ... Regression showed that only role function, social function and diarrhea were ...

  4. Gemini Planet Imager Observational Calibrations I: Overview of the GPI Data Reduction Pipeline

    CERN Document Server

    Perrin, Marshall D; Ingraham, Patrick; Savransky, Dmitry; Millar-Blanchaer, Max; Wolff, Schuyler G; Ruffio, Jean-Baptiste; Wang, Jason J; Draper, Zachary H; Sadakuni, Naru; Marois, Christian; Rajan, Abhijith; Fitzgerald, Michael P; Macintosh, Bruce; Graham, James R; Doyon, René; Larkin, James E; Chilcote, Jeffrey K; Goodsell, Stephen J; Palmer, David W; Labrie, Kathleen; Beaulieu, Mathilde; De Rosa, Robert J; Greenbaum, Alexandra Z; Hartung, Markus; Hibon, Pascale; Konopacky, Quinn; Lafreniere, David; Lavigne, Jean-Francois; Marchis, Franck; Patience, Jenny; Pueyo, Laurent; Rantakyrö, Fredrik T; Soummer, Rémi; Sivaramakrishnan, Anand; Thomas, Sandrine; Ward-Duong, Kimberly; Wiktorowicz, Sloane

    2014-01-01

    The Gemini Planet Imager (GPI) has as its science instrument an infrared integral field spectrograph/polarimeter (IFS). Integral field spectrographs are scientifically powerful but require sophisticated data reduction systems. For GPI to achieve its scientific goals of exoplanet and disk characterization, IFS data must be reconstructed into high quality astrometrically and photometrically accurate datacubes in both spectral and polarization modes, via flexible software that is usable by the broad Gemini community. The data reduction pipeline developed by the GPI instrument team to meet these needs is now publicly available following GPI's commissioning. This paper, the first of a series, provides a broad overview of GPI data reduction, summarizes key steps, and presents the overall software framework and implementation. Subsequent papers describe in more detail the algorithms necessary for calibrating GPI data. The GPI data reduction pipeline is open source, available from planetimager.org, and will continue...

  5. A New Comprehensive Short-form Health Literacy Survey Tool for Patients in General

    Directory of Open Access Journals (Sweden)

    Tuyen Van Duong, RN, MSN, PhD

    2017-03-01

    Conclusion: The comprehensive HL-SF12 was a valid and easy-to-use tool for assessing patients’ health literacy in hospitals, helping healthcare providers to enhance patients’ health literacy and the quality of healthcare.

  6. Usefulness of a virtual community of practice and web 2.0 tools for general practice training: experiences and expectations of general practitioner registrars and supervisors.

    Science.gov (United States)

    Barnett, Stephen; Jones, Sandra C; Bennett, Sue; Iverson, Don; Bonney, Andrew

    2013-01-01

    General practice training is a community of practice in which novices and experts share knowledge. However, there are barriers to knowledge sharing for general practitioner (GP) registrars, including geographic and workplace isolation. Virtual communities of practice (VCoP) can be effective in overcoming these barriers using social media tools. The present study examined the perceived usefulness, features and barriers to implementing a VCoP for GP training. Following a survey study of GP registrars and supervisors on VCoP feasibility, a qualitative telephone interview study was undertaken within a regional training provider. Participants with the highest Internet usage in the survey study were selected. Two researchers worked independently conducting thematic analysis using manual coding of transcriptions, later discussing themes until agreement was reached. Seven GP registrars and three GP supervisors participated in the study (average age 38.2 years). Themes emerged regarding professional isolation, the potential of social media tools to provide peer support and improve knowledge sharing, and barriers to usage, including time, access and skills. Frequent Internet-using GP registrars and supervisors perceive a VCoP for GP training as a useful tool to overcome professional isolation through improved knowledge sharing. Given that professional isolation can lead to decreased rural work and reduced hours, a successful VCoP may have a positive outcome on the rural medical workforce.

  7. MACHINE TOOL OPERATOR--GENERAL, ENTRY, SUGGESTED GUIDE FOR A TRAINING COURSE.

    Science.gov (United States)

    RONEY, MAURICE W.; AND OTHERS

    THE PURPOSE OF THIS CURRICULUM GUIDE IS TO ASSIST THE ADMINISTRATOR AND INSTRUCTOR IN PLANNING AND DEVELOPING MANPOWER DEVELOPMENT AND TRAINING PROGRAMS TO PREPARE MACHINE TOOL OPERATORS FOR ENTRY-LEVEL POSITIONS. THE COURSE OUTLINE PROVIDES UNITS IN -- (1) ORIENTATION, (2) BENCH WORK, (3) SHOP MATHEMATICS, (4) BLUEPRINT READING AND SKETCHING, (5)…

  8. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery is a high-risk area for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registration of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to the general surgery department of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to harm and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools, and the Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS; these differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  9. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  10. A clinical trial alert tool to recruit large patient samples and assess selection bias in general practice research

    Directory of Open Access Journals (Sweden)

    Scheidt-Nave Christa

    2011-02-01

    Full Text Available Abstract Background Many research projects in general practice face problems when recruiting patients, often resulting in low recruitment rates and an unknown selection bias, thus limiting their value for health services research. The objective of the study is to evaluate the recruitment performance of the practice staff in 25 participating general practices when using a clinical trial alert (CTA) tool. Methods The CTA tool was developed for an osteoporosis survey of patients at risk for osteoporosis and fractures. The tool used data from electronic patient records (EPRs) to automatically identify the population at risk (net sample), to apply eligibility criteria, to contact eligible patients, and to enrol and survey at least 200 patients per practice. The effects of the CTA intervention were evaluated on the basis of recruitment efficiency and selection bias. Results The CTA tool identified a net sample of 16,067 patients (range 162 to 1,316 per practice), of which the practice staff reviewed 5,161 (32%) cases for eligibility. They excluded 3,248 patients and contacted 1,913 patients. Of these, 1,526 patients (range 4 to 202 per practice) were successfully enrolled and surveyed. This made up 9% of the net sample and 80% of the patients contacted. Men and older patients were underrepresented in the study population. Conclusion Although the recruitment target was unreachable for most practices, the practice staff in the participating practices used the CTA tool successfully to identify, document and survey a large patient sample. The tool also helped the research team to precisely determine a slight selection bias.

  11. Outcome prioritisation tool for medication review in older patients with multimorbidity: a pilot study in general practice

    OpenAIRE

    van Summeren, Jojanneke JGT; Schuling, Jan; Haaijer-Ruskamp, Flora M.; Denig, Petra

    2017-01-01

    Background Several methods have been developed to conduct and support medication reviews in older persons with multimorbidity. Assessing the patient’s priorities for achieving specific health outcomes can guide the medication review process. Little is known about the impact of conducting such assessments. Aim This pilot study aimed to determine proposed and observed medication changes when using an outcome prioritisation tool (OPT) during a medication review in general practice. Design and se...

  12. A System Identification Software Tool for General MISO ARX-Type of Model Structures

    OpenAIRE

    Lindskog, Peter

    1996-01-01

    The typical system identification procedure requires powerful and versatile software means. In this paper we describe and exemplify the use of a prototype identification software tool, applicable to the rather broad class of multi-input single-output model structures with regressors that are formed by delayed in- and outputs. Interesting special instances of this model structure category include, e.g., linear ARX and many semi-physical structures, feed-forward neural networks, radial basis fu...
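
    The regressor-of-delayed-inputs-and-outputs idea behind ARX structures is easy to demonstrate with a generic least-squares fit; the sketch below is a standard textbook ARX estimator, not the prototype tool described in the paper.

        import numpy as np

        def fit_arx(u, y, na=2, nb=2):
            """Least-squares ARX fit of y[k] = -a1*y[k-1]-...-a_na*y[k-na]
                                              + b1*u[k-1]+...+b_nb*u[k-nb]."""
            n = max(na, nb)
            rows = [np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]])
                    for k in range(n, len(y))]
            theta = np.linalg.lstsq(np.array(rows), y[n:], rcond=None)[0]
            return theta[:na], theta[na:]            # (a-coefficients, b-coefficients)

        # simulate a known second-order ARX system and recover its coefficients
        rng = np.random.default_rng(0)
        u = rng.normal(size=2000)
        y = np.zeros_like(u)
        for k in range(2, len(u)):
            y[k] = 1.5*y[k-1] - 0.7*y[k-2] + 1.0*u[k-1] + 0.5*u[k-2] + 0.01*rng.normal()
        a, b = fit_arx(u, y)
        print(a, b)   # approximately [-1.5, 0.7] and [1.0, 0.5]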

  13. The Differential Phase Experiment: experimental concept, design analysis, and data reduction analysis

    Science.gov (United States)

    Tyler, Glenn A.; Brennan, Terry J.; Browne, Stephen L.; Dueck, Robert H.; Lodin, Michael S.; Roberts, Phillip H.; Vaughn, Jeffrey L.

    1997-08-01

    This paper describes the differential phase experiment (DPE), which formed a major part of the ABLE ACE suite of experiments conducted by the Air Force. The work described covers the rationale for the experiment, the basic experimental concept, the analysis of the differential phase, the optical and software design analysis, a discussion of the polarization scrambling characteristics of the optics, calibration of the equipment, and a presentation of some of the major results of the data reduction effort to date. The DPE was a propagation experiment conducted between two aircraft flying at an altitude of 40,000 feet, whose purpose was to measure the phase difference between two beams propagating at slightly different angles through the atmosphere. A four-bin polarization interferometer was used to measure the differential phase. Due to the high level of scintillation that was present, branch points were present in the phase function. Rytov theory, wave optics simulation and the experimental measurements are in general agreement. Self-consistency checks that were performed on the data indicate a high level of confidence in the results. Values of Cn2 that are consistent with the measurements of the differential phase agree with simultaneous scintillometer measurements taken along the same path in levels of turbulence where the scintillometer is not saturated. These differential-phase-based Cn2 estimates do not appear to saturate, as is typical of scintillometer measurements, and appear to extend the range over which high levels of Cn2 can be estimated. In addition, the differential phase and anisoplanatic Strehl computed from the data are consistent with Rytov theory and wave optics simulations.

  14. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    Science.gov (United States)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model

  15. A new approach in data reduction proper handling of random errors and image distortions

    CERN Document Server

    Cardiel, N; Gallego, J; Serrano, A; Zamorano, J; García-Vargas, M L; Gómez-Cambronero, P; Filgueira, J M

    2002-01-01

    Data reduction procedures are aimed at minimizing the impact of data acquisition imperfections on the measurement of data properties with a scientific meaning for the astronomer. To achieve this purpose, appropriate arithmetic manipulations with data and calibration frames must be performed. Furthermore, a full understanding of all the possible measurements relies on solid constraints on their associated errors. We discuss different strategies for obtaining realistic determinations of final random errors. In particular, we highlight the benefits of considering the data reduction process as the full characterization of the raw-data frames, while avoiding, as far as possible, the arithmetic manipulation of those data until the final measurement and analysis of the image properties. This philosophy will be used in the pipeline data reduction for ELMER and EMIR.
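
    The philosophy of characterizing errors alongside the data can be illustrated by carrying a variance frame with every data frame and updating it analytically at each arithmetic step. The snippet below uses the standard first-order propagation rules for subtraction and division on synthetic frames; it is a minimal sketch, not the ELMER/EMIR pipeline code.

        import numpy as np

        class Frame:
            """A data array paired with its variance array."""
            def __init__(self, data, var):
                self.data = np.asarray(data, float)
                self.var = np.asarray(var, float)

            def __sub__(self, other):        # e.g. bias or sky subtraction
                return Frame(self.data - other.data, self.var + other.var)

            def __truediv__(self, other):    # e.g. flat-fielding (first-order propagation)
                ratio = self.data / other.data
                var = ratio**2 * (self.var / self.data**2 + other.var / other.data**2)
                return Frame(ratio, var)

        rng = np.random.default_rng(0)
        raw  = Frame(rng.normal(1000.0, 30.0, (32, 32)), np.full((32, 32), 30.0**2))
        bias = Frame(np.full((32, 32), 200.0),           np.full((32, 32), 4.0))
        flat = Frame(rng.normal(1.0, 0.01, (32, 32)),    np.full((32, 32), 0.01**2))

        reduced = (raw - bias) / flat
        print(reduced.data.mean(), np.sqrt(reduced.var).mean())   # value and 1-sigma error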

  16. The Gemini Recipe System: A Dynamic Workflow for Automated Data Reduction

    Science.gov (United States)

    Labrie, K.; Hirst, P.; Allen, C.

    2011-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. Building on concepts that originated in ORAC-DR, a data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps called Primitives. The Primitives are written in Python and can be launched from the PyRAF user interface by users wishing for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines. All observatory or instrument specific definitions are isolated from the core of the AstroData system and distributed in external configuration packages that define a lexicon including classifications, uniform metadata elements, and transformations.

  17. Adapting generalization tools to physiographic diversity for the united states national hydrography dataset

    Science.gov (United States)

    Buttenfield, B.P.; Stanislawski, L.V.; Brewer, C.A.

    2011-01-01

    This paper reports on generalization and data modeling to create reduced-scale versions of the National Hydrography Dataset (NHD) for dissemination through The National Map, the primary data delivery portal for USGS. Our approach distinguishes local differences in physiographic factors, to demonstrate that knowledge about varying terrain (mountainous, hilly or flat) and varying climate (dry or humid) can support decisions about algorithms, parameters, and processing sequences to create generalized, smaller-scale data versions which preserve distinct hydrographic patterns in these regions. We work with multiple subbasins of the NHD that provide a range of terrain and climate characteristics. Specifically tailored generalization sequences are used to create simplified versions of the high-resolution data, which was compiled for 1:24,000-scale mapping. Results are evaluated cartographically and metrically against a medium-resolution benchmark version compiled for 1:100,000, developing coefficients of linear and areal correspondence.

  18. Evaluation of the generalized gamma as a tool for treatment planning optimization

    Directory of Open Access Journals (Sweden)

    Emmanouil I Petrou

    2014-12-01

    Full Text Available Purpose: The aim of this work is to study the theoretical behavior and merits of the Generalized Gamma (generalized dose-response gradient), as well as to investigate the usefulness of this concept in practical radiobiological treatment planning. Methods: In this study, the treatment planning system RayStation 1.9 (RaySearch Laboratories AB, Stockholm, Sweden) was used. Furthermore, radiobiological models that provide the tumor control probability (TCP), normal tissue complication probability (NTCP), complication-free tumor control probability (P+) and the Generalized Gamma were employed. The Generalized Gammas of TCP and NTCP, respectively, were calculated for given heterogeneous dose distributions to different organs in order to verify the TCP and NTCP computations of the treatment planning system. In this process, a treatment plan was created where the target and the organs at risk were included in the same ROI, in order to check the validity of the system regarding the objective function P+ and the Generalized Gamma. Subsequently, six additional treatment plans were created with the target organ and the organs at risk placed in the same or different ROIs. In these plans, the mean dose was increased in order to investigate the effect of dose changes on tissue response and on the Generalized Gamma before and after the change in dose. By theoretically calculating these quantities, the agreement of different theoretical expressions with the values that the treatment planning system provides could be evaluated. Finally, the relative error between the real and approximate response values using the Poisson and the Probit models, for the case of a target organ consisting of two compartments in a parallel architecture and with the same number of clonogens, could be investigated and quantified. Results: The computations of RayStation regarding the values of the Generalized Gamma and the objective function (P+) were verified by using an
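
    For orientation, the normalized dose-response gradient underlying the Generalized Gamma can be computed numerically for any dose-response curve as gamma(D) = D * dP/dD. The sketch below does this for a simple Poisson TCP model; the model form and parameter values are illustrative only, not those used in the study or in RayStation.

        import numpy as np

        def poisson_tcp(dose, n0=1e7, alpha=0.3):
            """Poisson TCP model for a uniform dose (linear cell kill, beta term neglected)."""
            return np.exp(-n0 * np.exp(-alpha * dose))

        dose = np.linspace(0.1, 100.0, 2000)            # Gy
        tcp = poisson_tcp(dose)

        # normalized dose-response gradient gamma(D) = D * dP/dD, evaluated numerically
        gamma = dose * np.gradient(tcp, dose)

        i = int(np.argmax(gamma))
        print(f"max gamma ~ {gamma[i]:.2f} at D ~ {dose[i]:.1f} Gy")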

  19. Clinical diagnosis of influenza virus infection : evaluation of diagnostic tools in general practice

    NARCIS (Netherlands)

    van Elden, LJR; van Essen, GA; Boucher, CAB; van Loon, AM; Nijhuis, M; Schipper, P; Verheij, TJM; Hoepelman, IM

    2001-01-01

    Background: With the development of new antiviral agents for influenza, the need for rapid and reliable diagnosis of influenza becomes increasingly important. Respiratory virus infections are difficult to distinguish on clinical grounds. General practitioners (GPs), however, still depend on their clini

  20. Application of Generalized Mie Theory to EELS Calculations as a Tool for Optimization of Plasmonic Structures

    DEFF Research Database (Denmark)

    Thomas, Stefan; Matyssek, Christian; Hergert, Wolfram

    2015-01-01

    Technical applications of plasmonic nanostructures require a careful structural optimization with respect to the desired functionality. The success of such optimizations strongly depends on the applied method. We extend the generalized multiparticle Mie (GMM) computational electromagnetic method ...... by the application of genetic algorithms combined with a simplex algorithm. The scheme is applied to the design of plasmonic filters....

  1. Evaluation of data reduction methods for dynamic PET series based on Monte Carlo techniques and the NCAT phantom

    Energy Technology Data Exchange (ETDEWEB)

    Thireou, Trias [Biomedical Engineering Laboratory, National Technical University of Athens, Athens (Greece): Institute of Computer Science, Foundation for Research and Technology Hellas, Heraklion (Greece); Rubio Guivernau, Jose Luis [E.T.S.I. de Telecomunicacion, Universidad Politecnica de Madrid, Madrid (Spain); Atlamazoglou, Vassilis [Biophysics Laboratory, Foundation of Biomedical Research of the Academy of Athens, Athens (Greece); Ledesma, Maria Jesus [E.T.S.I. de Telecomunicacion, Universidad Politecnica de Madrid, Madrid (Spain); Pavlopoulos, Sotiris [Biomedical Engineering Laboratory, National Technical University of Athens, Athens (Greece); Santos, Andres [E.T.S.I. de Telecomunicacion, Universidad Politecnica de Madrid, Madrid (Spain); Kontaxakis, George [E.T.S.I. de Telecomunicacion, Universidad Politecnica de Madrid, Madrid (Spain)]. E-mail: g.kontaxakis@upm.es

    2006-12-20

    A realistic dynamic positron-emission tomography (PET) thoracic study was generated, using the 4D NURBS-based (non-uniform rational B-splines) cardiac-torso (NCAT) phantom and a sophisticated model of the PET imaging process, simulating two solitary pulmonary nodules. Three data reduction and blind source separation methods were applied to the simulated data: principal component analysis, independent component analysis and similarity mapping. All methods reduced the initial amount of image data to a smaller, comprehensive and easily managed set of parametric images, where structures were separated based on their different kinetic characteristics and the lesions were readily identified. The results indicate that the above-mentioned methods can provide an accurate tool for the support of both visual inspection and subsequent detailed kinetic analysis of the dynamic series via compartmental or non-compartmental models.
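
    The principal component analysis step can be sketched by treating the dynamic series as a frames-by-voxels matrix and taking its leading singular vectors as parametric images. The example below uses a small synthetic two-component data set, not the NCAT simulation, purely to show the mechanics.

        import numpy as np

        rng = np.random.default_rng(0)
        n_frames, nx = 20, 64
        t = np.linspace(0.5, 60.0, n_frames)[:, None]      # frame mid-times (min)

        # two synthetic kinetic behaviours (fast washout vs slow uptake), mixed per voxel
        washout = np.exp(-t / 10.0)
        uptake = 1.0 - np.exp(-t / 30.0)
        mix = rng.random((1, nx * nx))
        series = washout * mix + uptake * (1 - mix) + 0.02 * rng.normal(size=(n_frames, nx * nx))

        # PCA via SVD of the mean-subtracted frames-by-voxels matrix
        centered = series - series.mean(axis=0)
        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        parametric_images = vt[:2].reshape(2, nx, nx)       # first two component images
        print("explained variance:", (s[:2] ** 2 / np.sum(s ** 2)).round(3))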

  2. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis.

    Science.gov (United States)

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-09-17

    Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further
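
    As a concrete illustration of the item-frequency comparisons described above (a sketch using counts quoted in the abstract; the choice of a chi-squared contingency test here is ours and, with such small cells, an exact test may be preferable):

    # Allocation concealment: included in 7/19 general-health tools vs 5/7 PT tools
    from scipy.stats import chi2_contingency

    included_gh, total_gh = 7, 19
    included_pt, total_pt = 5, 7
    table = [
        [included_gh, total_gh - included_gh],   # general health: included / not included
        [included_pt, total_pt - included_pt],   # physical therapy: included / not included
    ]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, p={p:.3f}")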

  3. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    Science.gov (United States)

    2013-01-01

    Background Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to

  4. EEG after sleep deprivation is a sensitive tool in the first diagnosis of idiopathic generalized but not focal epilepsy.

    Science.gov (United States)

    Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa

    2016-01-01

    Electroencephalography (EEG) is an essential tool in the diagnosis of epilepsy. EEG after sleep deprivation might increase the likelihood of finding specific epileptiform abnormalities. However, conflicting data exist concerning the sensitivity and specificity of this method. We aimed to evaluate the role of EEG after sleep deprivation in the first diagnosis of epilepsy. We analyzed retrospectively the medical histories of patients who underwent at least one unspecific standard EEG and a subsequent EEG after sleep deprivation during the time period from 2001 to 2014 at the University Hospital Zurich because of suspected epilepsy. Out of 237 patients who fulfilled all inclusion criteria, 69 were finally diagnosed with epilepsy. Seventeen of them showed interictal epileptiform patterns in EEGs after sleep deprivation, giving this method an overall sensitivity of 25%. Sensitivity of EEG after sleep deprivation was superior in patients with primary generalized epilepsies compared to patients with focal epilepsies (64% vs. 17%, p=0.0011). Overall, EEG after sleep deprivation was not more sensitive than a subsequent repeated standard EEG in a subgroup of 55 patients (22% vs. 9%; p=0.065). After an unspecific standard EEG, EEG after sleep deprivation is a useful tool to increase diagnostic sensitivity in patients with idiopathic generalized epilepsy but not in those with focal epilepsy. This study provides further evidence about the usefulness of EEG after sleep deprivation as an additional diagnostic tool in epilepsy. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Space-Based Gravitational-Wave Observations as Tools for Testing General Relativity

    Science.gov (United States)

    Will, Clifford M.

    2004-01-01

    We continued a project to analyse the ways in which detection and study of gravitational waves could provide quantitative tests of general relativity, with particular emphasis on waves that would be detectable by space-based observatories, such as LISA. This work had three foci: 1) Tests of scalar-tensor theories of gravity that could be done by analyzing gravitational waves from neutron stars inspiralling into massive black holes, as detectable by LISA; 2) Study of alternative theories of gravity in which the graviton could be massive, and of how gravitational-wave observations by space-based detectors, solar-system tests, and cosmological observations could constrain such theories; and 3) Study of gravitational-radiation back reaction of particles orbiting black holes in general relativity, with emphasis on the effects of spin.

  6. The electronic patient record as a meaningful audit tool - Accountability and autonomy in general practitioner work

    DEFF Research Database (Denmark)

    Winthereik, Brit Ross; van der Ploeg, I.; Berg, Marc

    2007-01-01

    Health authorities increasingly request that general practitioners (GPs) use information and communication technologies such as electronic patient records (EPR) for accountability purposes. This article deals with the use of EPRs among general practitioners in Britain. It examines two ways in which...... GPs use the EPR for accountability purposes. One way is to generate audit reports on the basis of the information that has been entered into the record. The other is to let the computer intervene in the clinical process through prompts. The article argues that GPs' ambivalence toward using the EPR...... requests to document one's work. Instead, new forms of autonomy are produced in the sociotechnical network that is made up by health policy and local engagements with patients and technology....

  7. Molecfit: A general tool for telluric absorption correction. I. Method and application to ESO instruments

    CERN Document Server

    Smette, A; Noll, S; Horst, H; Kausch, W; Kimeswenger, S; Barden, M; Szyszka, C; Jones, A M; Gallenne, A; Vinther, J; Ballester, P; Taylor, J

    2015-01-01

    Context: The interaction of the light from astronomical objects with the constituents of the Earth's atmosphere leads to the formation of telluric absorption lines in ground-based collected spectra. Correcting for these lines, mostly affecting the red and infrared region of the spectrum, usually relies on observations of specific stars obtained close in time and airmass to the science targets, therefore using precious observing time. Aims: We present molecfit, a tool for correcting for telluric absorption lines based on synthetic modelling of the Earth's atmospheric transmission. Molecfit is versatile and can be used with data obtained with various ground-based telescopes and instruments. Methods: Molecfit combines a publicly available radiative transfer code, a molecular line database, atmospheric profiles, and various kernels to model the instrument line spread function. The atmospheric profiles are created by merging a standard atmospheric profile representative of a given observatory's climate, of local m...

  8. A Heuristic Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) Authoring Tools

    Science.gov (United States)

    2016-03-01

    This report documents a heuristic evaluation of the authoring experience for users of the Generalized Intelligent Framework for Tutoring (GIFT), whose design goals include reuse and interoperability through standards; these goals were considered in the analysis. Further evaluations will be needed periodically as the technical goals delineated in the accompanying research outlines are achieved. (ARL Special Report ARL-SR-0353, US Army Research Laboratory, March 2016.)

  9. The General Mission Analysis Tool (GMAT): A New Resource for Supporting Debris Orbit Determination, Tracking and Analysis

    Science.gov (United States)

    Jah, Moriba; Huges, Steven; Wilkins, Matthew; Kelecy, Tom

    2009-03-01

    The General Mission Analysis Tool (GMAT) was initially developed at NASA's Goddard Space Flight Center (GSFC) as a high accuracy orbital analysis tool to support a variety of space missions. A formal agreement has recently been established between NASA and the Air Force Research Laboratory (AFRL) to further develop GMAT to include orbit determination (OD) capabilities. A variety of estimation strategies and dynamic models will be included in the new version of GMAT. GMAT will accommodate orbit determination, tracking and analysis of orbital debris through a combination of model, processing and implementation requirements. The GMAT processing architecture natively supports parallel processing, such that it can efficiently accommodate the OD and tracking of numerous objects resulting from breakups. A full first release of the augmented GMAT capability is anticipated in September 2009 and it will be available for community use at no charge.

  10. Data Reduction Functions for the Langley 14- by 22-Foot Subsonic Tunnel

    Science.gov (United States)

    Boney, Andy D.

    2014-01-01

    The Langley 14- by 22-Foot Subsonic Tunnel's data reduction software utilizes six major functions to compute the acquired data. These functions calculate engineering units, tunnel parameters, flowmeters, jet exhaust measurements, balance loads/model attitudes, and model /wall pressures. The input (required) variables, the output (computed) variables, and the equations and/or subfunction(s) associated with each major function are discussed.

  11. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    Science.gov (United States)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  12. General Authorisations as a Tool to Promote Water Allocation Reform in South Africa

    Directory of Open Access Journals (Sweden)

    A. Anderson, G. Quibell, J. Cullis and N. Ncapayi

    2007-09-01

    Full Text Available South Africa faces significant inequities in access to and use of water for productive purposes. The National Water Act seeks to address these inequities and introduced a public rights system where water is owned by the people of South Africa and held in custody by the state. This public trust doctrine forms the basis for the State to give effect to its constitutional obligation for redress. Compulsory licensing is a mechanism to proactively reallocate water on a catchment basis to achieve redress, while at the same time promoting economic efficiency and ecological sustainability. During compulsory licensing, all users are required to reapply for their water use entitlement, and a process is followed to allow for a fairer allocation of water between competing users and sectors. Some concerns have been raised that equity may not be achieved through compulsory licensing as historically disadvantaged individuals may not have the capacity to partake in the process. Similarly, the administrative burden of processing large numbers of licences from small scale users may cripple licensing authorities. Moreover, the compulsory licensing process, while encouraging Historically Disadvantaged Individuals (HDIs) to apply, may have little impact on poverty if the poorest are not able to participate in the process. General authorisations are proposed as a way of addressing these concerns by setting water aside for specific categories of users. This paper introduces the concept of general authorisations in support of compulsory licensing and outlines some of the implementation challenges.

  13. Refined composite multivariate generalized multiscale fuzzy entropy: A tool for complexity analysis of multichannel signals

    Science.gov (United States)

    Azami, Hamed; Escudero, Javier

    2017-01-01

    Multiscale entropy (MSE) is an appealing tool to characterize the complexity of time series over multiple temporal scales. Recent developments in the field have tried to extend the MSE technique in different ways. Building on these trends, we propose the so-called refined composite multivariate multiscale fuzzy entropy (RCmvMFE) whose coarse-graining step uses variance (RCmvMFEσ²) or mean (RCmvMFEμ). We investigate the behavior of these multivariate methods on multichannel white Gaussian and 1/f noise signals, and two publicly available biomedical recordings. Our simulations demonstrate that RCmvMFEσ² and RCmvMFEμ lead to more stable results and are less sensitive to the signals' length in comparison with the other existing multivariate multiscale entropy-based methods. The classification results also show that using both the variance and mean in the coarse-graining step offers complexity profiles with complementary information for biomedical signal analysis. We also made freely available all the Matlab codes used in this paper.
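
    A minimal sketch of the coarse-graining step that distinguishes the mean- and variance-based variants (illustrative code, not the authors' released Matlab implementation):

    import numpy as np

    def coarse_grain(x, scale, statistic="mean"):
        """Replace non-overlapping windows of length `scale` by their mean or variance."""
        x = np.asarray(x, dtype=float)
        n = (len(x) // scale) * scale            # drop the incomplete tail window
        windows = x[:n].reshape(-1, scale)
        if statistic == "mean":                  # classical MSE-style coarse graining
            return windows.mean(axis=1)
        if statistic == "var":                   # the sigma^2 variant described above
            return windows.var(axis=1, ddof=1)
        raise ValueError("statistic must be 'mean' or 'var'")

    rng = np.random.default_rng(1)
    signal = rng.standard_normal(1000)           # stand-in for one channel
    print(coarse_grain(signal, 5).shape, coarse_grain(signal, 5, "var").shape)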

  14. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology.

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-11-06

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. A clinical audit consists of measuring a clinical outcome or a process against well-defined standards set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings.

  15. A General Tool for Engineering the NAD/NADP Cofactor Preference of Oxidoreductases.

    Science.gov (United States)

    Cahn, Jackson K B; Werlang, Caroline A; Baumschlager, Armin; Brinkmann-Chen, Sabine; Mayo, Stephen L; Arnold, Frances H

    2017-02-17

    The ability to control enzymatic nicotinamide cofactor utilization is critical for engineering efficient metabolic pathways. However, the complex interactions that determine cofactor-binding preference render this engineering particularly challenging. Physics-based models have been insufficiently accurate and blind directed evolution methods too inefficient to be widely adopted. Building on a comprehensive survey of previous studies and our own prior engineering successes, we present a structure-guided, semirational strategy for reversing enzymatic nicotinamide cofactor specificity. This heuristic-based approach leverages the diversity and sensitivity of catalytically productive cofactor binding geometries to limit the problem to an experimentally tractable scale. We demonstrate the efficacy of this strategy by inverting the cofactor specificity of four structurally diverse NADP-dependent enzymes: glyoxylate reductase, cinnamyl alcohol dehydrogenase, xylose reductase, and iron-containing alcohol dehydrogenase. The analytical components of this approach have been fully automated and are available in the form of an easy-to-use web tool: Cofactor Specificity Reversal-Structural Analysis and Library Design (CSR-SALAD).

  16. Upgrade of the Cellular General Purpose Monte Carlo Tool FOAM to version 2.06

    CERN Document Server

    Jadach, Stanislaw

    2006-01-01

    FOAM-2.06 is an upgraded version of FOAM, a general purpose, self-adapting Monte Carlo event generator. In comparison with FOAM-2.05, it has two important improvements. A new interface to random numbers lets the user choose from three "state of the art" random number generators. Improved algorithms for the simplicial grid need less computer memory; the problem of the prohibitively large memory allocation required for a large number ($>10^6$) of simplicial cells is now eliminated -- the new version can handle such cases even on average desktop computers. In addition, generation of the Monte Carlo events, in the case of a large number of cells, may even be significantly faster.

  17. Atomicrex—a general purpose tool for the construction of atomic interaction models

    Science.gov (United States)

    Stukowski, Alexander; Fransson, Erik; Mock, Markus; Erhart, Paul

    2017-07-01

    We introduce atomicrex, an open-source code for constructing interatomic potentials as well as more general types of atomic-scale models. Such effective models are required to simulate extended materials structures comprising many thousands of atoms or more, because electronic structure methods become computationally too expensive at this scale. atomicrex covers a wide range of interatomic potential types and fulfills many needs in atomistic model development. As inputs, it supports experimental property values as well as ab initio energies and forces, to which models can be fitted using various optimization algorithms. The open architecture of atomicrex allows it to be used in custom model development scenarios beyond classical interatomic potentials, while thanks to its Python interface it can be readily integrated with, e.g., electronic structure calculations or machine learning algorithms.
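
    atomicrex is driven by its own input files and Python interface; purely as a generic illustration of the kind of least-squares objective such a fitting code minimises, one can fit a toy pair potential to (here synthetic) reference dimer energies:

    import numpy as np
    from scipy.optimize import curve_fit

    def lennard_jones(r, epsilon, sigma):
        return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

    # Synthetic "ab initio" reference energies for a dimer (assumed values)
    rng = np.random.default_rng(2)
    r_ref = np.linspace(0.9, 2.5, 40)
    e_ref = lennard_jones(r_ref, 0.2, 1.0) + 0.002 * rng.standard_normal(r_ref.size)

    params, cov = curve_fit(lennard_jones, r_ref, e_ref, p0=[0.1, 1.1])
    print("fitted epsilon, sigma:", params)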

  18. Wormholes in spacetime and their use for interstellar travel: A tool for teaching general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Morris, M.S.; Thorne, K.S.

    1988-05-01

    Rapid interstellar travel by means of spacetime wormholes is described in a way that is useful for teaching elementary general relativity. The description touches base with Carl Sagan's novel Contact, which, unlike most science fiction novels, treats such travel in a manner that accords with the best 1986 knowledge of the laws of physics. Many objections are given against the use of black holes or Schwarzschild wormholes for rapid interstellar travel. A new class of solutions of the Einstein field equations is presented, which describe wormholes that, in principle, could be traversed by human beings. It is essential in these solutions that the wormhole possess a throat at which there is no horizon; and this property, together with the Einstein field equations, places an extreme constraint on the material that generates the wormhole's spacetime curvature: in the wormhole's throat that material must possess a radial tension $\tau_0$ with the enormous magnitude $\tau_0 \approx$ (pressure at the center of the most massive of neutron stars) $\times$ (20 km)$^2$/(circumference of throat)$^2$. Moreover, this tension must exceed the material's density of mass-energy, $\rho_0 c^2$. No known material has this $\tau_0 > \rho_0 c^2$ property, and such material would violate all the "energy conditions" that underlie some deeply cherished theorems in general relativity. However, it is not possible today to rule out firmly the existence of such material; and quantum field theory gives tantalizing hints that such material might, in fact, be possible.
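
    Restated compactly as a display equation (with $\mathcal{C}$ the throat circumference and $p_{\rm NS}$ the central pressure of the most massive neutron stars; the symbols are ours, the relation is the one quoted above):

    $$\tau_0 \;\approx\; p_{\rm NS}\,\left(\frac{20\ \mathrm{km}}{\mathcal{C}}\right)^{2}, \qquad \tau_0 \;>\; \rho_0\, c^{2}.$$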

  19. A Useful Tool As a Medical Checkup in a General Population—Bioelectrical Impedance Analysis

    Science.gov (United States)

    Enomoto, Mika; Adachi, Hisashi; Fukami, Ako; Kumagai, Eita; Nakamura, Sachiko; Nohara, Yume; Kono, Shoko; Nakao, Erika; Morikawa, Nagisa; Tsuru, Tomoko; Sakaue, Akiko; Fukumoto, Yoshihiro

    2017-01-01

    Accumulation of visceral fat leads to metabolic syndrome and increases the risk of cerebro-cardiovascular diseases, which should be recognized and improved at an early stage in the general population. Accurate measurement of visceral fat area (VFA) is commonly performed on an abdominal cross-sectional image obtained by computed tomography, which is, however, limited by the radiation exposure. The bioelectrical impedance analysis (OMRON HDS-2000 DUALSCAN®) has recently been developed to measure VFA and is a more easily accessible modality. In the present study, we investigated the clinical usefulness of the DUALSCAN® in 226 subjects who received a health examination, including blood chemistries, electrocardiography, and cardiac and carotid ultrasonography. VFA was measured within just 5 min. The average VFA was 83.5 ± 36.3 cm² in men and 64.8 ± 28.0 cm² in women, and it was correlated with weight (r = 0.7404, p < 0.0001), body mass index (BMI) (r = 0.7320, p < 0.0001), and waist circumference (r = 0.7393, p < 0.0001). In multivariate analyses, VFA was significantly associated with weight (p < 0.0001), BMI (p < 0.0001), and waist circumference (p < 0.0001). Compared to the group with smaller waist and normal BMI, VFA was significantly increased (p < 0.0001) in the group with larger waist and obese subjects. In conclusion, these results indicate that the DUALSCAN® is useful for measuring VFA easily in the general population, even in a large number of subjects. PMID:28210619

  20. Reliability and validity of the Evaluation Tool of Children's Handwriting-Cursive (ETCH-C) using the general scoring criteria.

    Science.gov (United States)

    Duff, Sharon; Goyen, Traci-Anne

    2010-01-01

    To determine the reliability and aspects of validity of the Evaluation Tool of Children's Handwriting-Cursive (ETCH-C; Amundson, 1995), using the general scoring criteria, when assessing children who use alternative writing scripts. Children in Years 5 and 6 with handwriting problems and a group of matched control participants from their respective classrooms were assessed with the ETCH-C twice, 4 weeks apart. Total Letter scores were most reliable; more variability should be expected for Total Word scores. Total Numeral scores showed unacceptable reliability levels and are not recommended. We found good discriminant validity for Letter and Word scores and established cutoff scores to distinguish children with and without handwriting dysfunction (Total Letter <90%, Total Word <85%). The ETCH-C, using the general scoring criteria, is a reliable and valid test of handwriting for children using alternative scripts.

  1. ROSE: The Design of a General Tool for the Independent Optimization of Object-Oriented Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Davis, K.; Philip, B.; Quinlan, D.

    1999-05-18

    framework. The interface to ROSE is particularly simple and takes advantage of standard compiler technology. ROSE acts like a preprocessor, since it must parse standard C++; because its use is optional, it cannot be used to introduce any new language features. ROSE reads standard C++ source code and outputs standard C++ code. Its use is always optional, by design: so as not to interfere with and to remain consistent with the object-oriented framework. It is a mechanism to introduce optimizations only; adding language features using ROSE is by design no more possible than within the framework itself. Importantly, since ROSE generates C++ code it does not preclude the use of other tools or mechanisms that would work with an application source code (including template mechanisms).

  2. A tool to measure whether business management capacity in general practice impacts on the quality of chronic illness care.

    Science.gov (United States)

    Holton, Christine H; Proudfoot, Judith G; Jayasinghe, Upali W; Grimm, Jane; Bubner, Tanya K; Winstanley, Julie; Harris, Mark F; Beilby, Justin J

    2010-11-01

    Our aim was to develop a tool to identify specific features of the business and financial management of practices that facilitate better quality care for chronic illness in primary care. Domains of management were identified, resulting in the development of a structured interview tool that was administered in 97 primary care practices in Australia. Interview items were screened and subjected to factor analysis, subscales identified and the overall model fit determined. The instrument's validity was assessed against another measure of quality of care. Analysis provided a four-factor solution containing 21 items, which explained 42.5% of the variance in the total scores. The factors related to administrative processes, human resources, marketing analysis and business development. All scores increased significantly with practice size. The business development subscale and total score were higher for rural practices. There was a significant correlation between the business development subscale and quality of care. The indicators of business and financial management in the final tool appear to be useful predictors of the quality of care. The instrument may help inform policy regarding the structure of general practice and implementation of a systems approach to chronic illness care. It can provide information to practices about areas for further development.

  3. Overview of the SOFIA Data Cycle System: An integrated set of tools and services for the SOFIA General Investigator

    CERN Document Server

    Shuping, R Y; Lin, Lan; Sun, Li; Krzaczek, Robert

    2013-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprised of a 2.5 meter infrared telescope mounted in the aft section of a Boeing 747SP aircraft that flies at operational altitudes between 37,000 and 45,000 feet, above 99% of atmospheric water vapor. During routine operations, a host of instruments will be available to the astronomical community, including cameras and spectrographs in the near- to far-IR; a sub-mm heterodyne receiver; and a high-speed occultation imager. One of the challenges for SOFIA (and all observatories in general) is providing a uniform set of tools that enable the non-expert General Investigator (GI) to propose, plan, and obtain observations using a variety of very different instruments in an easy and seamless manner. The SOFIA Data Cycle System (DCS) is an integrated set of services and user tools for the SOFIA Science and Mission Operations GI Program designed to address this challenge. Program activities supported by the DCS inclu...

  4. [Evaluation of general health status by SF-36 tool in Hip Osteoarthritis].

    Science.gov (United States)

    Angulo Tabernero, María; Fernández Letamendi, Teresa; Aguilar Ezquerra, Andres; Ungria Murillo, Julia; Panisello Sebastia, Juan José; Agudo, Jesús Mateo

    2014-01-01

    Objective: To determine the general health status perceived by patients undergoing total hip arthroplasty with a mini-stem. Material and Methods: The SF-36 health questionnaire was administered to assess the health status perceived by 13 male patients, with a mean age of 46.62 (34-53) years, in whom an uncemented MiniHip® (CorinMedical) total hip arthroplasty had been implanted, after a mean follow-up of 23.2 (12-47) months. The results obtained were compared with the reference values of the Spanish population for men aged 45 to 54 years. Results: Differences were found in aspects such as physical and emotional role, physical function, social function and pain, whereas for the remaining items the scores were similar to those of the reference population. Discussion: There is a need to know to what degree our interventions affect the patient's quality of life and how it is perceived by the patient, in order to complement the results of our interventions. Conclusion: A new perspective is needed for the functional and quality-of-life assessment of young patients undergoing total hip arthroplasty.

  5. General consumer communication tools for improved image management and communication in medicine.

    Science.gov (United States)

    Rosset, Chantal; Rosset, Antoine; Ratib, Osman

    2005-12-01

    We elected to explore new technologies emerging on the general consumer market that can improve and facilitate image and data communication in medical and clinical environments. These new technologies developed for communication and storage of data can improve user convenience and facilitate the communication and transport of images and related data beyond the usual limits and restrictions of a traditional picture archiving and communication systems (PACS) network. We specifically tested and implemented three new technologies provided on Apple computer platforms. (1) We adopted the iPod, an MP3 portable player with hard disk storage, to easily and quickly move large numbers of DICOM images. (2) We adopted iChat, a videoconference and instant-messaging software, to transmit DICOM images in real time to a distant computer for conferencing teleradiology. (3) Finally, we developed a direct secure interface to use the iDisk service, a file-sharing service based on the WebDAV technology, to send and share DICOM files between distant computers. These three technologies were integrated in a new open-source image navigation and display software called OsiriX, allowing for manipulation and communication of multimodality and multidimensional DICOM image data sets. This software is freely available as an open-source project at http://homepage.mac.com/rossetantoine/OsiriX. Our experience showed that the implementation of these technologies allowed us to significantly enhance the existing PACS with valuable new features without any additional investment or the need for complex extensions of our infrastructure. The added features such as teleradiology, secure and convenient image and data communication, and the use of external data storage services open the gate to a much broader extension of our imaging infrastructure to the outside world.

  6. PISCES High Contrast Integral Field Spectrograph Simulations and Data Reduction Pipeline

    Science.gov (United States)

    Llop Sayson, Jorge Domingo; Memarsadeghi, Nargess; McElwain, Michael W.; Gong, Qian; Perrin, Marshall; Brandt, Timothy; Grammer, Bryan; Greeley, Bradford; Hilton, George; Marx, Catherine

    2015-01-01

    The PISCES (Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies) is a lenslet array based integral field spectrograph (IFS) designed to advance the technology readiness of the WFIRST (Wide Field Infrared Survey Telescope)-AFTA (Astrophysics Focused Telescope Assets) high contrast Coronagraph Instrument. We present the end to end optical simulator and plans for the data reduction pipeline (DRP). The optical simulator was created with a combination of the IDL (Interactive Data Language)-based PROPER (optical propagation) library and Zemax (a MatLab script), while the data reduction pipeline is a modified version of the Gemini Planet Imager's (GPI) IDL pipeline. The simulations of the propagation of light through the instrument are based on Fourier transform algorithms. The DRP enables transformation of the PISCES IFS data to calibrated spectral data cubes.
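
    The full simulator relies on the PROPER library and Zemax; the toy sketch below only illustrates the Fourier-transform idea behind such propagation, namely that a Fraunhofer propagation from an (assumed circular) pupil to the focal plane is an FFT whose squared modulus gives the point spread function:

    import numpy as np

    n = 256
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    pupil = (np.hypot(x, y) <= n // 4).astype(float)   # unobstructed circular aperture

    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2
    psf /= psf.sum()                                   # normalise total energy
    print(psf.shape, psf.max())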

  7. Development and Performance Analysis of a Lossless Data Reduction Algorithm for VoIP

    Directory of Open Access Journals (Sweden)

    Syed Misbahuddin

    2014-01-01

    Full Text Available VoIP (Voice over IP) is becoming an alternative way of voice communications over the Internet. To better utilize voice call bandwidth, some standard compression algorithms are applied in VoIP systems. However, these algorithms affect the voice quality at high compression ratios. This paper presents a lossless data reduction technique to improve the VoIP data transfer rate over the IP network. The proposed algorithm exploits the data redundancies in digitized VFs (Voice Frames) generated by VoIP systems. The performance of the proposed data reduction algorithm is presented in terms of compression ratio. The proposed algorithm helps retain the voice quality along with the improvement in VoIP data transfer rates.
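
    The abstract does not spell out the algorithm; as a hedged sketch of the general idea of lossless redundancy removal in digitized voice frames, one can delta-encode consecutive samples and run-length encode the frequent zero differences:

    import numpy as np

    def encode(samples):
        deltas = np.diff(samples, prepend=0)        # first token carries the absolute value
        out, run = [], 0
        for d in deltas:
            if d == 0:
                run += 1
            else:
                if run:
                    out.append(("Z", run))          # run of zero deltas
                    run = 0
                out.append(("D", int(d)))
        if run:
            out.append(("Z", run))
        return out

    def decode(tokens):
        deltas = []
        for kind, value in tokens:
            deltas.extend([0] * value if kind == "Z" else [value])
        return np.cumsum(deltas)

    frame = np.array([100, 100, 100, 101, 101, 99, 99, 99, 99])
    tokens = encode(frame)
    assert np.array_equal(decode(tokens), frame)    # lossless round trip
    print(tokens)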

  8. The QuickReduce data reduction pipeline for the WIYN One Degree Imager

    CERN Document Server

    Kotulla, Ralf

    2013-01-01

    Optimizing one's observing strategy while at the telescope relies on knowing the current observing conditions and the obtained data quality. In particular the latter is not straightforward with current wide-field imagers, such as the WIYN One Degree Imager (ODI), currently consisting of 13 detectors, each of them read out in 64 independent cells. Here we present a fast data reduction software for ODI, optimized for a first data inspection during acquisition at the telescope, but capable enough for science-quality data reductions. The pipeline is coded in pure python with minimal additional requirements. It is installed on the ODI observer's interface and publicly available from the author's webpage. It performs all basic reduction steps as well as more advanced corrections for pupil-ghost removal, fringe correction and masking of persistent pixels. Additional capabilities include adding an accurate astrometric WCS solution based on the 2MASS reference system as well as photometric zeropoint calibration f...

  9. The Mid-Infrared Instrument for the James Webb Space Telescope, X. Operations and Data Reduction

    CERN Document Server

    Gordon, Karl D; Anderson, Rachel E; Azzollini, Ruyman; Bergeron, L; Bouchet, Patrice; Bouwman, Jeroen; Cracraft, Misty; Fischer, Sebastian; Friedman, Scott D; Garcia-Marin, Macarena; Glasse, Alistair; Glauser, Adrian M; Goodson, G B; Greene, T P; Hines, Dean C; Khorrami, M A; Lahuis, Fred; Lajoie, C -P; Meixner, M E; Morrison, Jane E; O'Sullivan, Brian; Pontoppidan, K M; Regan, M W; Ressler, M E; Rieke, G H; Scheithauer, Silvia; Walker, Helen; Wright, G S

    2015-01-01

    We describe the operations concept and data reduction plan for the Mid-Infrared Instrument (MIRI) for the James Webb Space Telescope (JWST). The overall JWST operations concept is to use Observation Templates (OTs) to provide a straightforward and intuitive way for users to specify observations. MIRI has four OTs that correspond to the four observing modes: 1) Imaging, 2) Coronagraphy, 3) Low Resolution Spectroscopy, and 4) Medium Resolution Spectroscopy. We outline the user choices and expansion of these choices into detailed instrument operations. The data reduction plans for MIRI are split into three stages, where the specificity of the reduction steps to the observation type increases with stage. The reduction starts with integration ramps: stage 1 yields uncalibrated slope images; stage 2 calibrates the slope images; and then stage 3 combines multiple calibrated slope images into high level data products (e.g. mosaics, spectral cubes, and extracted source information). Finally, we give examples of t...
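
    A minimal sketch of the stage 1 "ramps to slopes" idea (ordinary least squares per pixel; a real pipeline adds jump detection, weighting and saturation handling, and this is not the MIRI pipeline code):

    import numpy as np

    def ramps_to_slopes(ramp, frame_time):
        """ramp : ndarray, shape (n_reads, ny, nx) of accumulating counts."""
        t = np.arange(ramp.shape[0]) * frame_time
        t_centred = t - t.mean()
        data_centred = ramp - ramp.mean(axis=0, keepdims=True)
        # Vectorised least-squares slope (counts per second) for every pixel at once
        return np.tensordot(t_centred, data_centred, axes=(0, 0)) / (t_centred ** 2).sum()

    rng = np.random.default_rng(3)
    truth = rng.uniform(0, 50, size=(8, 8))                      # counts/s per pixel
    reads = truth[None] * (np.arange(10)[:, None, None] * 2.8)   # 10 reads, 2.8 s apart
    reads = reads + rng.normal(0, 3, size=reads.shape)           # add read noise
    print(np.allclose(ramps_to_slopes(reads, 2.8), truth, atol=1.0))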

  10. A marked bounding box method for image data reduction and reconstruction of sole patterns

    Science.gov (United States)

    Wang, Xingyue; Wu, Jianhua; Zhao, Qingmin; Cheng, Jian; Zhu, Yican

    2012-01-01

    A novel and efficient method called marked bounding box method based on marching cubes is presented for the point cloud data reduction of sole patterns. This method is characterized in that each bounding box is marked with an index during the process of data reduction and later for use of data reconstruction. The data reconstruction is implemented from the simplified data set by using triangular meshes, the indices being used to search the nearest points from adjacent bounding boxes. Afterwards, the normal vectors are estimated to determine the strength and direction of the surface reflected light. The proposed method is used in a sole pattern classification and query system which uses OpenGL under Visual C++ to render the image of sole patterns. Digital results are given to demonstrate the efficiency and novelty of our method. Finally, conclusion and discussions are made.
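
    The paper's marked-bounding-box method is more involved; the sketch below only illustrates the underlying idea of box indexing, i.e. quantising points to boxes, keeping one representative per occupied box, and retaining the box index so neighbouring boxes can be searched during reconstruction:

    import numpy as np

    def reduce_by_boxes(points, box_size):
        idx = np.floor(points / box_size).astype(np.int64)        # 3-D box index per point
        keys, inverse = np.unique(idx, axis=0, return_inverse=True)
        counts = np.bincount(inverse)
        centroids = np.zeros((len(keys), points.shape[1]))
        for dim in range(points.shape[1]):
            centroids[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
        return keys, centroids                                    # box indices + reduced cloud

    rng = np.random.default_rng(4)
    cloud = rng.uniform(0, 10, size=(5000, 3))                    # stand-in for scanned sole points
    keys, reduced = reduce_by_boxes(cloud, box_size=0.5)
    print(len(cloud), "->", len(reduced), "points")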

  12. Participants' evaluation of a group-based organisational assessment tool in Danish general practice: the Maturity Matrix.

    Science.gov (United States)

    Buch, Martin Sandberg; Edwards, Adrian; Eriksson, Tina

    2009-01-01

    The Maturity Matrix is a group-based formative self-evaluation tool aimed at assessing the degree of organisational development in general practice and providing a starting point for local quality improvement. Earlier studies of the Maturity Matrix have shown that participants find the method a useful way of assessing their practice's organisational development. However, little is known about participants' views on the resulting efforts to implement intended changes. To explore users' perspectives on the Maturity Matrix method, the facilitation process, and drivers and barriers for implementation of intended changes. Observation of two facilitated practice meetings, 17 semi-structured interviews with participating general practitioners (GPs) or their staff, and mapping of reasons for continuing or quitting the project. Setting: general practices in Denmark. Main outcomes: Successful change was associated with: a clearly identified anchor person within the practice, a shared and regular meeting structure, and an external facilitator who provides support and counselling during the implementation process. Failure to implement change was associated with: a high patient-related workload, staff or GP turnover (that seemed to affect small practices more), no clearly identified anchor person or anchor persons who did not do anything, no continuous support from an external facilitator, and no formal commitment to working with agreed changes. Future attempts to improve the impact of the Maturity Matrix, and similar tools for quality improvement, could include: (a) attention to matters of variation caused by practice size, (b) systematic counselling on barriers to implementation and support to structure the change processes, (c) a commitment from participants that goes beyond participation in two-yearly assessments, and (d) an anchor person for each identified goal who takes on the responsibility for improvement in practice.

  13. FIRBACK Far Infrared Survey with ISO Data Reduction, Analysis and First Results

    CERN Document Server

    Dole, H; Puget, J L; Aussel, H; Bouchet, F R; Ciliegi, C; Clements, D L; Césarsky, C J; Désert, F X; Elbaz, D; Franceschini, A; Gispert, R; Guiderdoni, B; Harwit, M; Laureijs, R J; Lemke, D; McMahon, R; Moorwood, A F M; Oliver, S; Reach, W T; Rowan-Robinson, M; Stickel, M; Dole, Herve; Lagache, Guilaine; Puget, Jean-Loup

    1999-01-01

    FIRBACK is one of the deepest cosmological surveys performed in the far infrared, using ISOPHOT. We describe this survey, its data reduction and analysis. We present the maps of fields at 175 microns. We point out some first results: source identifications with radio and mid infrared, and source counts at 175 microns. These two results suggest that half of the FIRBACK sources are probably at redshifts greater than 1. We also present briefly the large follow-up program.

  14. Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems

    Science.gov (United States)

    Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.

    2011-01-01

    The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.

  15. Efficient Data Reduction Techniques for Remote Applications of a Wireless Visual Sensor Network

    Directory of Open Access Journals (Sweden)

    Khursheed Khursheed

    2013-05-01

    Full Text Available A Wireless Visual Sensor Network (WVSN) is formed by deploying many Visual Sensor Nodes (VSNs) in the field. After acquiring an image of the area of interest, the VSN performs local processing on it and transmits the result using an embedded wireless transceiver. Wireless data transmission consumes a great deal of energy, where energy consumption is mainly dependent on the amount of information being transmitted. The image captured by the VSN contains a huge amount of data. For certain applications, segmentation can be performed on the captured images. The amount of information in the segmented images can be reduced by applying efficient bi-level image compression methods. In this way, the communication energy consumption of each of the VSNs can be reduced. However, the data reduction capability of bi-level image compression standards is fixed and is limited by the used compression algorithm. For applications exhibiting few changes between adjacent frames, change coding can be applied for further data reduction. Detecting and compressing only the Regions of Interest (ROIs) in the change frame is another possibility for further data reduction. In a communication system where both the sender and the receiver know the employed compression standard, there is a possibility for further data reduction by not including the header information in the compressed bit stream of the sender. This paper summarizes different information reduction techniques such as image coding, change coding and ROI coding. The main contribution is the investigation of the combined effect of all these coding methods and their application to a few representative real-life applications. This paper is intended to be a resource for researchers interested in techniques for information reduction in energy-constrained embedded applications.
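
    A minimal sketch of change coding followed by ROI extraction on binary segmented frames (frame contents and sizes are illustrative; the compression of the extracted patch itself is omitted):

    import numpy as np

    def change_roi(prev_frame, frame):
        change = np.logical_xor(prev_frame, frame)     # change frame
        if not change.any():
            return None                                # nothing to transmit
        rows = np.flatnonzero(change.any(axis=1))
        cols = np.flatnonzero(change.any(axis=0))
        r0, r1 = rows[0], rows[-1] + 1
        c0, c1 = cols[0], cols[-1] + 1
        return (r0, r1, c0, c1), change[r0:r1, c0:c1]  # bounding box + ROI patch

    prev = np.zeros((64, 64), dtype=bool)
    curr = prev.copy()
    curr[20:24, 30:40] = True                          # a small new object
    box, patch = change_roi(prev, curr)
    print("ROI", box, "changed pixels:", int(patch.sum()))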

  16. S-Preconditioner for Multi-fold Data Reduction with Guaranteed User-Controlled Accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Ye; Lakshminarasimhan, Sriram; Shah, Neil; Gong, Zhenhuan; Chang, C. S.; Chen, Jacqueline H.; Ethier, Stephane; Kolla, Hemanth; Ku, Seung-Hoe; Klasky, S.; Latham, Robert J.; Ross, Rob; Schuchardt, Karen L.; Samatova, Nagiza F.

    2011-12-14

    The growing gap between the massive amounts of data generated by petascale scientific simulation codes and the capability of system hardware and software to effectively analyze this data necessitates data reduction. Yet, the increasing data complexity challenges most, if not all, of the existing data compression methods. In fact, lossless compression techniques offer no more than 10% reduction on scientific data that we have experience with, which is widely regarded as effectively incompressible. To bridge this gap, in this paper, we advocate a transformative strategy that enables fast, accurate, and multi-fold reduction of double-precision floating-point scientific data. The intuition behind our method is inspired by an effective use of preconditioners for linear algebra solvers optimized for a particular class of computational dwarfs (e.g., dense or sparse matrices). Focusing on a commonly used multi-resolution wavelet compression technique as the underlying solver for data reduction, we propose the S-preconditioner, which transforms scientific data into a form with high global regularity to ensure a significant decrease in the number of wavelet coefficients stored for a segment of data. Combined with the subsequent EQ-calibrator, our resultant method (called S-Preconditioned EQ-Calibrated Wavelets (SPEQC-WAVELETS)) robustly achieved a 4- to 5-fold data reduction while guaranteeing user-defined accuracy of reconstructed data to be within 1% point-by-point relative error, lower than 0.01 Normalized RMSE, and higher than 0.99 Pearson Correlation. In this paper, we show the results we obtained by testing our method on six petascale simulation codes including fusion, combustion, climate, astrophysics, and subsurface groundwater in addition to 13 publicly available scientific datasets. We also demonstrate that application-driven data mining tasks performed on decompressed variables or their derived quantities produce results of comparable quality with the ones for
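
    The S-preconditioner itself is not reproduced here; the sketch below only illustrates the general pattern of wavelet thresholding with an explicit user-controlled point-wise error check, using a one-level Haar transform (thresholding each detail coefficient against the local approximation keeps the relative error below roughly tol/(1-tol)):

    import numpy as np

    def haar_forward(x):
        pairs = x.reshape(-1, 2)
        return pairs.mean(axis=1), (pairs[:, 0] - pairs[:, 1]) / 2.0

    def haar_inverse(approx, detail):
        out = np.empty(approx.size * 2)
        out[0::2] = approx + detail
        out[1::2] = approx - detail
        return out

    rng = np.random.default_rng(5)
    data = np.cumsum(rng.standard_normal(1 << 12)) + 100.0       # smooth-ish test signal
    approx, detail = haar_forward(data)

    tol = 0.01                                                   # 1% point-wise target
    detail_kept = np.where(np.abs(detail) < tol * np.abs(approx), 0.0, detail)
    recon = haar_inverse(approx, detail_kept)

    rel_err = np.abs(recon - data) / np.maximum(np.abs(data), 1e-12)
    print("kept detail coefficients:", np.count_nonzero(detail_kept), "/", detail.size)
    print("max point-wise relative error:", rel_err.max())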

  17. Peer mentoring of telescope operations and data reduction at Western Kentucky University

    Science.gov (United States)

    Williams, Joshua; Carini, M. T.

    2014-01-01

    Peer mentoring plays an important role in the astronomy program at Western Kentucky University. I will describe how undergraduates teach and mentor other undergraduates the basics of operating our 0.6m telescope and data reduction (IRAF) techniques. This peer to peer mentoring creates a community of undergraduate astronomy scholars at WKU. These scholars bond and help each other with research, coursework, social, and personal issues. This community atmosphere helps to draw in and retain other students interested in astronomy and other STEM careers.

  18. The Wide Field Spectrograph (WiFeS): Performance and Data Reduction

    CERN Document Server

    Dopita, Michael; Farage, Catherine; McGregor, Peter; Bloxham, Gabe; Green, Anthony; Roberts, Bill; Nielson, Jon; Wilson, Greg; Young, Peter; 10.1007/s10509-010-0335-9

    2010-01-01

    This paper describes the on-telescope performance of the Wide Field Spectrograph (WiFeS). The design characteristics of this instrument, built at the Research School of Astronomy and Astrophysics (RSAA) of the Australian National University (ANU) and mounted on the ANU 2.3m telescope at the Siding Spring Observatory, have already been described in an earlier paper (Dopita et al. 2007). Here we describe the throughput, resolution and stability of the instrument, and describe some minor issues which have been encountered. We also give a description of the data reduction pipeline, and show some preliminary results.

  19. Laboratory procedures and data reduction techniques to determine rheologic properties of mass flows

    Science.gov (United States)

    Holmes, R.R.; Huizinga, R.J.; Brown, S.M.; Jobson, H.E.

    1993-01-01

    Determining the rheologic properties of coarse-grained mass flows is an important step to mathematically simulate potential inundation zones. Using the vertically rotating flume designed and built by the U.S. Geological Survey, laboratory procedures and subsequent data reduction have been developed to estimate shear stresses and strain rates of various flow materials. Although direct measurement of shear stress and strain rate is currently (1992) not possible in the vertically rotating flume, methods were derived to estimate these values from measurements of flow geometry, surface velocity, and flume velocity.

  20. Data Reduction Pipeline for EMIR, the Near-IR Multi-Object Spectrograph for GTC

    Science.gov (United States)

    Pascual, S.; Gallego, J.; Cardiel, N.; Zamorano, J.; Gorgas, F. J.; García-Dabó, C. E.; Gil de Paz, A.

    2006-07-01

    EMIR is a near-infrared wide-field camera and multi-object spectrograph being built for the 10.4m Spanish telescope (Gran Telescopio Canarias, GTC) at La Palma Observatory. The Data Reduction Pipeline, which is being designed and built by the EMIR Universidad Complutense de Madrid group, will be optimized for handling and reducing near-infrared data acquired with EMIR. Both reduced data and associated error frames will be delivered to the end-users as a final product.

  1. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

    Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, which are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded using GAMS, and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results. Therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization procedures in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization procedures in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or

  2. Selection of key ambient particulate variables for epidemiological studies - applying cluster and heatmap analyses as tools for data reduction.

    Science.gov (United States)

    Gu, Jianwei; Pitz, Mike; Breitner, Susanne; Birmili, Wolfram; von Klot, Stephanie; Schneider, Alexandra; Soentgen, Jens; Reller, Armin; Peters, Annette; Cyrys, Josef

    2012-10-01

    The success of epidemiological studies depends on the use of appropriate exposure variables. The purpose of this study is to extract a relatively small selection of variables characterizing ambient particulate matter from a large measurement data set. The original data set comprised a total of 96 particulate matter variables that have been continuously measured since 2004 at an urban background aerosol monitoring site in the city of Augsburg, Germany. Many of the original variables were derived from measured particle size distribution (PSD) across the particle diameter range 3 nm to 10 μm, including size-segregated particle number concentration, particle length concentration, particle surface concentration and particle mass concentration. The data set was complemented by integral aerosol variables. These variables were measured by independent instruments, including black carbon, sulfate, particle active surface concentration and particle length concentration. It is obvious that such a large number of measured variables cannot be used in health effect analyses simultaneously. The aim of this study is a pre-screening and a selection of the key variables that will be used as input in forthcoming epidemiological studies. In this study, we present two methods of parameter selection and apply them to data from a two-year period from 2007 to 2008. We used the agglomerative hierarchical cluster method to find groups of similar variables. In total, we selected 15 key variables from 9 clusters which are recommended for epidemiological analyses. We also applied a two-dimensional visualization technique called "heatmap" analysis to the Spearman correlation matrix. 12 key variables were selected using this method. Moreover, the positive matrix factorization (PMF) method was applied to the PSD data to characterize the possible particle sources. Correlations between the variables and PMF factors were used to interpret the meaning of the cluster and the heatmap analyses. Copyright © 2012 Elsevier B.V. All rights reserved.
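
    As a sketch of this kind of data reduction (illustrative only, with synthetic data in place of the Augsburg measurements): cluster the variables on a distance of 1 - |Spearman correlation| and keep one representative variable per cluster:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform
    from scipy.stats import spearmanr

    rng = np.random.default_rng(6)
    n_obs = 500
    base = rng.standard_normal((n_obs, 3))                       # three latent "sources"
    X = np.hstack([base + 0.3 * rng.standard_normal((n_obs, 3)) for _ in range(4)])  # 12 variables

    rho, _ = spearmanr(X)                                        # variable-by-variable correlations
    dist = 1.0 - np.abs(rho)
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(Z, t=4, criterion="maxclust")

    representatives = [int(np.flatnonzero(labels == c)[0]) for c in np.unique(labels)]
    print("cluster labels:", labels)
    print("kept variable indices:", representatives)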

  3. Design and capabilities of the MUSE data reduction software and pipeline

    Science.gov (United States)

    Weilbacher, Peter M.; Streicher, Ole; Urrutia, Tanya; Jarno, Aurélien; Pécontal-Rousset, Arlette; Bacon, Roland; Böhm, Petra

    2012-09-01

    MUSE, the Multi Unit Spectroscopic Explorer, is an integral-field spectrograph under construction for the ESO VLT to see first light in 2013. It can record spectra of a 1'x1' field on the sky at a sampling of 0''.2 x 0''.2, over a wavelength range from 4650 to 9300 Å. The data reduction for this instrument is the process which converts raw data from the 24 CCDs into a combined datacube (with two spatial and one wavelength axis) which is corrected for instrumental and atmospheric effects. Since the instrument consists of many subunits (24 integral-field units, each slicing the light into 48 parts, i.e. 1152 regions with a total of almost 90000 spectra per exposure), this task requires many steps and is computationally expensive, in terms of processing speed, memory usage, and disk input/output. The data reduction software is designed to be mostly run as an automated pipeline and to fit into the open source environment of the ESO data flow as well as into a data management system based on AstroWISE. We describe the functionality of the pipeline, highlight details of new and unorthodox processing steps, discuss which algorithms and code could be used from other projects. Finally, we show the performance on both laboratory data as well as simulated scientific data.

  4. Online data reduction with FPGA-based track reconstruction for the Belle II DEPFET pixel detector

    Energy Technology Data Exchange (ETDEWEB)

    Deschamps, Bruno; Wessel, Christian; Marinas, Carlos; Dingfelder, Jochen [Physikalisches Institut, Universitaet Bonn (Germany)

    2016-07-01

    The innermost two layers of the Belle II vertex detector at the KEK facility in Tsukuba, Japan, will be covered by high-granularity DEPFET pixel sensors (PXD). The large number of pixels leads to a maximum data rate of 256 Gbps, which has to be significantly reduced by the Data Acquisition System (DATCON). For the data reduction, the hit information of the surrounding Silicon strip Vertex Detector (SVD) is utilized to define so-called Regions of Interest (ROI); only hit information from pixels located inside these ROIs is saved. The ROIs for the PXD are computed by reconstructing track segments from SVD data and extrapolating them to the PXD. The goal is to achieve a data reduction of at least a factor of 10 with this ROI selection. All the necessary processing stages (receiving, decoding and multiplexing of the SVD data on 48 optical fibers, track reconstruction, and definition of the ROIs) will be performed by the presented system. The planned hardware design is based on a distributed set of Advanced Mezzanine Cards (AMC), each equipped with a Field Programmable Gate Array (FPGA) and 4 optical transceivers. In this talk, the status of and plans for the DATCON prototype and the FPGA-based tracking algorithm are presented, as well as the plans for their test in the upcoming test beam at DESY.
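
    The ROI selection itself reduces to a simple containment test once the ROIs have been extrapolated from SVD tracks. The sketch below is only a schematic illustration with an invented hit layout (sensor id, row, column), not the Belle II data format.

        # Keep only pixel hits that fall inside at least one region of interest.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class ROI:
            sensor_id: int
            row_min: int
            row_max: int
            col_min: int
            col_max: int

            def contains(self, sensor_id, row, col):
                return (sensor_id == self.sensor_id
                        and self.row_min <= row <= self.row_max
                        and self.col_min <= col <= self.col_max)

        def reduce_hits(hits, rois):
            """Return the subset of (sensor_id, row, col) hits lying inside an ROI."""
            return [h for h in hits if any(r.contains(*h) for r in rois)]

        hits = [(1, 10, 20), (1, 400, 50), (2, 5, 5)]
        rois = [ROI(sensor_id=1, row_min=0, row_max=100, col_min=0, col_max=100)]
        print(reduce_hits(hits, rois))   # -> [(1, 10, 20)]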

  5. The DATCON system of the Belle II experiment. Tracking and data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Wessel, Christian; Dingfelder, Jochen; Marinas, Carlos; Deschamps, Bruno [Universitaet Bonn (Germany). Physikalisches Institut

    2016-07-01

    The SuperKEKB e+e- accelerator at KEK in Japan will have a luminosity 40 times higher than that of its predecessor, KEKB. The Belle II detector at SuperKEKB will contain a two-layer pixel detector at radii of 1.421 and 2.179 cm from the interaction point, based on the DEPFET (DEpleted P-channel Field Effect Transistor) technology. It is surrounded by four layers of strip detectors. Due to the high collision rate, the data rate of the pixel detector needs to be drastically reduced by an online data reduction system. The DATCON (Data Acquisition Tracking and Concentrator Online Node) system performs track reconstruction in the SVD (Strip Vertex Detector) and extrapolates the tracks to the PXD (PiXel Detector) to calculate ROIs and keep only the hits inside them. The track reconstruction algorithm is based on a Hough transform, which reduces track finding to finding intersection points in the Hough parameter space. In this talk, the algorithm employed for fast online track reconstruction on FPGAs, the ROI finding, and the performance of the data reduction are presented.
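
    To illustrate the idea behind Hough-transform track finding (not the DATCON firmware itself), the toy sketch below lets every hit vote for all lines through it in a binned (theta, r) parameter space; accumulator maxima correspond to track candidates. Geometry and binning are invented.

        import numpy as np

        def hough_lines(points, n_theta=180, n_r=200, r_max=10.0):
            thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
            r_edges = np.linspace(-r_max, r_max, n_r + 1)
            acc = np.zeros((n_theta, n_r), dtype=int)
            for x, y in points:
                r = x * np.cos(thetas) + y * np.sin(thetas)    # normal form of a line
                idx = np.digitize(r, r_edges) - 1
                ok = (idx >= 0) & (idx < n_r)
                acc[np.arange(n_theta)[ok], idx[ok]] += 1
            return acc, thetas, r_edges

        # three collinear hits plus one noise hit
        points = [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0), (4.0, 0.5)]
        acc, thetas, r_edges = hough_lines(points)
        i, j = np.unravel_index(acc.argmax(), acc.shape)
        print(f"best line: theta = {np.degrees(thetas[i]):.1f} deg, "
              f"r = {0.5 * (r_edges[j] + r_edges[j + 1]):.2f}, votes = {acc[i, j]}")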

  6. Recommendations for autonomous underway pCO2 measuring systems and data-reduction routines

    Science.gov (United States)

    Pierrot, Denis; Neill, Craig; Sullivan, Kevin; Castle, Robert; Wanninkhof, Rik; Lüger, Heike; Johannessen, Truls; Olsen, Are; Feely, Richard A.; Cosca, Catherine E.

    2009-04-01

    In order to facilitate the collection of high-quality and uniform surface water pCO2 data, an underway pCO2 instrument has been designed based on community input and is now commercially available. Along with the instrumentation, agreements were reached on data reduction and quality control that can be easily applied to data from these systems by using custom-made freeware. This new automated underway pCO2 measuring system is designed to be accurate to within 0.1 μatm for atmospheric pCO2 measurements and to within 2 μatm for seawater pCO2, the accuracy targeted by the scientific community to constrain regional air-sea CO2 fluxes to 0.2 Pg C yr^-1. The procedure to properly reduce the underway pCO2 data and to perform the steps necessary for calculating the fugacity of CO2 from the measurements is described. This system is now widely used by the scientific community on many different types of ships. Combined with the recommended data-reduction procedures, it will facilitate producing data sets that will significantly decrease the uncertainty currently present in estimates of air-sea CO2 fluxes.
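
    The conversion from partial pressure to fugacity mentioned above is usually a small correction computed from the CO2 virial coefficients; the sketch below follows the commonly used Weiss (1974) formulation and approximates the (1 - xCO2)^2 factor by one. Input values are illustrative, not measured data, and the coefficients should be checked against the cited recommendations before use.

        import math

        def fco2_from_pco2(pco2_uatm, temp_c, pressure_atm=1.0):
            """Convert CO2 partial pressure (uatm) to fugacity (uatm)."""
            t = temp_c + 273.15                                  # kelvin
            # virial coefficients of CO2 in cm^3/mol (Weiss 1974)
            b = -1636.75 + 12.0408 * t - 3.27957e-2 * t**2 + 3.16528e-5 * t**3
            delta = 57.7 - 0.118 * t
            r = 82.0578                                          # cm^3 atm / (mol K)
            return pco2_uatm * math.exp(pressure_atm * (b + 2.0 * delta) / (r * t))

        print(fco2_from_pco2(380.0, temp_c=20.0))   # fugacity is roughly 0.3-0.4% below pCO2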

  7. The JCMT Gould Belt Survey: A Quantitative Comparison Between SCUBA-2 Data Reduction Methods

    CERN Document Server

    Mairs, S; Kirk, H; Graves, S; Buckle, J; Beaulieu, S F; Berry, D S; Broekhoven-Fiene, H; Currie, M J; Fich, M; Hatchell, J; Jenness, T; Mottram, J C; Nutter, D; Pattle, K; Pineda, J E; Salji, C; Di Francesco, J; Hogerheijde, M R; Ward-Thompson, D

    2015-01-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artifacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software, Starlink, but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth ...

  8. Nulling Data Reduction and On-Sky Performance of the Large Binocular Telescope Interferometer

    CERN Document Server

    Defrère, D; Mennesson, B; Hoffmann, W F; Millan-Gabet, R; Skemer, A J; Bailey, V; Danchi, W C; Downey, E C; Durney, O; Grenz, P; Hill, J M; McMahon, T J; Montoya, M; Spalding, E; Vaz, A; Absil, O; Arbo, P; Bailey, H; Brusa, G; Bryden, G; Esposito, S; Gaspar, A; Haniff, C A; Kennedy, G M; Leisenring, J M; Marion, L; Nowak, M; Pinna, E; Powell, K; Puglisi, A; Rieke, G; Roberge, A; Serabyn, E; Sosa, R; Stapeldfeldt, K; Su, K; Weinberger, A J; Wyatt, M C

    2016-01-01

    The Large Binocular Telescope Interferometer (LBTI) is a versatile instrument designed for high-angular resolution and high-contrast infrared imaging (1.5-13 microns). In this paper, we focus on the mid-infrared (8-13 microns) nulling mode and present its theory of operation, data reduction, and on-sky performance as of the end of the commissioning phase in March 2015. With an interferometric baseline of 14.4 meters, the LBTI nuller is specifically tuned to resolve the habitable zone of nearby main-sequence stars, where warm exozodiacal dust emission peaks. Measuring the exozodi luminosity function of nearby main-sequence stars is a key milestone to prepare for future exoEarth direct imaging instruments. Thanks to recent progress in wavefront control and phase stabilization, as well as in data reduction techniques, the LBTI demonstrated in February 2015 a calibrated null accuracy of 0.05% over a three-hour long observing sequence on the bright nearby A3V star beta Leo. This is equivalent to an exozodiacal dis...

  9. Optimized Herschel/PACS photometer observing and data reduction strategies for moving solar system targets

    CERN Document Server

    Cs., Kiss; E., Vilenius; A., Pál; P., Santos-Sanz; E., Lellouch; G., Marton; E., Verebélyi; N., Szalai; P., Hartogh; J., Stansberry; F., Henry; A, Delsanti

    2013-01-01

    The "TNOs are Cool!: A survey of the trans-Neptunian region" is a Herschel Open Time Key Program that aims to characterize planetary bodies at the outskirts of the Solar System using PACS and SPIRE data, mostly taken as scan-maps. In this paper we summarize our PACS data reduction scheme that uses a modified version of the standard pipeline for basic data reduction, optimized for faint, moving targets. Due to the low flux density of our targets the observations are confusion noise limited or at least often affected by bright nearby background sources at 100 and 160\\,$\\mu$m. To overcome these problems we developed techniques to characterize and eliminate the background at the positions of our targets and a background matching technique to compensate for pointing errors. We derive a variety of maps as science data products that are used depending on the source flux and background levels and the scientific purpose. Our techniques are also applicable to a wealth of other Herschel solar system photometric observat...

  10. Online data reduction with FPGA-based track reconstruction for the Belle II DEPFET pixel detector

    Energy Technology Data Exchange (ETDEWEB)

    Schnell, Michael; Deschamps, Bruno; Dingfelder, Jochen; Marinas, Carlos [University of Bonn (Germany); Collaboration: Belle II-Collaboration

    2015-07-01

    The innermost two layers of the Belle II vertex detector at the KEK facility in Tsukuba, Japan, will be covered by high-granularity DEPFET pixel sensors (PXD). The large number of pixels leads to a maximum data rate of 256 Gbps, which has to be significantly reduced by the Data Acquisition System. For the data reduction, the hit information of the surrounding Silicon strip Vertex Detector (SVD) is utilized to define so-called Regions of Interest (ROI); only hit information from pixels located inside these ROIs is saved. The ROIs for the PXD are computed by reconstructing track segments from SVD data and extrapolating them to the PXD. The goal is to achieve a data reduction of up to a factor of 10 with this ROI selection. All the necessary processing stages (receiving, decoding and multiplexing of the SVD data on 48 optical fibers, track reconstruction, and definition of the ROIs) will be performed by the presented system. The planned hardware design is based on a distributed set of Advanced Mezzanine Cards (AMC), each equipped with a Field Programmable Gate Array (FPGA) and 4 optical transceivers. In this talk, the hardware and the FPGA-based tracking algorithm are introduced, together with recent performance results from simulation and the latest test beam campaigns.

  11. Online Approach for Spatio-Temporal Trajectory Data Reduction for Portable Devices

    Institute of Scientific and Technical Information of China (English)

    Heemin Park; Young-Jun Lee; Jinseok Chae; Wonik Choi

    2013-01-01

    As location data are widely available to portable devices, trajectory tracking of moving objects has become an essential technology for most location-based services. To maintain such streaming data of location updates from mobile clients, conventional approaches such as time-based regular location updating and distance-based location updating have been used. However, these methods suffer from the large amount of data, redundant location updates, and large trajectory estimation errors due to the varying speed of moving objects. In this paper, we propose a simple but efficient online trajectory data reduction method for portable devices. To solve the problems of redundancy and large estimation errors, the proposed algorithm computes trajectory errors and finds a recent location update that should be sent to the server to satisfy the user requirements. We evaluate the proposed algorithm with real GPS trajectory data consisting of 17,201 trajectories. Extensive simulation results show that the proposed algorithm always meets the given user requirements and exhibits a data reduction ratio of greater than 87% when the acceptable trajectory error is greater than or equal to 10 meters.
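
    The core of such an online reduction scheme can be stated in a few lines: a new fix is transmitted only when the position predicted from the last reported fixes (constant-velocity dead reckoning) deviates from the true position by more than the acceptable trajectory error. The threshold and toy trajectory below are illustrative, not the algorithm of the cited paper.

        import math

        def reduce_trajectory(fixes, max_error_m):
            """fixes: list of (t, x, y) in seconds/metres; returns the fixes to transmit."""
            kept = [fixes[0]]
            for t, x, y in fixes[1:]:
                if len(kept) >= 2:
                    (t1, x1, y1), (t2, x2, y2) = kept[-2], kept[-1]
                    dt = (t - t2) / (t2 - t1)
                    px, py = x2 + (x2 - x1) * dt, y2 + (y2 - y1) * dt   # predicted position
                else:
                    px, py = kept[-1][1], kept[-1][2]
                if math.hypot(x - px, y - py) > max_error_m:
                    kept.append((t, x, y))
            return kept

        fixes = [(i, float(i), 0.0 if i < 5 else float(i - 5)) for i in range(10)]
        print(reduce_trajectory(fixes, max_error_m=2.0))   # 10 fixes reduced to 3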

  12. Energy-efficient data reduction techniques for wireless seizure detection systems.

    Science.gov (United States)

    Chiang, Joyce; Ward, Rabab K

    2014-01-24

    The emergence of wireless sensor networks (WSNs) has motivated a paradigm shift in patient monitoring and disease control. Epilepsy management is one of the areas that could especially benefit from the use of WSN. By using miniaturized wireless electroencephalogram (EEG) sensors, it is possible to perform ambulatory EEG recording and real-time seizure detection outside clinical settings. One major consideration in using such a wireless EEG-based system is the stringent battery energy constraint at the sensor side. Different solutions to reduce the power consumption at this side are therefore highly desired. The conventional approach incurs a high power consumption, as it transmits the entire EEG signals wirelessly to an external data server (where seizure detection is carried out). This paper examines the use of data reduction techniques for reducing the amount of data that has to be transmitted and, thereby, reducing the required power consumption at the sensor side. Two data reduction approaches are examined: compressive sensing-based EEG compression and low-complexity feature extraction. Their performance is evaluated in terms of seizure detection effectiveness and power consumption. Experimental results show that by performing low-complexity feature extraction at the sensor side and transmitting only the features that are pertinent to seizure detection to the server, a considerable overall saving in power is achieved. The battery life of the system is increased by 14 times, while the same seizure detection rate as the conventional approach (95%) is maintained.
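
    A sensor-side feature-extraction stage can be computationally very cheap. The sketch below summarizes each epoch of multichannel EEG by two per-channel features (line length and energy, chosen here only as generic examples of low-complexity features, not the specific features of the cited study) and compares the transmitted data volume with sending the raw samples.

        import numpy as np

        FS = 256             # sampling rate in Hz (illustrative)
        EPOCH = 2 * FS       # samples per 2-second epoch

        def extract_features(epoch):
            """epoch: array (n_channels, EPOCH) -> array (n_channels, 2) of features."""
            line_length = np.abs(np.diff(epoch, axis=1)).sum(axis=1)
            energy = (epoch ** 2).sum(axis=1)
            return np.column_stack([line_length, energy])

        rng = np.random.default_rng(1)
        raw = rng.normal(size=(18, EPOCH))           # one epoch of 18-channel EEG
        features = extract_features(raw)

        raw_bytes = raw.astype(np.int16).nbytes      # what the conventional approach sends
        feat_bytes = features.astype(np.float32).nbytes
        print(f"transmit {feat_bytes} bytes instead of {raw_bytes} "
              f"({raw_bytes / feat_bytes:.0f}x reduction per epoch)")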

  13. Improving Prediction Accuracy for WSN Data Reduction by Applying Multivariate Spatio-Temporal Correlation

    Directory of Open Access Journals (Sweden)

    José Neuman de Souza

    2011-10-01

    This paper proposes a method based on multivariate spatial and temporal correlation to improve prediction accuracy in data reduction for Wireless Sensor Networks (WSN). Prediction of data not sent to the sink node is a technique used to save energy in WSNs by reducing the amount of data traffic. However, it may not be very accurate. Simulations involving simple linear regression and multiple linear regression functions were performed to assess the performance of the proposed method. The results show a higher correlation between gathered inputs when compared to time, which is an independent variable widely used for prediction and forecasting. Prediction accuracy is lower when simple linear regression is used, whereas multiple linear regression is the most accurate. In addition, our proposal outperforms some current solutions by about 50% in humidity prediction and 21% in light prediction. To the best of our knowledge, we are the first to address prediction based on multivariate correlation for WSN data reduction.
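
    The gist of the comparison can be reproduced with ordinary least squares: a reading is predicted either from time alone (simple regression) or from correlated readings of other sensors (multivariate regression). The synthetic humidity, temperature and light series below are illustrative and do not reproduce the paper's data or exact gains.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 200
        t = np.arange(n, dtype=float)
        temperature = 20 + 5 * np.sin(t / 30) + rng.normal(0, 0.3, n)
        light = 400 + 100 * np.sin(t / 30 + 0.5) + rng.normal(0, 10, n)
        humidity = 80 - 1.5 * temperature - 0.02 * light + rng.normal(0, 0.5, n)

        def fit_predict(X, y):
            X1 = np.column_stack([np.ones(len(y)), X])       # add intercept column
            coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
            return X1 @ coef

        pred_time = fit_predict(t[:, None], humidity)                              # time only
        pred_multi = fit_predict(np.column_stack([temperature, light]), humidity)  # multivariate

        rmse = lambda p: np.sqrt(np.mean((humidity - p) ** 2))
        print(f"RMSE time-only: {rmse(pred_time):.2f}, multivariate: {rmse(pred_multi):.2f}")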

  14. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods, such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary, for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
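
    The matching-pursuit stage of such a decomposition is easy to sketch: at each iteration the Gabor atom most correlated with the residual is selected and removed. The dictionary grid, signal and number of iterations below are invented, and the preceding PCA stage of the PCA-Gabor method is omitted.

        import numpy as np

        N = 256
        t = np.arange(N)

        def gabor(center, width, freq):
            atom = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
            return atom / np.linalg.norm(atom)

        # a modest dictionary over a grid of centers, widths and frequencies
        dictionary = [gabor(c, w, f) for c in range(0, N, 16)
                      for w in (4, 8, 16) for f in (0.02, 0.05, 0.1)]

        signal = 2.0 * gabor(96, 8, 0.05) + 0.5 * np.random.default_rng(3).normal(size=N)

        residual = signal.copy()
        for step in range(3):                       # three matching-pursuit iterations
            coeffs = [atom @ residual for atom in dictionary]
            best = int(np.argmax(np.abs(coeffs)))
            residual = residual - coeffs[best] * dictionary[best]
            print(f"step {step}: atom {best}, coefficient {coeffs[best]:+.2f}, "
                  f"residual energy {residual @ residual:.1f}")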

  15. SPAM: A data reduction recipe for high-resolution, low-frequency radio-interferometric observations

    CERN Document Server

    Intema, H T

    2014-01-01

    High-resolution astronomical imaging at sub-GHz radio frequencies has been available for more than 15 years, with the VLA at 74 and 330 MHz, and the GMRT at 150, 240, 330 and 610 MHz. Recent developments include wide-bandwidth upgrades for VLA and GMRT, and commissioning of the aperture-array-based, multi-beam telescope LOFAR. A common feature of these telescopes is the necessity to deconvolve the very many detectable sources within their wide fields-of-view and beyond. This is complicated by gain variations in the radio signal path that depend on viewing direction. One such example is phase errors due to the ionosphere. Here I discuss the inner workings of SPAM, a set of AIPS-based data reduction scripts in Python that includes direction-dependent calibration and imaging. Since its first version in 2008, SPAM has been applied to many GMRT data sets at various frequencies. Many valuable lessons were learned, and translated into various SPAM software modifications. Nowadays, semi-automated SPAM data reduction ...

  16. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Science.gov (United States)

    Aviyente, Selin; Bernat, Edward M.; Malone, Stephen M.; Iacono, William G.

    2010-12-01

    Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.

  17. S2O - A software tool for integrating research data from general purpose statistic software into electronic data capture systems.

    Science.gov (United States)

    Bruland, Philipp; Dugas, Martin

    2017-01-07

    Data capture for clinical registries or pilot studies is often performed in spreadsheet-based applications like Microsoft Excel or IBM SPSS. Usually, data are transferred into statistics software, such as SAS, R or IBM SPSS Statistics, for analysis afterwards. Spreadsheet-based solutions suffer from several drawbacks: it is generally not possible to ensure sufficient rights and role management, and it is not traced who changed which data, when and why. Therefore, such systems are not able to comply with regulatory requirements for electronic data capture in clinical trials. In contrast, Electronic Data Capture (EDC) software enables reliable, secure and auditable collection of data. In this regard, most EDC vendors support the CDISC ODM standard to define, communicate and archive clinical trial meta- and patient data. Advantages of EDC systems are support for multi-user and multicenter clinical trials as well as auditable data. Migration from spreadsheet-based data collection to EDC systems is currently labor-intensive and time-consuming. Hence, the objectives of this work are to develop a mapping model, implement a converter between the IBM SPSS format and the CDISC ODM standard, and evaluate this approach regarding syntactic and semantic correctness. A mapping model between IBM SPSS and CDISC ODM data structures was developed. SPSS variables and patient values can be mapped and converted into ODM. Statistical and display attributes from SPSS do not correspond to any ODM elements; study-related ODM elements are not available in SPSS. The S2O converter was implemented as a command-line tool using the SPSS-internal Java plugin. Syntactic and semantic correctness was validated with different ODM tools and by reverse transformation from ODM into the SPSS format. Clinical data values were also successfully transformed into the ODM structure. Transformation between the IBM SPSS spreadsheet format and the ODM standard for definition and exchange of trial data is feasible.
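
    The mapping idea can be illustrated with a heavily simplified sketch: each column of a tabular export becomes an ItemDef, and each cell becomes an ItemData element. Real ODM requires the full Study/MetaDataVersion/ClinicalData hierarchy, OIDs and schema validation; the element usage below is abbreviated and purely illustrative, and is not the S2O implementation.

        import xml.etree.ElementTree as ET

        table = {"columns": [("AGE", "integer"), ("SEX", "text")],
                 "rows": [{"AGE": 54, "SEX": "F"}, {"AGE": 61, "SEX": "M"}]}

        odm = ET.Element("ODM")
        meta = ET.SubElement(odm, "MetaDataVersion", OID="MDV.1")
        for name, dtype in table["columns"]:
            ET.SubElement(meta, "ItemDef", OID=f"I.{name}", Name=name, DataType=dtype)

        clinical = ET.SubElement(odm, "ClinicalData")
        for i, row in enumerate(table["rows"], start=1):
            subject = ET.SubElement(clinical, "SubjectData", SubjectKey=str(i))
            group = ET.SubElement(subject, "ItemGroupData", ItemGroupOID="IG.MAIN")
            for name, _ in table["columns"]:
                ET.SubElement(group, "ItemData", ItemOID=f"I.{name}", Value=str(row[name]))

        print(ET.tostring(odm, encoding="unicode"))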

  18. Can we import quality tools? a feasibility study of European practice assessment in a country with less organised general practice

    Directory of Open Access Journals (Sweden)

    Pestiaux Dominique

    2009-10-01

    Background: Quality is on the agenda of European general practice (GP). European researchers have, in collaboration, developed tools to assess the quality of GPs. In this feasibility study, we tested the European Practice Assessment (EPA) in a one-off project in Belgium, where general practice has a low level of organisation. Methods: A framework for feasibility analysis included describing the recruitment of participants, a brief telephone survey among non-responders, and organisational and logistic problems. Using field notes and focus groups, we studied the participants' opinions. Results: In this study, only 36 of 1000 invited practices agreed to participate. Co-ordination, administrative work, practice visits and organisational problems required several days per practice. The researchers further encountered technical problems, for instance when entering the data and uploading them to the web-based server. In subsequent qualitative analysis using two focus groups, most participating GPs expressed a positive feeling after the EPA procedure. In the short follow-up period, only a few GPs reported improvements after the visit. The participating GPs suggested that follow-up and coaching would probably facilitate the implementation of changes. Conclusion: This feasibility study shows that prior interest in EPA is low in the GP community. We encountered a number of logistic and organisational problems. The procedure proved attractive to participants, but it could be augmented by coaching of participants in more than a one-off project to identify and achieve targets for quality improvement. In the absence of commitment from the government, a network of universities and one scientific organisation will offer EPA as a service to training practices.

  19. The translators’ workstation for 2015: the example of the CAT tools of the European Commission’s Directorate General for Translation

    Directory of Open Access Journals (Sweden)

    Anna Walicka

    2016-03-01

    The aim of this article is to provide an answer to the question about the current state of advancement of computer-assisted translation tools. We assume that several decades of research in the field carried out by the EU institutions in the context of the European integration process have provided the most advanced computer-assisted translation tools available in the biggest translation service in the world, i.e., the Directorate General for Translation of the European Commission. The present work therefore focuses on the following three main types of CAT tools employed by the EU translators: translation memory tools, terminology management tools and machine translation tools. The same types of tools, offered by the EU providers, i.e. SDL and SYSTRAN, are also used by translators working outside the EU structures. We can therefore presume that the EU translation services set work standards which are then accepted by all professional translators. For that reason, in order to define the most probable directions of future development of these tools, this article also reports the current research conducted by the EU in the CAT tools field.

  20. Data reduction strategy of the Effelsberg-Bonn HI Survey (EBHIS)

    CERN Document Server

    Winkel, B; Kalberla, P M W

    2009-01-01

    Since autumn 2008, a new L-band 7-Feed-Array receiver has been used for an HI 21-cm line survey performed with the 100-m Effelsberg telescope. The survey will cover the whole northern hemisphere, comprising both the Galactic and extragalactic sky in parallel. Using state-of-the-art FPGA-based digital Fast Fourier Transform spectrometers, superior in dynamic range and temporal resolution, allows sophisticated radio frequency interference (RFI) mitigation schemes to be applied to the survey data. The EBHIS data reduction software includes RFI mitigation, gain-curve correction, intensity calibration, stray-radiation correction, gridding, and source detection. We discuss the severe degradation of radio astronomical HI data by RFI signals and the gain in scientific yield when applying modern RFI mitigation schemes. For this aim, simulations of the galaxy distribution within the local volume (z<0.07) with and without RFI degradation were performed. These simulations allow us to investigate potential biases and selectio...
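
    As a toy illustration of one RFI mitigation ingredient (not the EBHIS implementation), the sketch below flags spectral channels that deviate from a running median by more than a few robust standard deviations. Window length and threshold are invented.

        import numpy as np

        def flag_rfi(spectrum, window=51, k=5.0):
            pad = window // 2
            padded = np.pad(spectrum, pad, mode="edge")
            baseline = np.array([np.median(padded[i:i + window])
                                 for i in range(spectrum.size)])
            resid = spectrum - baseline
            sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))   # MAD -> sigma
            return np.abs(resid) > k * sigma

        rng = np.random.default_rng(7)
        spec = rng.normal(0.0, 1.0, 1024)
        spec[400:403] += 50.0                    # narrow-band interference spike
        print("flagged channels:", np.where(flag_rfi(spec))[0])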

  1. Gemini Planet Imager Observational Calibrations XIV: Polarimetric Contrasts and New Data Reduction Techniques

    CERN Document Server

    Millar-Blanchaer, Maxwell A; Hung, Li-Wei; Fitzgerald, Michael P; Wang, Jason J; Chilcote, Jeffrey; Graham, James R; Bruzzone, Sebastian; Kalas, Paul G

    2016-01-01

    The Gemini Planet Imager (GPI) has been designed for the direct detection and characterization of exoplanets and circumstellar disks. GPI is equipped with a dual channel polarimetry mode designed to take advantage of the inherently polarized light scattered off circumstellar material to further suppress the residual seeing halo left uncorrected by the adaptive optics. We explore how recent advances in data reduction techniques reduce systematics and improve the achievable contrast in polarimetry mode. In particular, we consider different flux extraction techniques when constructing datacubes from raw data, division by a polarized flat-field and a method for subtracting instrumental polarization. Using observations of unpolarized standard stars we find that GPI's instrumental polarization is consistent with being wavelength independent within our errors. In addition, we provide polarimetry contrast curves that demonstrate typical performance throughout the GPIES campaign.

  2. The IRCAL Polarimeter: Design, Calibration, and Data Reduction for an Adaptive Optics Imaging Polarimeter

    CERN Document Server

    Perrin, Marshall D; Lloyd, James P

    2008-01-01

    We have upgraded IRCAL, the near-infrared science camera of the Lick Observatory adaptive optics system, to add a dual-channel imaging polarimetry mode. This mode uses an optically contacted YLF (LiYF_4) Wollaston prism to provide simultaneous images in perpendicular linear polarizations, providing high resolution, high dynamic range polarimetry in the near infrared. We describe the design and construction of the polarimeter, discuss in detail the data reduction algorithms adopted, and evaluate the instrument's on-the-sky performance. The IRCAL polarimeter is capable of reducing the stellar PSF halo by about two orders of magnitude, thereby increasing contrast for studies of faint circumstellar dust-scattered light. We discuss the various factors that limit the achieved contrast, and present lessons applicable to future high contrast imaging polarimeters.

  3. Alternative Data Reduction Procedures for UVES: Wavelength Calibration and Spectrum Addition

    CERN Document Server

    Thompson, Rodger I; Black, John H; Martins, C J A P

    2008-01-01

    This paper addresses alternative procedures to the ESO supplied pipeline procedures for the reduction of UVES spectra of two quasar spectra to determine the value of the fundamental constant mu = Mp/Me at early times in the universe. The procedures utilize intermediate product images and spectra produced by the pipeline with alternative wavelength calibration and spectrum addition methods. Spectroscopic studies that require extreme wavelength precision need customized wavelength calibration procedures beyond that usually supplied by the standard data reduction pipelines. An example of such studies is the measurement of the values of the fundamental constants at early times in the universe. This article describes a wavelength calibration procedure for the UV-Visual Echelle Spectrometer on the Very Large Telescope, however, it can be extended to other spectrometers as well. The procedure described here provides relative wavelength precision of better than 3E-7 for the long-slit Thorium-Argon calibration lamp ex...

  4. CalFUSE v3: A Data-Reduction Pipeline for the Far Ultraviolet Spectroscopic Explorer

    CERN Document Server

    Dixon, W V; Barrett, P E; Civeit, T; Dupuis, J; Fullerton, A W; Godard, B; Hsu, J C; Kaiser, M E; Kruk, J W; Lacour, S; Lindler, D J; Massa, D; Robinson, R D; Romelfanger, M L; Sonnentrucker, P

    2007-01-01

    Since its launch in 1999, the Far Ultraviolet Spectroscopic Explorer (FUSE) has made over 4600 observations of some 2500 individual targets. The data are reduced by the Principal Investigator team at the Johns Hopkins University and archived at the Multimission Archive at Space Telescope (MAST). The data-reduction software package, called CalFUSE, has evolved considerably over the lifetime of the mission. The entire FUSE data set has recently been reprocessed with CalFUSE v3.2, the latest version of this software. This paper describes CalFUSE v3.2, the instrument calibrations upon which it is based, and the format of the resulting calibrated data files.

  5. Data reduction for time-of-flight small-angle neutron scattering with virtual neutrons

    Science.gov (United States)

    Du, Rong; Tian, Haolai; Zuo, Taisen; Tang, Ming; Yan, Lili; Zhang, Junrong

    2017-09-01

    Small-angle neutron scattering (SANS) is an experimental technique to detect material structures in the nanometer to micrometer range. The solution of the structural model constructed from SANS strongly depends on the accuracy of the reduced data. The time-of-flight (TOF) SANS data are dependent on the wavelength of the pulsed neutron source. Therefore, data reduction must be handled very carefully to transform measured neutron events into neutron scattering intensity. In this study, reduction algorithms for TOF SANS data are developed and optimized using simulated data from a virtual neutron experiment. Each possible effect on the measured data is studied systematically, and suitable corrections are performed to obtain high-quality data. This work will facilitate scientific research and the instrument design at China Spallation Neutron Source.
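
    Two of the core reduction steps can be written down compactly: converting each event's time of flight to wavelength, and converting wavelength plus detector position to momentum transfer q. Flight path and pixel geometry in the sketch are illustrative, not the CSNS instrument values.

        import numpy as np

        H_OVER_MN = 3.956e-7      # h / m_neutron in m^2/s

        def wavelength_angstrom(tof_s, flight_path_m):
            """lambda = (h / m_n) * t / L, returned in angstrom."""
            return H_OVER_MN * tof_s / flight_path_m * 1e10

        def q_inv_angstrom(wavelength_A, radial_dist_m, sample_det_m):
            """q = 4*pi*sin(theta)/lambda, where 2*theta is the scattering angle."""
            two_theta = np.arctan2(radial_dist_m, sample_det_m)
            return 4.0 * np.pi * np.sin(0.5 * two_theta) / wavelength_A

        # one detected event: 25 ms time of flight over a 15 m flight path, hitting
        # the detector 0.2 m from the beam centre, 5 m behind the sample
        lam = wavelength_angstrom(25e-3, 15.0)
        print(f"lambda = {lam:.2f} A, q = {q_inv_angstrom(lam, 0.2, 5.0):.4f} 1/A")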

  6. Parallel Landscape Driven Data Reduction & Spatial Interpolation Algorithm for Big LiDAR Data

    Directory of Open Access Journals (Sweden)

    Rahil Sharma

    2016-06-01

    Airborne Light Detection and Ranging (LiDAR) topographic data provide highly accurate digital terrain information, which is used widely in applications like creating flood insurance rate maps, forest and tree studies, coastal change mapping, soil and landscape classification, 3D urban modeling, river bank management, agricultural crop studies, etc. In this paper, we focus mainly on the use of LiDAR data in terrain modeling/Digital Elevation Model (DEM) generation. Technological advancements in building LiDAR sensors have enabled highly accurate and highly dense LiDAR point clouds, which have made possible high-resolution modeling of terrain surfaces. However, high-density data result in massive data volumes, which pose computing issues. The computational time required for dissemination, processing and storage of these data is directly proportional to the volume of the data. We describe a novel technique based on the slope map of the terrain, which addresses the challenging problem, in the area of spatial data analysis, of reducing this dense LiDAR data without sacrificing its accuracy. To the best of our knowledge, this is the first landscape-driven data reduction algorithm. We also perform an empirical study, which shows that there is no significant loss in accuracy for the DEM generated from a 52% reduced LiDAR dataset produced by our algorithm, compared to the DEM generated from the original, complete LiDAR dataset. For the accuracy of our statistical analysis, we compute the Root Mean Square Error (RMSE) over all of the grid points of the original DEM and the DEM generated from the reduced data, instead of comparing a few random control points. In addition, our multi-core data reduction algorithm is highly scalable. We also describe a modified parallel Inverse Distance Weighted (IDW) spatial interpolation method and show that the DEMs it generates are time-efficient and have better accuracy than the ones generated by the traditional IDW method.
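
    A minimal (unmodified, serial) inverse-distance-weighted interpolation looks as follows; it is included only to make the IDW step concrete, with toy points and a default power of 2, and contains neither the slope-driven reduction nor the parallel modifications of the paper.

        import numpy as np

        def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
            """xy_known: (n, 2) points, z_known: (n,) elevations, xy_query: (m, 2) grid nodes."""
            d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, eps) ** power       # weights fall off with distance
            w /= w.sum(axis=1, keepdims=True)
            return w @ z_known

        # toy LiDAR ground points and a 3x3 grid of DEM nodes
        pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        elev = np.array([100.0, 102.0, 101.0, 105.0])
        gx, gy = np.meshgrid(np.linspace(0, 10, 3), np.linspace(0, 10, 3))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        print(idw(pts, elev, grid).reshape(3, 3))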

  7. Simpler methods do it better: Success of Recurrence Quantification Analysis as a general purpose data analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Charles L., E-mail: cwebber@lumc.ed [Department of Cell and Molecular Physiology, Loyola University Medical Center, Maywood, IL (United States); Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research (PIK), 14412 Potsdam (Germany); Facchini, Angelo, E-mail: a.facchini@unisi.i [Center for the Study of Complex Systems and Department of Information Engineering, University of Siena, 53100 Siena (Italy); Giuliani, Alessandro, E-mail: alessandro.giuliani@iss.i [Environment and Health Department, Istituto Superiore di Sanita, Roma (Italy)

    2009-10-05

    Over the last decade, Recurrence Quantification Analysis (RQA) has become a new standard tool in the toolbox of nonlinear methodologies. In this Letter we trace the history and utility of this powerful tool and cite some common applications. RQA continues to wend its way into numerous and diverse fields of study.

  8. Automating U-Pb IDTIMS data reduction and reporting: Cyberinfrastructure meets geochronology

    Science.gov (United States)

    Bowring, J. F.; McLean, N.; Walker, J. D.; Ash, J. M.

    2009-12-01

    We demonstrate the efficacy of an interdisciplinary effort between software engineers and geochemists to produce working cyberinfrastructure for geochronology. This collaboration between CIRDLES, EARTHTIME and EarthChem has produced the software programs Tripoli and U-Pb_Redux as the cyber-backbone for the ID-TIMS community. This initiative incorporates shared isotopic tracers, data-reduction algorithms and the archiving and retrieval of data and results. The resulting system facilitates detailed inter-laboratory comparison and a new generation of cooperative science. The resolving power of geochronological data in the earth sciences is dependent on the precision and accuracy of many isotopic measurements and corrections. Recent advances in U-Pb geochronology have reinvigorated its application to problems such as precise timescale calibration, processes of crustal evolution, and early solar system dynamics. This project provides a heretofore missing common data reduction protocol, thus promoting the interpretation of precise geochronology and enabling inter-laboratory comparison. U-Pb_Redux is an open-source software program that provides end-to-end support for the analysis of uranium-lead geochronological data. The system reduces raw mass spectrometer data to U-Pb dates, allows users to interpret ages from these data, and then provides for the seamless federation of the results, coming from many labs, into a community web-accessible database using standard and open techniques. This EarthChem GeoChron database depends also on keyed references to the SESAR sample database. U-Pb_Redux currently provides interactive concordia and weighted mean plots and uncertainty contribution visualizations; it produces publication-quality concordia and weighted mean plots and customizable data tables. This initiative has achieved the goal of standardizing the data elements of a complete reduction and analysis of uranium-lead data, which are expressed using extensible markup

  9. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    Science.gov (United States)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Connolly, A. J.; Kaiser, N.; Kirby, Evan N.; Lemaux, Brian C.; Lin, Lihwai; Lotz, Jennifer M.; Luppino, G. A.; Marinoni, C.; Matthews, Daniel J.; Metevier, Anne; Schiavon, Ricardo P.

    2013-09-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg^2 divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations. Sky subtraction is

  10. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jeffrey A. [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Cooper, Michael C. [Center for Galaxy Evolution, Department of Physics and Astronomy, University of California, Irvine, 4129 Frederick Reines Hall, Irvine, CA 92697 (United States); Davis, Marc [Department of Astronomy and Physics, University of California, 601 Campbell Hall, Berkeley, CA 94720 (United States); Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson [UCO/Lick Observatory, University of California, 1156 High Street, Santa Cruz, CA 95064 (United States); Coil, Alison L. [Department of Physics, University of California, San Diego, La Jolla, CA 92093 (United States); Dutton, Aaron A. [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Finkbeiner, Douglas P. [Harvard-Smithsonian Center for Astrophysics, Harvard University, 60 Garden St., Cambridge, MA 02138 (United States); Gerke, Brian F. [Lawrence Berkeley National Laboratory, 1 Cyclotron Rd., MS 90R4000, Berkeley, CA 94720 (United States); Rosario, David J. [Max-Planck-Institut fuer Extraterrestrische Physik, Giessenbachstrasse, D-85748 Garching (Germany); Weiner, Benjamin J.; Willmer, C. N. A. [Steward Observatory, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721-0065 (United States); Yan Renbin [Department of Physics and Astronomy, University of Kentucky, 505 Rose Street, Lexington, KY 40506-0055 (United States); Kassin, Susan A. [Astrophysics Science Division, Goddard Space Flight Center, Code 665, Greenbelt, MD 20771 (United States); Konidaris, N. P., E-mail: janewman@pitt.edu, E-mail: djm70@pitt.edu, E-mail: m.cooper@uci.edu, E-mail: mdavis@berkeley.edu, E-mail: faber@ucolick.org, E-mail: koo@ucolick.org, E-mail: raja@ucolick.org, E-mail: phillips@ucolick.org [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); and others

    2013-09-15

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ~ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z ~ 1 via ~90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg^2 divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ~2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ~ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R ~ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or

  11. The DEEP2 Galaxy Redshift Survey: Design, Observations, Data Reduction, and Redshifts

    Science.gov (United States)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Coil, Alison L; Guhathakurta, Puraga; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Wilmer, C. N. A.; Yan, Renbin; Harker, Justin J.; Kassin, Susan A.; Konidaris, N. P.; Lai, Kamson; Madgwick, Darren S.; Noeske, K. G.; Wirth, Gregory D.; Kirby, Evan N.; Lotz, Jennifer M.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z approx. 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = -20 at z approx. 1 via approx. 90 nights of observation on the Keck telescope. The survey covers an area of 2.8 sq. deg divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted approx. 2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z approx. 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm^-1 grating used for the survey delivers high spectral resolution (R approx. 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed

  12. Classification of traumatic brain injury severity using informed data reduction in a series of binary classifier algorithms.

    Science.gov (United States)

    Prichep, Leslie S; Jacquin, Arnaud; Filipenko, Julie; Dastidar, Samanwoy Ghosh; Zabele, Stephen; Vodencarević, Asmir; Rothman, Neil S

    2012-11-01

    Assessment of medical disorders is often aided by objective diagnostic tests which can lead to early intervention and appropriate treatment. In the case of brain dysfunction caused by head injury, there is an urgent need for quantitative evaluation methods to aid in acute triage of those subjects who have sustained traumatic brain injury (TBI). Current clinical tools to detect mild TBI (mTBI/concussion) are limited to subjective reports of symptoms and short neurocognitive batteries, which offer little objective evidence for clinical decisions, or to computed tomography (CT) scans, which carry radiation risk and are most often negative in mTBI. This paper describes a novel methodology for the development of algorithms to provide multi-class classification in a substantial population of brain-injured subjects, across a broad age range and representative subpopulations. The method is based on age-regressed quantitative features (linear and nonlinear) extracted from brain electrical activity recorded from a limited montage of scalp electrodes. These features are used as input to a unique "informed data reduction" method, maximizing confidence of prospective validation and minimizing over-fitting. A training set for supervised learning was used, including "normal control," "concussed," and "structural injury/CT positive (CT+)" subjects. The classifier function separating CT+ from the other groups demonstrated a sensitivity of 96% and specificity of 78%; the classifier separating "normal controls" from the other groups demonstrated a sensitivity of 81% and specificity of 74%, suggesting high utility of such classifiers in acute clinical settings. The use of a sequence of classifiers, where the desired risk can be stratified, further supports clinical utility.
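
    The cascade idea (multi-class triage built from a sequence of binary classifiers) can be sketched generically: first separate the "CT-positive" class from everything else, then separate "concussed" from "normal control" among the remainder. Features, class balance and the logistic-regression models below are synthetic stand-ins, not the classifiers of the cited work.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(6)
        n = 300
        severity = rng.integers(0, 3, n)            # 0 normal, 1 concussed, 2 CT+
        features = rng.normal(size=(n, 5)) + severity[:, None] * 0.8   # synthetic qEEG features

        clf_ct = LogisticRegression().fit(features, severity == 2)     # CT+ vs rest
        clf_mtbi = LogisticRegression().fit(features[severity < 2],
                                            severity[severity < 2] == 1)

        def classify(x):
            x = x.reshape(1, -1)
            if clf_ct.predict(x)[0]:
                return "structural injury (CT+)"
            return "concussed" if clf_mtbi.predict(x)[0] else "normal control"

        print(classify(features[0]), "| true severity:", severity[0])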

  13. Comparing the Efficacy of an Engineered-Based System (College Livetext) with an Off-the-Shelf General Tool (Hyperstudio) for Developing Electronic Portfolios in Teacher Education

    Science.gov (United States)

    Johnson-Leslie, Natalie A.

    2009-01-01

    In teacher education, electronic portfolios provide an authentic form of assessment documenting students' personal and professional growth. Using the engineered-based system, College LiveText, and an off-the-shelf general tool, HyperStudio, pre-service teachers constructed e-portfolios as part of their teacher preparation requirements. This case…

  14. The PRIsm MUlti-object Survey (PRIMUS). II. Data Reduction and Redshift Fitting

    CERN Document Server

    Cool, Richard J; Blanton, Michael R; Burles, Scott M; Coil, Alison L; Eisenstein, Daniel J; Wong, Kenneth C; Zhu, Guangtun; Aird, James; Bernstein, Rebecca A; Bolton, Adam S; Hogg, David W; Mendez, Alexander J

    2013-01-01

    The PRIsm MUlti-object Survey (PRIMUS) is a spectroscopic galaxy redshift survey to z~1 completed with a low-dispersion prism and slitmasks allowing for simultaneous observations of ~2,500 objects over 0.18 square degrees. The final PRIMUS catalog includes ~130,000 robust redshifts over 9.1 sq. deg. In this paper, we summarize the PRIMUS observational strategy and present the data reduction details used to measure redshifts, redshift precision, and survey completeness. The survey motivation, observational techniques, fields, target selection, slitmask design, and observations are presented in Coil et al. (2010). Comparisons to existing higher-resolution spectroscopic measurements show a typical precision of sigma_z/(1+z)=0.005. PRIMUS, both in area and number of redshifts, is the largest faint galaxy redshift survey completed to date and is allowing for precise measurements of the relationship between AGNs and their hosts, the effects of environment on galaxy evolution, and the build up of galactic systems over t...

  15. Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques

    Directory of Open Access Journals (Sweden)

    Santos Andres

    2007-10-01

    Background: Dynamic positron emission tomography studies produce a large amount of image data, from which clinically useful parametric information can be extracted using tracer kinetic methods. Data reduction methods can facilitate the initial interpretation and visual analysis of these large image sequences and at the same time can preserve important information and allow for basic feature characterization. Methods: We have applied principal component analysis to provide high-contrast parametric image sets of lower dimensions than the original data set, separating structures based on their kinetic characteristics. Our method has the potential to constitute an alternative quantification method, independent of any kinetic model, and is particularly useful when the retrieval of the arterial input function is complicated. In independent component analysis images, structures that have different kinetic characteristics are assigned opposite values and are readily discriminated. Furthermore, novel similarity mapping techniques are proposed, which can summarize in a single image the temporal properties of the entire image sequence according to a reference region. Results: Using our new cubed sum coefficient similarity measure, we have shown that structures with similar time-activity curves can be identified, thus facilitating the detection of lesions that are not easily discriminated using the conventional method employing standardized uptake values.
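
    Similarity mapping itself is compact to sketch: every voxel's time-activity curve is compared with the mean curve of a reference region, and the similarity values form a single parametric image. The sketch below uses plain correlation as the measure and a synthetic dynamic sequence; the cited work defines its own cubed sum coefficient.

        import numpy as np

        rng = np.random.default_rng(4)
        frames, nx, ny = 20, 32, 32
        img = rng.normal(0, 0.1, size=(frames, nx, ny))
        lesion_tac = np.linspace(0, 1, frames)                 # steadily accumulating tracer
        img[:, 10:15, 10:15] += lesion_tac[:, None, None]      # embed a "lesion"

        ref = img[:, 10:15, 10:15].reshape(frames, -1).mean(axis=1)   # reference-region curve

        def similarity_map(dynamic, reference):
            v = dynamic.reshape(frames, -1)
            v = v - v.mean(axis=0)
            r = reference - reference.mean()
            num = (v * r[:, None]).sum(axis=0)
            den = np.sqrt((v ** 2).sum(axis=0) * (r ** 2).sum()) + 1e-12
            return (num / den).reshape(nx, ny)

        smap = similarity_map(img, ref)
        print("similarity inside lesion:", smap[12, 12].round(2),
              "outside:", smap[2, 2].round(2))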

  16. Integral Field Spectroscopy of a sample of nearby galaxies. I. Sample, Observations and Data Reduction

    CERN Document Server

    Marmol-Queralto, E; Marino, R A; Mast, D; Viironen, K; de Paz, A Gil; Iglesias-Paramo, J; Rosales-Ortega, F F; Vilchez, J M

    2011-01-01

    Aims: Integral Field Spectroscopy (IFS) is a powerful approach for the study of nearby galaxies since it enables a detailed analysis of their resolved physical properties. Here we present the sample of nearby galaxies selected to exploit the two-dimensional information provided by the IFS. Methods: We observed a sample of 48 galaxies from the Local Universe with the PPAK Integral Field Spectroscopy unit (IFU) of the PMAS spectrograph, mounted on the 3.5m telescope at Calar Alto Observatory (Almeria, Spain). Two different setups were used during these studies (a low-resolution mode, V300, and a medium-resolution mode, V600), covering a spectral range of around 3700-7000 Å. We developed a fully automatic pipeline for the data reduction, which includes an analysis of the quality of the final data products. We applied a decoupling method to obtain the ionised gas and stellar content of these galaxies, and to derive their main physical properties. To assess the accuracy in the measurements of the different parameters, ...

  17. A Flexible and Modular Data Reduction Library for Fiber-fed Echelle Spectrographs

    CERN Document Server

    Sosnowska, Danuta; Figueira, Pedro; Modigliani, Andrea; Di Marcantonio, Paolo; Megevand, Denis; Pepe, Francesco

    2015-01-01

    Within the ESPRESSO project a new flexible data reduction library is being built. ESPRESSO, the Echelle SPectrograph for Rocky Exoplanets and Stable Spectral Observations, is a fiber-fed, high-resolution, cross-dispersed echelle spectrograph. One of its main scientific goals is to search for terrestrial exoplanets using the radial velocity technique. A dedicated pipeline is being developed. It is designed to be able to reduce data from several similar spectrographs: not only ESPRESSO, but also HARPS, HARPS-N and possibly others. Instrument specifics are configurable through an input static configuration table. The first recipes written have already been tested on real HARPS and HARPS-N data and on simulated ESPRESSO data. The final scientific products of the pipeline will be the extracted 1-dim and 2-dim spectra. Using these products, the radial velocity of the observed object can be computed with high accuracy. The library is developed within the standard ESO pipeline environment. It is being written in ANSI C and ma...

  18. The Data Reduction Pipeline for the SDSS-IV MaNGA IFU Galaxy Survey

    CERN Document Server

    Law, David R; Yan, Renbin; Andrews, Brett H; Bershady, Matthew A; Bizyaev, Dmitry; Blanc, Guillermo A; Blanton, Michael R; Bolton, Adam S; Brownstein, Joel R; Bundy, Kevin; Chen, Yanmei; Drory, Niv; D'Souza, Richard; Fu, Hai; Jones, Amy; Kauffmann, Guinevere; MacDonald, Nicholas; Masters, Karen L; Newman, Jeffrey A; Parejko, John K; Sánchez-Gallego, José R; Sánchez, Sebastian F; Schlegel, David J; Thomas, Daniel; Wake, David A; Weijmans, Anne-Marie; Westfall, Kyle B; Zhang, Kai

    2016-01-01

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) is an optical fiber-bundle integral-field unit (IFU) spectroscopic survey that is one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV). With a spectral coverage of 3622-10,354 Angstroms and an average footprint of ~500 arcsec^2 per IFU, the scientific data products derived from MaNGA will permit exploration of the internal structure of a statistically large sample of 10,000 low-redshift galaxies in unprecedented detail. Comprising 174 individually pluggable science and calibration IFUs with a near-constant data stream, MaNGA is expected to obtain ~100 million raw-frame spectra and ~10 million reduced galaxy spectra over the six-year lifetime of the survey. In this contribution, we describe the MaNGA Data Reduction Pipeline (DRP) algorithms and centralized metadata framework that produce sky-subtracted, spectrophotometrically calibrated spectra and rectified 3-D data cubes that combine individual dithered observa...

  19. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); Prieto, Carlos Allende; Mészáros, Szabolcs [Instituto de Astrofísica de Canarias, Via Láctea s/n, E-38205 La Laguna, Tenerife (Spain); Beland, Stephane [Laboratory for Atmospheric and Space Sciences, University of Colorado at Boulder, Boulder, CO (United States); Bender, Chad; Desphande, Rohit [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, sunspot, NM 88349-0059 (United States); Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Fleming, Scott W. [Computer Sciences Corporation, 3700 San Martin Dr, Baltimore, MD 21218 (United States); Muna, Demitri [Department of Astronomy and the Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Nguyen, Duy [Department of Astronomy and Astrophysics, University of Toronto, Toronto, Ontario, M5S 3H4 (Canada); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Shetrone, Matthew, E-mail: dnidever@umich.edu [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States)

    2015-12-15

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s^-1) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  20. Using multivariate data reduction to predict postsurgery memory decline in patients with mesial temporal lobe epilepsy.

    Science.gov (United States)

    St-Laurent, Marie; McCormick, Cornelia; Cohn, Mélanie; Mišić, Bratislav; Giannoylis, Irene; McAndrews, Mary Pat

    2014-02-01

    Predicting postsurgery memory decline is crucial to clinical decision-making for individuals with mesial temporal lobe epilepsy (mTLE) who are candidates for temporal lobe excisions. Extensive neuropsychological testing is critical to assess risk, but the numerous test scores it produces can make deriving a formal prediction of cognitive change quite complex. In order to benefit from the information contained in comprehensive memory assessment, we used principal component analysis (PCA) to simplify neuropsychological test scores (presurgical and pre- to postsurgical change) obtained from a cohort of 56 patients with mTLE into a few easily interpretable latent components. We next performed discriminant analyses using presurgery latent components to categorize seizure laterality and then regression analyses to assess how well presurgery latent components could predict postsurgery memory decline. Finally, we validated the predictive power of these regression models in an independent sample of 18 patients with mTLE. Principal component analysis identified three significant latent components that reflected IQ, verbal memory, and visuospatial memory, respectively. Together, the presurgery verbal and visuospatial memory components classified 80% of patients with mTLE correctly according to their seizure laterality. Furthermore, the presurgery verbal memory component predicted postsurgery verbal memory decline, while the presurgery visuospatial memory component predicted visuospatial memory decline. These regression models also predicted postsurgery memory decline successfully in the independent cohort of patients with mTLE. Our results demonstrate the value of data reduction techniques in identifying cognitive metrics that can characterize laterality of damage and risk of postoperative decline.
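
    The following minimal sketch illustrates the kind of PCA-based data reduction plus regression described above, using scikit-learn; the matrix sizes, random data and variable names are hypothetical placeholders, not the authors' dataset or code.

      # PCA data reduction of test scores followed by regression on latent components
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      scores_pre = rng.normal(size=(56, 12))     # 56 patients x 12 presurgical test scores
      memory_change = rng.normal(size=56)        # pre- to postsurgical memory change metric

      pca = PCA(n_components=3)                  # three latent components (e.g. IQ, verbal, visuospatial)
      latent_pre = pca.fit_transform(scores_pre)

      model = LinearRegression().fit(latent_pre, memory_change)
      print("explained variance ratios:", pca.explained_variance_ratio_)
      print("R^2 of decline prediction:", model.score(latent_pre, memory_change))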

  1. South Galactic Cap u-band Sky Survey (SCUSS): Data Reduction

    CERN Document Server

    Zou, Hu; Zhou, Xu; Wu, Zhenyu; Ma, Jun; Fan, Xiaohui; Fan, Zhou; He, Boliang; Jing, Yipeng; Lesser, Michael; Li, Cheng; Nie, Jundan; Shen, Shiyin; Wang, Jiali; Zhang, Tianmeng; Zhou, Zhimin

    2015-01-01

    The South Galactic Cap u-band Sky Survey (SCUSS) is a deep u-band imaging survey in the Southern Galactic Cap, using the 90Prime wide-field imager on the 2.3m Bok telescope at Kitt Peak. The survey observations started in 2010 and ended in 2013. The final survey area is about 5000 deg^2, with a median 5-sigma point-source limiting magnitude of about 23.2. This paper describes the survey data reduction process, which includes basic image processing, astrometric and photometric calibrations, image stacking, and photometric measurements. Survey photometry is performed on objects detected both on SCUSS u-band images and in the SDSS database. Automatic, aperture, point-spread function (PSF), and model magnitudes are measured on stacked images. Co-added aperture, PSF, and model magnitudes are derived from measurements on single-epoch images. We also present comparisons of the SCUSS photometric catalog with those of the SDSS and CFHTLS.

  2. The Data Reduction Pipeline for the SDSS-IV MaNGA IFU Galaxy Survey

    Science.gov (United States)

    Law, David R.; Cherinka, Brian; Yan, Renbin; Andrews, Brett H.; Bershady, Matthew A.; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael R.; Bolton, Adam S.; Brownstein, Joel R.; Bundy, Kevin; Chen, Yanmei; Drory, Niv; D'Souza, Richard; Fu, Hai; Jones, Amy; Kauffmann, Guinevere; MacDonald, Nicholas; Masters, Karen L.; Newman, Jeffrey A.; Parejko, John K.; Sánchez-Gallego, José R.; Sánchez, Sebastian F.; Schlegel, David J.; Thomas, Daniel; Wake, David A.; Weijmans, Anne-Marie; Westfall, Kyle B.; Zhang, Kai

    2016-10-01

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) is an optical fiber-bundle integral-field unit (IFU) spectroscopic survey that is one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV). With a spectral coverage of 3622-10354 Å and an average footprint of ˜500 arcsec^2 per IFU, the scientific data products derived from MaNGA will permit exploration of the internal structure of a statistically large sample of 10,000 low-redshift galaxies in unprecedented detail. Comprising 174 individually pluggable science and calibration IFUs with a near-constant data stream, MaNGA is expected to obtain ˜100 million raw-frame spectra and ˜10 million reduced galaxy spectra over the six-year lifetime of the survey. In this contribution, we describe the MaNGA Data Reduction Pipeline algorithms and centralized metadata framework that produce sky-subtracted, spectrophotometrically calibrated spectra and rectified three-dimensional data cubes that combine individual dithered observations. For the 1390 galaxy data cubes released in Summer 2016 as part of SDSS-IV Data Release 13, we demonstrate that the MaNGA data have nearly Poisson-limited sky subtraction shortward of ˜8500 Å and reach a typical 10σ limiting continuum surface brightness μ = 23.5 AB arcsec^-2 in a five-arcsecond-diameter aperture in the g-band. The wavelength calibration of the MaNGA data is accurate to 5 km s^-1 rms, with a median spatial resolution of 2.54 arcsec FWHM (1.8 kpc at the median redshift of 0.037) and a median spectral resolution of σ = 72 km s^-1.

  3. Data Reduction Processes Using FPGA for MicroBooNE Liquid Argon Time Projection Chamber

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jinyuan

    2010-05-26

    MicroBooNE is a liquid argon time projection chamber to be built at Fermilab for an accelerator-based neutrino physics experiment and as part of the R&D strategy for a large liquid argon detector at DUSEL. The waveforms of the ~9000 sense wires in the chamber are continuously digitized at 2 Msamples/s, which results in a large volume of data coming off the TPC. We have developed a lossless data reduction scheme based on Huffman coding and have tested the scheme on cosmic ray data taken from a small liquid argon time projection chamber, the BO detector. For sense wire waveforms produced by cosmic ray tracks, the Huffman coding scheme compresses the data by a factor of approximately 10. The compressed data can be fully recovered back to the original data since the compression is lossless. In addition to accelerator neutrino data, which come with a small duty cycle in sync with the accelerator beam spill, continuously digitized waveforms are to be temporarily stored in the MicroBooNE data-acquisition system for about an hour, long enough for an external alert from possible supernova events. Another scheme, Dynamic Decimation, has been developed to further compress the potential supernova data so that the storage can be implemented within a reasonable budget. In the Dynamic Decimation scheme, data are sampled at the full sampling rate in the regions of interest (ROI) containing waveforms of track hits and are decimated down to a lower sampling rate outside the ROI. Note that unlike in typical zero-suppression schemes, in Dynamic Decimation the data in the pedestal region are not thrown away but kept at a lower sampling rate. An additional factor of 10 compression is achieved using the Dynamic Decimation scheme on the BO detector data, giving a total compression ratio of approximately 100 when the Dynamic Decimation and Huffman coding functional blocks are cascaded. Both blocks are compiled into a low-cost FPGA, and their silicon resource usage is low.
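
    A rough sketch of the Dynamic Decimation idea (full sampling inside regions of interest around hits, coarser sampling of the pedestal elsewhere) is given below; the threshold, padding and decimation factor are illustrative assumptions, not the experiment's FPGA implementation, and a lossless coder such as Huffman coding could be cascaded on the retained samples.

      # Dynamic decimation sketch: keep all samples near hits, subsample the pedestal
      import numpy as np

      def dynamic_decimate(waveform, pedestal, threshold, keep_every=10, pad=5):
          """Return (indices, samples) keeping all samples near excursions above
          threshold and every `keep_every`-th pedestal sample elsewhere."""
          above = np.abs(waveform - pedestal) > threshold
          roi = np.zeros_like(above)
          for i in np.flatnonzero(above):            # pad each hit region
              roi[max(0, i - pad):i + pad + 1] = True
          keep = roi | (np.arange(waveform.size) % keep_every == 0)
          idx = np.flatnonzero(keep)
          return idx, waveform[idx]

      wf = np.random.default_rng(1).normal(400, 2, 20000).astype(int)
      wf[5000:5040] += 80                            # a synthetic track hit
      idx, samples = dynamic_decimate(wf, pedestal=400, threshold=10)
      print("compression factor:", wf.size / samples.size)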

  4. Binary video codec for data reduction in wireless visual sensor networks

    Science.gov (United States)

    Khursheed, Khursheed; Ahmad, Naeem; Imran, Muhammad; O'Nils, Mattias

    2013-02-01

    of both the change coding and ROI coding becomes worse than that of image coding. This paper explores the compression efficiency of the Binary Video Codec (BVC) for data reduction in WVSN. We proposed to implement all three compression techniques, i.e. image coding, change coding and ROI coding, at the VSN and then select the smallest bit stream among the results of the three compression techniques. In this way the compression performance of the BVC never becomes worse than that of image coding. We concluded that the compression efficiency of BVC is always better than that of change coding and is always better than or equal to that of ROI coding and image coding.
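
    A minimal sketch of the selection step described above (run several coders on the same frame and keep the shortest bit stream) follows; zlib is used here only as a stand-in codec, just two of the three coding modes are shown, and all names are hypothetical.

      # Keep whichever candidate bit stream is shortest, so the result is never worse than image coding
      import zlib
      import numpy as np

      def image_code(frame, background):
          return zlib.compress(np.packbits(frame).tobytes())

      def change_code(frame, background):
          return zlib.compress(np.packbits(frame ^ background).tobytes())

      def encode_best(frame, background, encoders):
          # evaluate every coder and return the (name, bits) pair with the fewest bytes
          return min(((name, enc(frame, background)) for name, enc in encoders),
                     key=lambda c: len(c[1]))

      frame = np.zeros((64, 64), dtype=bool)
      frame[20:30, 20:40] = True                     # a synthetic foreground object
      background = np.zeros_like(frame)
      name, bits = encode_best(frame, background,
                               [("image", image_code), ("change", change_code)])
      print(name, len(bits), "bytes")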

  5. Semi-structured interview is a reliable and feasible tool for selection of doctors for general practice specialist training

    DEFF Research Database (Denmark)

    Isaksen, Jesper; Hertel, Niels Thomas; Kjær, Niels Kristian

    2013-01-01

    In order to optimise the selection process for admission to specialist training in family medicine, we developed a new design for structured applications and selection interviews. The design contains semi-structured interviews, which combine individualised elements from the applications with standardised behaviour-based questions. This paper describes the design of the tool, and offers reflections concerning its acceptability, reliability and feasibility.

  6. COED Transactions, Vol. X, No. 6, June 1978. Concentric-Tube Heat Exchanger Analysis and Data Reduction.

    Science.gov (United States)

    Marcovitz, Alan B., Ed.

    Four computer programs written in FORTRAN and BASIC develop theoretical predictions and data reduction for a junior-senior level heat exchanger experiment. Programs may be used at the terminal in the laboratory to check progress of the experiment or may be used in the batch mode for interpretation of final information for a formal report. Several…

  7. The Patient Participation Culture Tool for healthcare workers (PaCT-HCW) on general hospital wards: A development and psychometric validation study.

    Science.gov (United States)

    Malfait, S; Eeckloo, K; Van Daele, J; Van Hecke, A

    2016-09-01

    Patient participation is an important subject for modern healthcare. In order to improve patient participation on a ward, the ward's culture regarding patient participation should first be measured. In this study a measurement tool for patient participation culture from the healthcare worker's perspective, the Patient Participation Culture Tool for healthcare workers (PaCT-HCW), was developed and psychometrically evaluated. The aim of this study was to develop and validate a tool that measures the healthcare worker-related factors of patient participation and information sharing and dialogue in patient participation from the healthcare worker's perspective in order to represent the patient participation culture on general and university hospital wards. A four-phased validation study was conducted: (1) defining the construct of the PaCT-HCW, (2) development of the PaCT-HCW, (3) content validation, and (4) psychometric evaluation. The Belgian Federal Government invited all Flemish general and university hospitals by e-mail to distribute the PaCT-HCW in their organization. Fifteen general hospitals took part in the study. Units for surgery, general medicine, medical rehabilitation, geriatric and maternal care were included. Intensive care units, emergency room units, psychiatric units and units with no admitted patients (e.g. radiology) were excluded. The respondents had to be caregivers, with hands-on patient contact, who worked on the same ward for more than six months. Nursing students and other healthcare workers with short-term internships on the ward were excluded. The tool was completed by 1329 respondents on 163 wards. The PaCT-HCW was psychometrically evaluated by means of an exploratory factor analysis and calculation of internal consistency. A model containing eight components was developed through a literature review, individual interviews, and focus interviews. The developed model showed high sampling adequacy and the Bartlett's test of sphericity was

  8. The Potential of Web 2.0 Tools to Promote Reading Engagement in a General Education Course

    Science.gov (United States)

    Park, Seung Won

    2013-01-01

    General education classes involve extensive course readings. College instructors have a limited time to cover every detail of the materials students are supposed to learn in class; thus, they expect students to learn through course readings. However, many college students demonstrate a low level of engagement in course reading tasks. Existing…

  9. Preliminary validation of a consumer-oriented colorectal cancer risk assessment tool compatible with the US Surgeon General's My Family Health Portrait.

    Science.gov (United States)

    Feero, W Gregory; Facio, Flavia M; Glogowski, Emily A; Hampel, Heather L; Stopfer, Jill E; Eidem, Haley; Pizzino, Amy M; Barton, David K; Biesecker, Leslie G

    2015-09-01

    This study examines the analytic validity of a software tool designed to provide individuals with risk assessments for colorectal cancer based on personal health and family history information. The software is compatible with the US Surgeon General's My Family Health Portrait (MFHP). An algorithm for risk assessment was created using accepted colorectal risk assessment guidelines and programmed into a software tool (MFHP). Risk assessments derived from 150 pedigrees using the MFHP tool were compared with "gold standard" risk assessments developed by three expert cancer genetic counselors. Genetic counselor risk assessments showed substantial, but not perfect, agreement. MFHP risk assessments for colorectal cancer yielded a sensitivity for colorectal cancer risk of 81% (95% confidence interval: 54-96%) and specificity of 90% (95% confidence interval: 83-94%), as compared with genetic counselor pedigree review. The positive predictive value for risk for MFHP was 48% (95% confidence interval: 29-68%), whereas the negative predictive value was 98% (95% confidence interval: 93-99%). Agreement between MFHP and genetic counselor pedigree review was moderate (κ = 0.54). The analytic validity of the MFHP colorectal cancer risk assessment software is similar to that of other types of screening tools used in primary care. Future investigations should explore the clinical validity and utility of the software in diverse population groups. Genet Med 17(9), 753-756.

  10. Problems and prospects of using volleyball as a tool for improving the physical condition of general education school pupils

    Directory of Open Access Journals (Sweden)

    Prozar N.V.

    2010-05-01

    Full Text Available The problems and prospects of using volleyball to address educational tasks and to improve pupils' physical condition are considered. The physical education process of pupils in grades 4-5 of a general education school is examined. Attention is drawn to the low motor density of most physical education lessons. The need to introduce a new technology for developing pupils' volleyball skills is substantiated. The technology takes into account alternatives to traditional approaches and is aimed at improving pupils' physical condition in the process of physical education.

  11. Experimental charge-density studies: data reduction and model quality: the more the better?

    Science.gov (United States)

    Herbst-Irmer, Regine; Stalke, Dietmar

    2017-08-01

    In this review, recent developments concerning data and model quality in experimental charge-density investigations are described from a personal viewpoint. Data quality is not achieved solely by high resolution, high I/σ(I) values, low merging R values and high multiplicity. The quality of the innermost reflections especially is crucial for mapping the density distribution of the outermost valence electrons and can be monitored by the asymptotic value of I/σ, (I/σ)asymptotic. New detector technologies seem to be promising improvements. Empirical corrections to correct for low-energy contamination of mirror-focused X-ray data and for resolution- and temperature-dependent errors caused by factors such as thermal diffuse scattering are described. Shashlik-like residual density patterns can indicate the need for an anharmonic description of the thermal motion of individual atoms. The physical reliability of the derived model must be thoroughly analysed. The derived probability density functions for the mean-squared atomic vibrational displacements especially should have only small negative values. The treatment of H atoms has been improved by methods to estimate anisotropic thermal motion. For very high resolution data, the polarization of the core density cannot be neglected. Several tools to detect systematic errors are described. A validation tool is presented that easily detects whether the refinement of additional parameters yields a real improvement in the model or simply overfits the given data. In all investigated structures, it is shown that the multipole parameters of atoms with a comparable chemical environment should be constrained to be identical. The use of restraints could be a promising alternative.

  12. PyWiFeS: A Rapid Data Reduction Pipeline for the Wide Field Spectrograph (WiFeS)

    CERN Document Server

    Childress, Michael J; Nielsen, Jon; Sharp, Robert G

    2013-01-01

    We present PyWiFeS, a new Python-based data reduction pipeline for the Wide Field Spectrograph (WiFeS). PyWiFeS consists of a series of core data processing routines built on standard scientific Python packages commonly used in astronomical applications. Included in PyWiFeS is an implementation of a new global optical model of the spectrograph which provides wavelength solutions accurate to ~0.05 Å (RMS) across the entire detector. The core PyWiFeS package is designed to be scriptable to enable batch processing of large quantities of data, and we present a default format for the handling of observation metadata and the scripting of data reduction.

  13. A tool to evaluate patients' experiences of nursing care in Australian general practice: development of the Patient Enablement and Satisfaction Survey.

    Science.gov (United States)

    Desborough, Jane; Banfield, Michelle; Parker, Rhian

    2014-01-01

    Australian health policy initiatives have increasingly supported the employment of nurses in general practice. An understanding of the impact of nursing care on patients in this setting is integral to assuring quality, safety and a patient-centred focus. The aim was to develop a survey to evaluate the satisfaction and enablement of patients who receive nursing care in Australian general practices. The survey was to be simple to administer and analyse, ensuring practicality for use by general practice nurses, doctors and managers. Two validated instruments formed the basis of the Patient Enablement and Satisfaction Survey (PESS). This survey was refined and validated for the Australian setting using focus groups and in-depth interviews with patients, and feedback from general practice nurses. Test-retest and alternate form methods were used to establish the survey's reliability. Feedback resulted in 14 amendments to the original draft survey. Questions that demonstrated a strong positive correlation for the test-retest and alternate form measures were included in the final survey. The PESS is a useful, practical tool for the evaluation of nursing care in Australian general practice, its validity and reliability established through a patient-centred research approach, reflective of the needs of patients accessing nursing services in this setting.

  14. Report on the ESO/OPTICON "Instrumentation School on Use and Data Reduction of X-shooter and KMOS"

    Science.gov (United States)

    Ballester, P.; Dennefeld, M.

    2016-09-01

    The NEON Archive Schools have since 1999 provided opportunities for young researchers to gain practical experience of the reduction and analysis of archive data. Twenty-four participants from 17 nationalities gathered to learn about the end-to-end cycle of observation proposal, data reduction and archive usage for X-shooter and KMOS. A brief description of the school is presented and the content of the main sessions is described.

  15. Interferometric vs Spectral IASI Radiances: Effective Data-Reduction Approaches for the Satellite Sounding of Atmospheric Thermodynamical Parameters

    Directory of Open Access Journals (Sweden)

    Giuseppe Grieco

    2010-09-01

    Full Text Available Two data-reduction approaches for the Infrared Atmospheric Sounder Interferometer satellite instrument are discussed and compared. The approaches are intended for the purpose of devising and implementing fast near-real-time retrievals of atmospheric thermodynamical parameters. One approach is based on the usual selection of sparse channels or portions of the spectrum. This approach may preserve the spectral resolution, but at the expense of the spectral coverage. The second approach considers a suitable truncation of the interferogram (the Fourier transform of the spectrum) at points below the nominal maximum optical path difference. This second approach is consistent with the Shannon-Whittaker sampling theorem and preserves the full spectral coverage, but at the expense of the spectral resolution. While the first data reduction acts within the spectral domain, the second can be performed within the interferogram domain and without any specific need to go back to the spectral domain for the purpose of retrieval. To assess the impact of these two different data-reduction strategies on the retrieval of atmospheric parameters, we have used a statistical retrieval algorithm for skin temperature, temperature, water vapour and ozone profiles. The use of this retrieval algorithm is mostly intended for illustrative purposes, and the user could choose a different inverse strategy. In fact, the interferogram-based data-reduction strategy is generic and independent of any inverse algorithm. It will also be shown that this strategy yields a subset of interferometric radiances which are less sensitive to potential interfering effects, such as those possibly introduced by the day-night cycle (e.g., the solar component) and spectroscopic effects induced by solar energy and unknown trace-gas variability.
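
    The interferogram-domain reduction can be sketched in a few lines of numpy: transform a spectrum to its interferogram, keep only the points below a reduced maximum optical path difference, and, if needed, transform back to a lower-resolution spectrum. The array size and truncation fraction below are illustrative only, not IASI values.

      # Interferogram truncation as a data reduction step (illustrative numbers)
      import numpy as np

      n = 8192
      spectrum = np.random.default_rng(2).random(n)     # stand-in for a measured spectrum
      interferogram = np.fft.irfft(spectrum)            # spectrum -> interferogram

      keep = interferogram.size // 4                    # truncate at 1/4 of the nominal max OPD
      truncated = interferogram[:keep]                  # the reduced data set actually retained

      # the corresponding lower-resolution spectrum, if one ever needs to go back
      lowres_spectrum = np.fft.rfft(truncated).real
      print(interferogram.size, "->", truncated.size, "points retained")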

  16. Final report for "Development of generalized mapping tools to improve implementation of data driven computer simulations" (LDRD 04-ERD-083)

    Energy Technology Data Exchange (ETDEWEB)

    Pasyanos, M; Ramirez, A; Franz, G

    2005-02-04

    Probabilistic inverse techniques, like the Markov Chain Monte Carlo (MCMC) algorithm, have had recent success in combining disparate data types into a consistent model. The Stochastic Engine (SE) initiative was a technique that developed this method and applied it to a number of earth science and national security applications. For instance, while the method was originally developed to solve ground flow problems (Aines et al.), it has also been applied to atmospheric modeling and engineering problems. The investigators of this proposal have applied the SE to regional-scale lithospheric earth models, which have applications to hazard analysis and nuclear explosion monitoring. While this broad applicability is appealing, tailoring the method for each application is inefficient and time-consuming. Stochastic methods invert data by probabilistically sampling the model space and comparing observations predicted by the proposed model to observed data and preferentially accepting models that produce a good fit, generating a posterior distribution. In other words, the method "inverts" for a model or, more precisely, a distribution of models, by a series of forward calculations. While powerful, the technique is often challenging to implement, as the mapping from model space to data needs to be "customized" for each data type. For example, all proposed models might need to be transformed through sensitivity kernels from 3-D models to 2-D models in one step in order to compute path integrals, and transformed in a completely different manner in the next step. We seek technical enhancements that widen the applicability of the Stochastic Engine by generalizing some aspects of the method (i.e. model-to-data transformation types, configuration, model representation). Initially, we wish to generalize the transformations that are necessary to match the observations to proposed models. These transformations are sufficiently general not to
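
    A minimal Metropolis-style sketch of the inversion-by-forward-calculation idea follows; the model, the forward (model-to-data) transformation and the observed data are all hypothetical placeholders, not the Stochastic Engine code.

      # Metropolis sampling: propose models, run the forward calculation, prefer good fits
      import numpy as np

      rng = np.random.default_rng(3)
      observed = np.array([1.9, 4.1, 6.0])

      def forward(model):                       # "customized" model-to-data transformation
          return model[0] * np.array([1.0, 2.0, 3.0]) + model[1]

      def log_likelihood(model, sigma=0.2):
          resid = observed - forward(model)
          return -0.5 * np.sum((resid / sigma) ** 2)

      model = np.array([0.0, 0.0])
      samples = []
      for _ in range(5000):
          proposal = model + rng.normal(scale=0.1, size=2)
          if np.log(rng.random()) < log_likelihood(proposal) - log_likelihood(model):
              model = proposal                  # accept models that fit the data better
          samples.append(model.copy())
      print("posterior mean:", np.mean(samples[1000:], axis=0))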

  17. Data reduction framework for standard atomic weights and isotopic compositions of the elements

    Science.gov (United States)

    Meija, Juris; Possolo, Antonio

    2017-04-01

    We outline a general framework to compute consensus reference values of standard atomic weights, isotope ratios, and isotopic abundances, and to evaluate associated uncertainties using modern statistical methods for consensus building that can handle mutually inconsistent measurement results. The multivariate meta-regression approach presented here is directly relevant to the work of the IUPAC Commission on Isotopic Abundances and Atomic Weights (CIAAW), and we illustrate the proposed method in meta-analyses of the isotopic abundances and atomic weights of zinc, platinum, antimony, and iridium.
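
    As a much-simplified univariate illustration of consensus building from mutually inconsistent results (the paper's actual method is a multivariate meta-regression), a DerSimonian-Laird random-effects estimate can be computed as below; the measurement values and uncertainties are invented.

      # DerSimonian-Laird random-effects consensus value and its uncertainty
      import numpy as np

      x = np.array([0.51720, 0.51735, 0.51702, 0.51741])   # hypothetical lab results
      u = np.array([0.00008, 0.00006, 0.00010, 0.00007])   # their standard uncertainties

      w = 1.0 / u**2
      xbar = np.sum(w * x) / np.sum(w)                     # fixed-effect weighted mean
      q = np.sum(w * (x - xbar) ** 2)                      # Cochran's Q (inconsistency)
      tau2 = max(0.0, (q - (len(x) - 1)) /
                 (np.sum(w) - np.sum(w**2) / np.sum(w)))   # between-lab variance estimate
      w_re = 1.0 / (u**2 + tau2)
      consensus = np.sum(w_re * x) / np.sum(w_re)
      u_consensus = np.sqrt(1.0 / np.sum(w_re))
      print(consensus, u_consensus)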

  18. On the interpretation of least squares collocation. [for geodetic data reduction

    Science.gov (United States)

    Tapley, B. D.

    1976-01-01

    A demonstration is given of the strict mathematical equivalence between the least squares collocation and the classical minimum variance estimates. It is shown that the least squares collocation algorithms are a special case of the modified minimum variance estimates. The computational efficiency of several forms of the general minimum variance estimation algorithm is discussed. It is pointed out that for certain geodetic applications the least squares collocation algorithm may provide a more efficient formulation of the results from the point of view of the computations required.
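
    For reference, the standard least-squares collocation estimate of a signal s from an observation vector l can be written (in the usual geodetic notation, which may differ from the paper's) as

      \hat{s} = C_{st} \, ( C_{tt} + C_{nn} )^{-1} \, l ,

    where C_{st} is the cross-covariance between the sought signal and the observed signal, C_{tt} is the signal covariance of the observations and C_{nn} is the noise covariance; reading C_{tt} + C_{nn} as the total data covariance makes the identity with the classical minimum variance estimator explicit.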

  19. Towards a responsive and interactive graphical user interface for neutron data reduction and visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Alok; Worlton, T.; Hammonds, J.; Loong, C.K. [Argonne National Laboratory, Argonne, IL (United States); Mikkelson, D.; Mikkelson, R. [Univ. of Wisconsin-Stout, Menomonie, WI (United States); Chen, D. [Neutron Scattering Laboratory, China Institute of Atomic Energy, Beijing (China)

    2001-03-01

    An Integrated Spectral Analysis Workbench, ISAW, has been developed at IPNS with the goal of providing a flexible and powerful tool to visualize and analyze neutron scattering time-of-flight data. The software, written in Java, is platform-independent, object-oriented and modular, making it easier to maintain and add features. The graphical user interface (GUI) for ISAW allows intuitive and interactive loading and manipulation of multiple spectra from different 'runs'. ISAW provides multiple displays of the spectra in a 'Runfile', and most of the functions can be performed through the GUI menu bar as well as through command scripts. All displays are simultaneously updated when the data are changed, using the Observable-observer object-model pattern. All displays are observers of the DataSet (observable) and respond to changes or selections in it simultaneously. A 'tree' display of the spectra in run files is provided for a detailed view of detector elements and easy selection of spectra. The operations menu is instrument-sensitive, so that it displays the appropriate set of operators. Automatic menu generation is made possible by the ability of the DataSet objects to furnish a list of operations contained in the particular DataSet selected at the time the menu bar is accessed. The transformed and corrected data can be saved to disk in different file formats for further analyses (e.g., GSAS for structure refinement). (author)
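
    The Observable-observer update pattern described above can be sketched language-neutrally as follows (ISAW itself is written in Java; the Python below and all class names are illustrative only).

      # Every display registers as an observer of the DataSet and is notified on change
      class DataSet:
          def __init__(self):
              self._observers = []        # e.g. image view, graph view, table view
              self.spectra = []

          def add_observer(self, obs):
              self._observers.append(obs)

          def notify(self, reason):
              for obs in self._observers: # every registered display updates simultaneously
                  obs.update(self, reason)

          def select_spectrum(self, index):
              self.selected = index
              self.notify("SELECTION_CHANGED")

      class GraphView:
          def update(self, dataset, reason):
              print("graph redrawn after", reason)

      ds = DataSet()
      ds.add_observer(GraphView())
      ds.select_spectrum(3)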

  20. GPs' knowledge, use, and confidence in national physical activity and health guidelines and tools: a questionnaire-based survey of general practice in England.

    Science.gov (United States)

    Chatterjee, Robin; Chapman, Tim; Brannan, Mike Gt; Varney, Justin

    2017-10-01

    Physical activity (PA) brief advice in health care is effective at getting individuals active. It has been suggested that one in four people would be more active if advised by a GP or nurse, but as many as 72% of GPs do not discuss the benefits of physical activity with patients. To assess the knowledge, use, and confidence in national PA and Chief Medical Officer (CMO) health guidelines and tools among GPs in England. Online questionnaire-based survey of self-selecting GPs in England that took place over a 10-day period in March 2016. The questionnaire consisted of six multiple-choice questions and was available on the Doctors.net.uk (DNUK) homepage. Quotas were used to ensure good regional representation. The final analysis included 1013 responses. Only 20% of responders were broadly or very familiar with the national PA guidelines. In all, 70% of GPs were aware of the General Practice Physical Activity Questionnaire (GPPAQ), but 26% were not familiar with any PA assessment tools, and 55% reported that they had not undertaken any training with respect to encouraging PA. The majority of GPs in England (80%) are unfamiliar with the national PA guidelines. Awareness of the recommended tool for assessment, GPPAQ, is higher than use by GPs. This may be because it is used by other clinical staff, for example, as part of the NHS Health Check programme. Although brief advice in isolation by GPs on PA will only be a part of the behaviour change journey, it is an important prompt, especially if repeated as part of routine practice. This study highlights the need for significant improvement in knowledge, skills, and confidence to maximise the potential for PA advice in GP consultations. © British Journal of General Practice 2017.

  1. Distribution of Risks for Major Osteoporotic Fracture Based on Fracture Risk Assessment Tool in Dr. Hasan Sadikin General Hospital, Bandung, Indonesia

    Directory of Open Access Journals (Sweden)

    Nik Fatin Farhana Binti Mohd Rahhim

    2015-09-01

    Full Text Available Background: Osteoporosis has become a growing public health problem in Indonesia. A definite estimate of osteoporosis prevalence in Indonesia is not available due to the limited access to dual-energy X-ray absorptiometry (DXA) facilities. In 2008, the World Health Organization developed the Fracture Risk Assessment Tool to identify fracture risk based on clinical risk factors. The study aimed to identify the risk factors of osteoporotic fracture using the Fracture Risk Assessment Tool in Dr. Hasan Sadikin General Hospital, Bandung, Indonesia. Methods: This descriptive study was conducted from June–December 2013 in the Orthopedic & Traumatology, Internal Medicine, Geriatric and Surgery polyclinics of Dr. Hasan Sadikin General Hospital, Bandung, on 77 respondents aged 40–90 years, selected by random sampling. Fracture risks were calculated online, and the data obtained were analyzed and presented as frequency distributions in tables. Results: Most of the respondents had a low risk for osteoporotic fracture, and only 5.19% of them had a moderate risk. The main risk factors were rheumatoid arthritis (57.14%), followed by current smoking (27.27%) and prolonged glucocorticoid use (25.98%). The moderate-risk group comprised females above 60 years old, of normal BMI or underweight, with risk factors of previous fracture, parental hip fracture, rheumatoid arthritis and prolonged glucocorticoid exposure. Conclusions: The majority of the respondents have a low risk for osteoporotic fracture. It must be taken into consideration that increasing age, rheumatoid arthritis, current smoking, prolonged glucocorticoid use, previous fracture and parental hip fracture can increase the risk.

  2. Cultural adaptation into Spanish of the generalized anxiety disorder-7 (GAD-7) scale as a screening tool

    Directory of Open Access Journals (Sweden)

    Pérez-Páramo María

    2010-01-01

    Full Text Available Abstract Background Generalized anxiety disorder (GAD) is a prevalent mental health condition which is underestimated worldwide. This study carried out the cultural adaptation into Spanish of the 7-item self-administered GAD-7 scale, which is used to identify probable patients with GAD. Methods The adaptation was performed by an expert panel using a conceptual equivalence process, including forward and backward translations in duplicate. Content validity was assessed by interrater agreement. Criterion validity was explored using ROC curve analysis, and the sensitivity, specificity, positive predictive value and negative predictive value for different cut-off values were determined. Concurrent validity was also explored using the HAM-A, HADS, and WHO-DAS-II scales. Results The study sample consisted of 212 subjects (106 patients with GAD) with a mean age of 50.38 years (SD = 16.76). Average completion time was 2'30''. No items of the scale were left blank. Floor and ceiling effects were negligible. No patients with GAD had to be assisted to fill in the questionnaire. The scale was shown to be one-dimensional through factor analysis (explained variance = 72%). A cut-off point of 10 showed adequate values of sensitivity (86.8%) and specificity (93.4%), with the AUC being statistically significant [AUC = 0.957-0.985; p < 0.001]. Limitations Elderly people, particularly the very old, may need some help to complete the scale. Conclusion After the cultural adaptation process, a Spanish version of the GAD-7 scale was obtained. The validity of its content and the relevance and adequacy of items in the Spanish cultural context were confirmed.
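
    The cut-off analysis reported above amounts to computing sensitivity and specificity of the total score at a threshold of 10; the short sketch below does this on synthetic scores (the score distributions are invented, not the study data).

      # Sensitivity and specificity of a screening score at a fixed cut-off
      import numpy as np

      rng = np.random.default_rng(5)
      gad = np.concatenate([rng.normal(14, 4, 106), rng.normal(5, 3, 106)]).round()
      truth = np.concatenate([np.ones(106, bool), np.zeros(106, bool)])  # GAD cases vs controls

      cutoff = 10
      positive = gad >= cutoff
      sensitivity = np.mean(positive[truth])        # fraction of cases flagged
      specificity = np.mean(~positive[~truth])      # fraction of controls not flagged
      print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")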

  3. Charge-coupled device imaging spectroscopy of Mars. I - Instrumentation and data reduction/analysis procedures

    Science.gov (United States)

    Bell, James F., III; Lucey, Paul G.; Mccord, Thomas B.

    1992-01-01

    This paper describes the collection, reduction, and analysis of 0.4-1.0-micron Mars imaging spectroscopy data obtained during the 1988 and 1990 oppositions from Mauna Kea Observatory and provides a general outline for the acquisition and analysis of similar imaging spectroscopy data sets. The U.H. 2.24-m Wide Field Grism CCD Spectrograph was used to collect 13 3D image cubes covering 90 percent of the planet south of 50 deg N in the 0.4-0.8 micron region and covering 55 percent of the planet south of 50 deg N in the 0.5-1.0 micron region. Spectra extracted from these image cubes reveal the detailed character of the Martian near-UV to visible spectrum. Images at red wavelengths reveal the 'classical' albedo markings at 100-500 km spatial resolution while images at blue wavelengths show little surface feature contrast and are dominated by condensate clouds/hazes and polar ice.

  4. Parameterized Radiative Convective Equilibrium Across a Range of Domains: A Unifying Tool for General Circulation Models and High Resolution Models

    Science.gov (United States)

    Silvers, L. G.; Stevens, B. B.; Mauritsen, T.; Marco, G. A.

    2015-12-01

    The characteristics of clouds in General Circulation Models (GCMs) need to be constrained in a manner consistent with theory, observations, and high-resolution models (HRMs). One way forward is to base improvements of parameterizations on high-resolution studies which resolve more of the important dynamical motions and allow for fewer parameterizations. This is difficult because of the numerous differences between GCMs and HRMs, both technical and theoretical. Century-long simulations at resolutions of 20-250 km on a global domain are typical of GCMs, while HRMs often simulate hours at resolutions of 0.1-5 km on domains the size of a single GCM grid cell. The recently developed model ICON provides a flexible framework which allows many of these difficulties to be overcome. This study uses the ICON model to compute SST perturbation simulations on multiple domains in a state of Radiative Convective Equilibrium (RCE) with parameterized convection. The domains used range from roughly the size of Texas to nearly half of Earth's surface area. All simulations use a doubly periodic domain with an effective distance between cell centers of 13 km and are integrated to a state of statistical stationarity. The primary analysis examines the mean characteristics of the cloud-related fields and the feedback parameter of the simulations. It is shown that the simulated atmosphere of a GCM in RCE is sufficiently similar across a range of domain sizes to justify the use of RCE to study both a GCM and a HRM on the same domain with the goal of improved constraints on the parameterized clouds. The simulated atmospheres are comparable to what could be expected at midday in a typical region of Earth's tropics under calm conditions. In particular, the differences between the domains are smaller than differences which result from choosing different physics schemes. Significant convective organization is present on all domain sizes with a relatively high subsidence fraction. Notwithstanding

  5. Research on cloud storage data reduction technology

    Institute of Scientific and Technical Information of China (English)

    胡新海

    2012-01-01

    Based on cloud storage technology applied in cloud computing, data storage becomes safer, more reliable and easier to manage. In the process of cloud data storage, not only the speed of data reads and writes but also the efficiency of data storage must be considered, in order to meet the current demand for storing massive amounts of information. Data reduction technology in cloud storage can reduce the amount of data to be stored, improve storage efficiency and satisfy rapidly growing storage requirements. Through a comparative study and analysis of several data reduction techniques, this paper discusses the storage efficiency achieved after data processing and the state of development of each technique, providing a useful reference for users selecting cloud storage data reduction technology.
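
    As one concrete example of a cloud-storage data reduction technique of the kind surveyed (content-addressed block deduplication), a toy sketch follows; the chunk size, hash choice and class names are arbitrary illustrations, not taken from the paper.

      # Content-addressed block deduplication: identical chunks are stored only once
      import hashlib

      class DedupStore:
          def __init__(self, chunk_size=4096):
              self.chunk_size = chunk_size
              self.blocks = {}                      # hash -> chunk bytes (stored once)

          def put(self, data):
              """Store data, returning the list of chunk hashes (the 'recipe')."""
              recipe = []
              for i in range(0, len(data), self.chunk_size):
                  chunk = data[i:i + self.chunk_size]
                  h = hashlib.sha256(chunk).hexdigest()
                  self.blocks.setdefault(h, chunk)  # duplicate chunks are not stored again
                  recipe.append(h)
              return recipe

          def get(self, recipe):
              return b"".join(self.blocks[h] for h in recipe)

      store = DedupStore()
      payload = b"A" * 100_000 + b"B" * 4096        # highly redundant data
      recipe = store.put(payload)
      stored = sum(len(c) for c in store.blocks.values())
      print("logical:", len(payload), "physical:", stored, "ratio:", len(payload) / stored)
      assert store.get(recipe) == payload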

  6. The Oncor Geodatabase for the Columbia Estuary Ecosystem Restoration Program: Handbook of Data Reduction Procedures, Workbooks, and Exchange Templates

    Energy Technology Data Exchange (ETDEWEB)

    Sather, Nichole K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Borde, Amy B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Diefenderfer, Heida L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Serkowski, John A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coleman, Andre M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Gary E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-12-01

    This Handbook of Data Reduction Procedures, Workbooks, and Exchange Templates is designed to support the Oncor geodatabase for the Columbia Estuary Ecosystem Restoration Program (CEERP). The following data categories are covered: water-surface elevation and temperature, sediment accretion rate, photo points, herbaceous wetland vegetation cover, tree plots and site summaries, fish catch and density, fish size, fish diet, fish prey, and Chinook salmon genetic stock identification. The handbook is intended for use by scientists collecting monitoring and research data for the CEERP. The ultimate goal of Oncor is to provide quality, easily accessible, geospatial data for synthesis and evaluation of the collective performance of CEERP ecosystem restoration actions at a program scale.

  7. Observations of the Hubble Deep Field with the Infrared Space Observatory .1. Data reduction, maps and sky coverage

    DEFF Research Database (Denmark)

    Serjeant, S.B.G.; Eaton, N.; Oliver, S.J.

    1997-01-01

    We present deep imaging at 6.7 and 15 mu m from the CAM instrument on the Infrared Space Observatory (ISO), centred on the Hubble Deep Field (HDF). These are the deepest integrations published to date at these wavelengths in any region of sky. We discuss the observational strategy and the data reduction. The observed source density appears to approach the CAM confusion limit at 15 mu m, and fluctuations in the 6.7-mu m sky background may be identifiable with similar spatial fluctuations in the HDF galaxy counts. ISO appears to be detecting comparable field galaxy populations to the HDF, and our...

  8. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    Science.gov (United States)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  10. A lossless data reduction technique for wireless EEG recorders and its use in selective data filtering for seizure monitoring.

    Science.gov (United States)

    Chengliang Dai; Bailey, Christopher

    2015-01-01

    This paper presents a time-domain-based lossless data reduction technique called Log2 Sub-band encoding, which is designed to reduce the size of data recorded on a wireless electroencephalogram (EEG) recorder. A data reduction unit can help to save power in the wireless transceiver and the storage medium, since it allows lower data transmission and read/write rates and thereby extends the lifetime of the battery on the device. Our compression ratio (CR) results show that Log2 Sub-band encoding is comparable and even superior to Huffman coding, a well-known entropy encoding method, whilst requiring minimal hardware resources, and it can also be used to extract features from the EEG to achieve seizure detection during the compression process. The power consumption when compressing the EEG data is presented to evaluate the system's overall improvement in power performance, and our results indicate that a noticeable power saving can be achieved with our technique. The possibility of applying this method to other biomedical signals is also noted.

  11. A novel data reduction technique for single slanted hot-wire measurements used to study incompressible compressor tip leakage flows

    Science.gov (United States)

    Berdanier, Reid A.; Key, Nicole L.

    2016-03-01

    The single slanted hot-wire technique has been used extensively as a method for measuring three velocity components in turbomachinery applications. The cross-flow orientation of probes with respect to the mean flow in rotating machinery results in detrimental prong interference effects when using multi-wire probes. As a result, the single slanted hot-wire technique is often preferred. Typical data reduction techniques solve a set of nonlinear equations determined by curve fits to calibration data. A new method is proposed which applies a look-up table approach to a simulated triple-wire sensor, with application to turbomachinery environments having subsonic, incompressible flows. Specific discussion regarding corrections for temperature and density changes present in a multistage compressor application is included, and additional consideration is given to the experimental error which accompanies each data reduction process. Hot-wire data collected from a three-stage research compressor with two rotor tip clearances are used to compare the look-up table technique with the traditional nonlinear equation method. The look-up table approach yields velocity errors of less than 5 % for test conditions deviating by more than 20 °C from calibration conditions (on par with the nonlinear solver method), while requiring less than 10 % of the computational processing time.
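
    The look-up-table idea can be sketched generically as follows: tabulate simulated sensor responses over a grid of candidate velocity vectors and return the grid velocity whose response lies closest to the measured one. The response model, grid ranges and resolutions below are placeholders, not the authors' calibration.

      # Generic look-up-table inversion of a (hypothetical) triple-wire response model
      import numpy as np

      def simulated_response(u, v, w):
          # placeholder effective-cooling-velocity relations for three wire orientations
          return np.stack([np.sqrt(u**2 + 0.04 * v**2 + w**2),
                           np.sqrt(u**2 + v**2 + 0.04 * w**2),
                           np.sqrt(0.04 * u**2 + v**2 + w**2)], axis=-1)

      grid = np.stack(np.meshgrid(np.linspace(10, 100, 91),
                                  np.linspace(-30, 30, 61),
                                  np.linspace(-30, 30, 61), indexing="ij"), axis=-1)
      table = simulated_response(grid[..., 0], grid[..., 1], grid[..., 2])

      def lookup(measured):
          # nearest tabulated response gives the recovered velocity vector
          err = np.linalg.norm(table - measured, axis=-1)
          return grid[np.unravel_index(np.argmin(err), err.shape)]

      print(lookup(simulated_response(55.0, 5.0, -3.0)))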

  12. General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure, and...

  13. Useful tool for general practitioners, home health care nurses and social workers in assessing determinants of the health status and treatment of patients visited in their homes

    Directory of Open Access Journals (Sweden)

    Andrzej Brodziak

    2012-09-01

    Full Text Available The paper emphasizes the need to distinguish between the traditional model of data acquisition, in which information is reported by the patient in the doctor's office, and the more valuable and desirable model of becoming acquainted with the core of the problem by going to the patient's home. In the latter model it is possible to observe various determinants of health during home visits. Family members can be approached, and the relationships between the patient and his or her loved ones can be evaluated. One can visually assess the patient's living conditions and foreseeable environmental hazards. For several years, this model has been put into practice by general practitioners and home health care nurses. Recently the model has also been promoted by "health care therapists" who are members of "teams of home health care". The authors, convinced of the merits of the "home and environmental model" of practical medicine, have developed a method of recording and illustrating the data collected during visits to the patient's home. The tool helps general practitioners, home health care nurses, social workers of primary health care centers and specialists communicate and exchange information. The method improves the formulation of plans for further therapeutic steps and for remedial interventions in the psycho-social relations and living conditions of patients.

  14. A Finger-Stick Whole-Blood HIV Self-Test as an HIV Screening Tool Adapted to the General Public.

    Science.gov (United States)

    Prazuck, Thierry; Karon, Stephen; Gubavu, Camelia; Andre, Jerome; Legall, Jean Marie; Bouvet, Elisabeth; Kreplak, Georges; Teglas, Jean Paul; Pialoux, Gilles

    2016-01-01

    In 2013, the French Health Authority approved the use of HIV self-tests in pharmacies for the general public. This screening tool will allow an increase in the number of screenings and a reduction in the delay between infection and diagnosis, thus reducing the risk of further infections. We previously compared 5 HIV self-test candidates (4 oral fluid and one whole blood) and demonstrated that the whole blood HIV test exhibited the optimal level of performance (sensitivity/specificity). We studied the practicability of an easy-to-use finger-stick whole blood HIV self-test "autotest VIH®", when used in the general public. This multicenter cross-sectional study involved 411 participants from the Parisian region (AIDES and HF association) between April and July 2014 and was divided into 2 separate studies: one evaluating the capability of participants to obtain an interpretable result using only the information notice, and a second evaluating the interpretation of test results, using a provided chart. A total of 411 consenting participants, 264 in the first study and 147 in the second, were included. All participants were over 18 years of age. In the first study, 99.2% of the 264 participants correctly administered the auto-test, and 21.2% needed, upon their request, telephone assistance. Ninety-two percent of participants responded that the test was easy/very easy to perform, and 93.5% did not find any difficulty obtaining a sufficient quantity of blood. In the second study, 98.1% of the 147 participants correctly interpreted the results. The reading/interpretation errors concerned the negative (2.1%) or the indeterminate (3.3%) auto-tests. The success rate of handling and interpretation of this self-test is very satisfactory, demonstrating its potential for use by the general public and its utility to increase the number of opportunities to detect HIV patients.

  15. A Finger-Stick Whole-Blood HIV Self-Test as an HIV Screening Tool Adapted to the General Public.

    Directory of Open Access Journals (Sweden)

    Thierry Prazuck

    Full Text Available In 2013, the French Health Authority approved the use of HIV self-tests in pharmacies for the general public. This screening tool will allow an increase in the number of screenings and a reduction in the delay between infection and diagnosis, thus reducing the risk of further infections. We previously compared 5 HIV self-test candidates (4 oral fluid and one whole blood) and demonstrated that the whole blood HIV test exhibited the optimal level of performance (sensitivity/specificity). We studied the practicability of an easy-to-use finger-stick whole blood HIV self-test "autotest VIH®", when used in the general public. This multicenter cross-sectional study involved 411 participants from the Parisian region (AIDES and HF association) between April and July 2014 and was divided into 2 separate studies: one evaluating the capability of participants to obtain an interpretable result using only the information notice, and a second evaluating the interpretation of test results, using a provided chart. A total of 411 consenting participants, 264 in the first study and 147 in the second, were included. All participants were over 18 years of age. In the first study, 99.2% of the 264 participants correctly administered the auto-test, and 21.2% needed, upon their request, telephone assistance. Ninety-two percent of participants responded that the test was easy/very easy to perform, and 93.5% did not find any difficulty obtaining a sufficient quantity of blood. In the second study, 98.1% of the 147 participants correctly interpreted the results. The reading/interpretation errors concerned the negative (2.1%) or the indeterminate (3.3%) auto-tests. The success rate of handling and interpretation of this self-test is very satisfactory, demonstrating its potential for use by the general public and its utility to increase the number of opportunities to detect HIV patients.

  16. Observations of the Hubble Deep Field with the Infrared Space Observatory; 1, Data reduction, maps and sky coverage

    CERN Document Server

    Serjeant, S

    1997-01-01

    We present deep imaging at 6.7 micron and 15 micron from the CAM instrument on the Infrared Space Observatory (ISO), centred on the Hubble Deep Field (HDF). These are the deepest integrations published to date at these wavelengths in any region of sky. We discuss the observation strategy and the data reduction. The observed source density appears to approach the CAM confusion limit at 15 micron, and fluctuations in the 6.7 micron sky background may be identifiable with similar spatial fluctuations in the HDF galaxy counts. ISO appears to be detecting comparable field galaxy populations to the HDF, and our data yields strong evidence that future IR missions (such as SIRTF, FIRST and WIRE) as well as SCUBA and millimetre arrays will easily detect field galaxies out to comparably high redshifts.

  17. Non-target time trend screening: a data reduction strategy for detecting emerging contaminants in biological samples.

    Science.gov (United States)

    Plassmann, Merle M; Tengstrand, Erik; Åberg, K Magnus; Benskin, Jonathan P

    2016-06-01

    Non-targeted mass spectrometry-based approaches for detecting novel xenobiotics in biological samples are hampered by the occurrence of naturally fluctuating endogenous substances, which are difficult to distinguish from environmental contaminants. Here, we investigate a data reduction strategy for datasets derived from a biological time series. The objective is to flag reoccurring peaks in the time series based on increasing peak intensities, thereby reducing peak lists to only those which may be associated with emerging bioaccumulative contaminants. As a result, compounds with increasing concentrations are flagged while compounds displaying random, decreasing, or steady-state time trends are removed. As an initial proof of concept, we created artificial time trends by fortifying human whole blood samples with isotopically labelled standards. Different scenarios were investigated: eight model compounds had a continuously increasing trend in the last two to nine time points, and four model compounds had a trend that reached steady state after an initial increase. Each time series was investigated at three fortification levels and in one unfortified series. Following extraction, analysis by ultra-performance liquid chromatography high-resolution mass spectrometry, and data processing, a total of 21,700 aligned peaks were obtained. Peaks displaying an increasing trend were filtered from randomly fluctuating peaks using time trend ratios and Spearman's rank correlation coefficients. The first approach was successful in flagging model compounds spiked at only two to three time points, while the latter approach resulted in all model compounds ranking in the top 11% of the peak lists. Compared to initial peak lists, a combination of both approaches reduced the size of datasets by 80-85%. Overall, non-target time trend screening represents a promising data reduction strategy for identifying emerging bioaccumulative contaminants in biological samples.
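
    The Spearman-based flagging step can be illustrated in a few lines: compute the rank correlation of each aligned peak's intensity with time and keep peaks whose correlation exceeds a positive threshold. The synthetic peak table and the 0.8 threshold below are illustrative assumptions, not the paper's settings.

      # Flag peaks whose intensities increase monotonically over the time series
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(4)
      n_peaks, n_times = 1000, 10
      intensities = rng.lognormal(mean=0.0, sigma=0.5, size=(n_peaks, n_times))
      intensities[0] *= np.linspace(1.0, 20.0, n_times)   # one synthetic increasing peak

      time = np.arange(n_times)
      flagged = []
      for i in range(n_peaks):
          rho, _ = spearmanr(time, intensities[i])
          if rho > 0.8:                                    # keep only clearly increasing trends
              flagged.append(i)
      print("flagged", len(flagged), "of", n_peaks, "peaks:", flagged[:10])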

  18. Process Design of Shaft Machining with General Machine Tool

    Institute of Scientific and Technical Information of China (English)

    王德鹏

    2015-01-01

    A shaft is usually used to transmit power and torque, or to support the transmission parts mounted on it and keep them in their relative positions. According to the different loading characteristics of the shaft, a stepped-shaft structure is generally adopted to meet the requirements. Machining a shaft on a general machine tool is low in efficiency, and the required accuracy is not easy to achieve. Through part analysis and process analysis of a stepped shaft, a reasonable machining process is designed; on the premise of ensuring machining quality, the paper discusses how to improve production efficiency and make rational use of the machine tool.

  19. Hierarchical Classes Analysis (HICLAS): A novel data reduction method to examine associations between biallelic SNPs and perceptual organization phenotypes in schizophrenia

    Directory of Open Access Journals (Sweden)

    Jamie Joseph

    2015-06-01

    The power of SNP association studies to detect valid relationships with clinical phenotypes in schizophrenia is largely limited by the number of SNPs selected and the non-specificity of phenotypes. To address this, we first assessed performance on two visual perceptual organization tasks designed to avoid many generalized deficit confounds, Kanizsa shape perception and contour integration, in a schizophrenia patient sample. Then, to reduce the total number of candidate SNPs analyzed in association with perceptual organization phenotypes, we employed a two-stage strategy: first, a priori SNPs from three candidate genes were selected (GAD1, NRG1 and DTNBP1); then a Hierarchical Classes Analysis (HICLAS) was performed to reduce the total number of SNPs, based on statistically related SNP clusters. HICLAS reduced the total number of candidate SNPs for subsequent phenotype association analyses from 6 to 3. MANCOVAs indicated that rs10503929 and rs1978340 were associated with the Kanizsa shape perception filling-in metric but not the global shape detection metric. rs10503929 was also associated with altered contour integration performance. SNPs not selected by the HICLAS model were unrelated to perceptual phenotype indices. While the contribution of candidate SNPs to perceptual impairments requires further clarification, this study reports the first application of HICLAS as a hypothesis-independent mathematical method for SNP data reduction. HICLAS may be useful for future larger-scale genotype-phenotype association studies.

  20. Youth Actuarial Risk Assessment Tool (Y-ARAT): The development of an actuarial risk assessment instrument for predicting general offense recidivism on the basis of police records.

    Science.gov (United States)

    van der Put, Claudia E

    2014-06-01

    Estimating the risk for recidivism is important for many areas of the criminal justice system. In the present study, the Youth Actuarial Risk Assessment Tool (Y-ARAT) was developed for juvenile offenders based solely on police records, with the aim to estimate the risk of general recidivism among large groups of juvenile offenders by police officers without clinical expertise. On the basis of the Y-ARAT, juvenile offenders are classified into five risk groups based on (combinations of) 10 variables including different types of incidents in which the juvenile was a suspect, total number of incidents in which the juvenile was a suspect, total number of other incidents, total number of incidents in which co-occupants at the youth's address were suspects, gender, and age at first incident. The Y-ARAT was developed on a sample of 2,501 juvenile offenders and validated on another sample of 2,499 juvenile offenders, showing moderate predictive accuracy (area under the receiver-operating-characteristic curve = .73), with little variation between the construction and validation sample. The predictive accuracy of the Y-ARAT was considered sufficient to justify its use as a screening instrument for the police.
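    As a small illustration of how the predictive accuracy quoted above is measured, the snippet below computes the area under the receiver-operating-characteristic curve from hypothetical risk groups and recidivism outcomes; the numbers are invented and are not the Y-ARAT validation data.

```python
from sklearn.metrics import roc_auc_score

# Hypothetical validation data: Y-ARAT risk group (1-5) per juvenile
# and whether the juvenile reoffended (1) or not (0).
risk_group = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]
reoffended = [0, 0, 0, 0, 1, 0, 1, 1, 0, 1]

# Area under the ROC curve: the discrimination metric used to validate the
# instrument (0.5 = chance level, 1.0 = perfect discrimination).
print(round(roc_auc_score(reoffended, risk_group), 2))
```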

  1. A Finger-Stick Whole-Blood HIV Self-Test as an HIV Screening Tool Adapted to the General Public

    Science.gov (United States)

    Prazuck, Thierry; Karon, Stephen; Gubavu, Camelia; Andre, Jerome; Legall, Jean Marie; Bouvet, Elisabeth; Kreplak, Georges; Teglas, Jean Paul; Pialoux, Gilles

    2016-01-01

    Background In 2013, the French Health Authority approved the use of HIV self-tests in pharmacies for the general public. This screening tool will allow an increase in the number of screenings and a reduction in the delay between infection and diagnosis, thus reducing the risk of further infections. We previously compared 5 HIV-self test candidates (4 oral fluid and one whole blood) and demonstrated that the whole blood HIV test exhibited the optimal level of performance (sensitivity/specificity). We studied the practicability of an easy-to-use finger-stick whole blood HIV self-test “autotest VIH®”, when used in the general public. Methods and Materials This multicenter cross-sectional study involved 411 participants from the Parisian region (AIDES and HF association) between April and July 2014 and was divided into 2 separate studies: one evaluating the capability of participants to obtain an interpretable result using only the information notice, and a second evaluating the interpretation of test results, using a provided chart. Results A total of 411 consenting participants, 264 in the first study and 147 in the second, were included. All participants were over 18 years of age. In the first study, 99.2% of the 264 participants correctly administered the auto-test, and 21.2% needed, upon their request, telephone assistance. Ninety-two percent of participants responded that the test was easy/very easy to perform, and 93.5% did not find any difficulty obtaining a sufficient good quantity of blood. In the second study, 98.1% of the 147 participants correctly interpreted the results. The reading/interpretation errors concerned the negative (2.1%) or the indeterminate (3.3%) auto-tests. Conclusions The success rate of handling and interpretation of this self-test is very satisfactory, demonstrating its potential for use by the general public and its utility to increase the number of opportunities to detect HIV patients. PMID:26882229

  2. Data reduction in the ITMS system through a data acquisition model with self-adaptive sampling rate

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, M. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain)], E-mail: mariano.ruiz@upm.es; Lopez, JM.; Arcas, G. de [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Barrera, E. [Departamento de Sistemas Electronicos y de Control, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Melendez, R. [Grupo de Investigacion en Instrumentacion y Acustica Aplicada, Universidad Politecnica de Madrid (UPM), Crta. Valencia Km-7, Madrid 28031 (Spain); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2008-04-15

    Long pulse or steady state operation of fusion experiments requires data acquisition and processing systems that reduce the volume of data involved. The availability of self-adaptive sampling rate systems and the use of real-time lossless data compression techniques can help solve these problems. The former is important for continuous adaptation of the sampling frequency to experimental requirements. The latter allows the maintenance of continuous digitization under limited memory conditions. This can be achieved by permanent transmission of compressed data to other systems. The compacted transfer ensures the use of minimum bandwidth. This paper presents an implementation based on the intelligent test and measurement system (ITMS), a data acquisition system architecture with multiprocessing capabilities that permits the system's sampling frequency to be adapted throughout the experiment. The sampling rate can be controlled depending on the experiment's specific requirements by using an external dc voltage signal or by defining user events through software. The system takes advantage of the high processing capabilities of the ITMS platform to implement a data reduction mechanism based on lossless data compression algorithms, which are themselves based on periodic deltas.
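    A minimal sketch of the kind of delta-based lossless compression the record refers to: consecutive samples are replaced by their differences, which stay small for slowly varying signals and therefore compress well. The encoding below is illustrative Python, not the actual ITMS implementation.

```python
import numpy as np

def delta_encode(samples):
    """Lossless periodic-delta encoding: keep the first sample,
    then store successive differences (small values for smooth signals)."""
    x = np.asarray(samples, dtype=np.int64)
    return x[0], np.diff(x)

def delta_decode(first, deltas):
    """Exact reconstruction of the original samples."""
    return np.concatenate(([first], first + np.cumsum(deltas)))

# Round-trip check on a slowly varying digitized signal.
signal = np.array([1000, 1002, 1003, 1003, 1005, 1004], dtype=np.int64)
first, deltas = delta_encode(signal)
assert np.array_equal(delta_decode(first, deltas), signal)
print(deltas)   # [ 2  1  0  2 -1] -- narrow value range, easy to entropy-code
```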

  3. The ACS Virgo Cluster Survey IV: Data Reduction Procedures for Surface Brightness Fluctuation Measurements with the Advanced Camera for Surveys

    CERN Document Server

    Mei, S; Tonry, J L; Jordan, A; Peng, E W; Côté, P; Ferrarese, L; Merritt, D; Milosavljevic, M; West, M J; Mei, Simona; Blakeslee, John P.; Tonry, John L.; Jordan, Andres; Peng, Eric W.; Cote, Patrick; Ferrarese, Laura; Merritt, David; Milosavljevic, Milos; West, Michael J.

    2005-01-01

    The Advanced Camera for Surveys (ACS) Virgo Cluster Survey is a large program to image 100 early-type Virgo galaxies using the F475W and F850LP bandpasses of the Wide Field Channel of the ACS instrument on the Hubble Space Telescope (HST). The scientific goals of this survey include an exploration of the three-dimensional structure of the Virgo Cluster and a critical examination of the usefulness of the globular cluster luminosity function as a distance indicator. Both of these issues require accurate distances for the full sample of 100 program galaxies. In this paper, we describe our data reduction procedures and examine the feasibility of accurate distance measurements using the method of surface brightness fluctuations (SBF) applied to the ACS Virgo Cluster Survey F850LP imaging. The ACS exhibits significant geometrical distortions due to its off-axis location in the HST focal plane; correcting for these distortions by resampling the pixel values onto an undistorted frame results in pixel correlations tha...

  4. Stellar Photometry of the Globular Cluster NGC 6229; 1, Data Reduction and Morphology of the Brighter Part of the CMD

    CERN Document Server

    Borisova, J; Spassova, N; Sweigart, A V

    1996-01-01

    BV CCD photometry of the central (1.5 arcmin x 2.0 arcmin) part of the mildly concentrated outer-halo globular cluster NGC 6229 is presented. The data reduction in such a crowded field was based on a wavelet transform analysis. Our larger dataset extends the previous results by Carney et al. (1991, AJ, 101, 1699) for the outer and less crowded fields of the cluster, and confirms that NGC 6229 has a peculiar color-magnitude diagram for its position in the Galaxy. In particular, NGC 6229's horizontal branch (HB) presents several interesting features, among which stand out: a well populated and very extended blue tail; a rather blue overall morphology, with (B-R)/(B+V+R) = 0.24+/-0.02; a bimodal color distribution, resembling those found for NGC 1851 and NGC 2808; and gaps on the blue HB. NGC 6229 is the first bimodal-HB cluster to be identified in the Galactic outer halo. A low value of the R parameter is confirmed, suggestive of a low helium abundance or of the presence of a quite substantial population of ext...

  5. ORBS, ORCS, OACS, a Software Suite for Data Reduction and Analysis of the Hyperspectral Imagers SITELLE and SpIOMM

    Science.gov (United States)

    Martin, T.; Drissen, L.; Joncas, G.

    2015-09-01

    SITELLE (installed in 2015 at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont-Mégantic) are the first Imaging Fourier Transform Spectrometers (IFTS) capable of obtaining a hyperspectral data cube which samples a 12-arcminute field of view into four million visible spectra. The result of each observation is made up of two interferometric data cubes which need to be merged, corrected, transformed and calibrated in order to get a spectral cube of the observed region ready to be analysed. ORBS is a fully automatic data reduction software that has been entirely designed for this purpose. The data size (up to 68 GB for larger science cases) and the computational needs have been challenging, and the highly parallelized object-oriented architecture of ORBS reflects the solutions adopted, which made it possible to process 68 GB of raw data in less than 11 hours using 8 cores and 22.6 GB of RAM. It is based on a core framework (ORB) that has been designed to support the whole software suite for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS). They all aim to provide a strong basis for the creation and development of specialized analysis modules that could benefit the scientific community working with SITELLE and SpIOMM.

  6. Development of the online data reduction system and feasibility studies of 6-layer tracking for the Belle II pixel detector

    Energy Technology Data Exchange (ETDEWEB)

    Muenchow, David

    2015-04-24

    The Belle II experiment, the upgrade of the Belle experiment, at KEK (High Energy Accelerator Research Organization) in Tsukuba, Japan, will be built to answer fundamental questions that are not covered by the Standard Model of particle physics. For this reason, decays should be observed with high precision. To be able to measure all decay products with a very accurate vertex resolution, it was decided to add a Pixel Detector (PXD) with an inner radius of only 14 mm, a short distance around the beam pipe (outer radius 12.5 mm). This increases the vertex resolution and makes it possible to improve the reconstruction efficiency and accuracy. Because of the short distance to the interaction point, we expect a background-induced occupancy of up to 3% on the pixel detector. This generates an expected data rate of about 20 GB/s, which exceeds the bandwidth limitations of the data storage. Based on hits in the outer detectors, back projections of particle tracks are performed and Regions of Interest (ROIs) on the PXD sensors are calculated. Based on those ROIs the data are reduced. In this thesis I present my development of the ROI-based data reduction algorithm as well as my feasibility studies of a future 6-layer tracking. Online data reduction for Belle II: a first test with the whole DAQ integration and prototype sensors of PXD and SVD was performed at DESY. For the verification of the ROI selection logic, a full recording of input and output data was included. With this setup I recorded 1.2×10^6 events containing in total 4.8×10^8 hits. The occupancy of originally ∼0.80% was reduced with my ROI selection logic by a factor of 6.9 to ∼0.12% by rejecting all hits outside any ROI. In addition I investigated the ROI positioning and found a mean distance between ROI center and hit of 17.624±0.029, with main offset directions of π/2 and 3π/2. With a more accurate position of the ROIs their size could be reduced, which would optimize the
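    The ROI selection step can be illustrated schematically: a pixel hit is kept only if it falls inside at least one Region of Interest projected onto its sensor. The Python data structures below are invented for the illustration; the real selection logic runs in the online data-reduction hardware, not in Python.

```python
from dataclasses import dataclass

@dataclass
class ROI:
    sensor_id: int
    row_min: int
    row_max: int
    col_min: int
    col_max: int

    def contains(self, sensor_id, row, col):
        return (sensor_id == self.sensor_id
                and self.row_min <= row <= self.row_max
                and self.col_min <= col <= self.col_max)

def reduce_event(hits, rois):
    """Keep only pixel hits that lie inside at least one ROI.

    hits: iterable of (sensor_id, row, col) tuples from the pixel detector.
    rois: ROIs extrapolated from tracks found in the outer detectors."""
    return [h for h in hits if any(roi.contains(*h) for roi in rois)]

rois = [ROI(sensor_id=3, row_min=100, row_max=140, col_min=20, col_max=60)]
hits = [(3, 120, 30), (3, 500, 30), (7, 120, 30)]
print(reduce_event(hits, rois))   # only (3, 120, 30) survives
```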

  7. Risk factors and trends in attempting or committing suicide in Dutch general practice in 1983–2009 and tools for early recognition.

    NARCIS (Netherlands)

    Donker, G.A.; Wolters, I.; Schellevis, F.

    2010-01-01

    Background: Many patients visit their general practitioner (GP) before attempting or committing suicide. This study analyses determinants and trends of suicidal behaviour to enhance early recognition of risk factors in general practice. Method: Analysis of trends, patient and treatment

  8. A deep survey of the AKARI north ecliptic pole field : I. WSRT 20 cm radio survey description, observations and data reduction

    NARCIS (Netherlands)

    White, G. J.; Pearson, C.; Braun, R.; Serjeant, S.; Matsuhara, H.; Takagi, T.; Nakagawa, T.; Shipman, R.; Barthel, P.; Hwang, N.; Lee, H. M.; Lee, M. G.; Im, M.; Wada, T.; Oyabu, S.; Pak, S.; Chun, M. -Y.; Hanami, H.; Goto, T.; Oliver, S.

    2010-01-01

    Aims. The Westerbork Radio Synthesis Telescope, WSRT, has been used to make a deep radio survey of an similar to 1.7 degree(2) field coinciding with the AKARI north ecliptic pole deep field. The observations, data reduction and source count analysis are presented, along with a description of the ove

  10. Software engineering tools.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  11. Improvements in Precise and Accurate Isotope Ratio Determination via LA-MC-ICP-MS by Application of an Alternative Data Reduction Protocol

    Science.gov (United States)

    Fietzke, J.; Liebetrau, V.; Guenther, D.; Frische, M.; Zumholz, K.; Hansteen, T. H.; Eisenhauer, A.

    2008-12-01

    An alternative approach for the evaluation of isotope ratio data using LA-MC-ICP-MS will be presented. In contrast to previously applied methods it is based on the simultaneous responses of all analyte isotopes of interest and the relevant interferences, without performing a conventional background correction. Significant improvements in precision and accuracy can be achieved when applying this new method and will be discussed based on the results of the first two methodical applications: a) radiogenic and stable Sr isotopes in carbonates; b) stable chlorine isotopes of pyrohydrolytic extracts. In carbonates an external reproducibility of the 87Sr/86Sr ratios of about 19 ppm (RSD) was achieved, an improvement of about a factor of 5. For recent and sub-recent marine carbonates a mean radiogenic strontium isotope ratio 87Sr/86Sr of 0.709170±0.000007 (2SE) was determined, which agrees well with the value of 0.7091741±0.0000024 (2SE) reported for modern sea water [1,2]. Stable chlorine isotope ratios were determined ablating pyrohydrolytic extracts with a reproducibility of about 0.05‰ (RSD). For the basaltic reference materials JB-1a and JB-2, chlorine isotope ratios were determined relative to SMOC (standard mean ocean chlorinity): δ37ClJB-1a = (-0.99±0.06) ‰ and δ37ClJB-2 = (-0.60±0.03) ‰ (SD), respectively, in accordance with published data [3]. The described strategies for data reduction are considered to be generally applicable for all isotope ratio measurements using LA-MC-ICP-MS. [1] J.M. McArthur, D. Rio, F. Massari, D. Castradori, T.R. Bailey, M. Thirlwall, S. Houghton, Palaeogeo. Palaeoclim. Palaeoeco., 2006, 242 (126), doi: 10.1016/j.palaeo.2006.06.004 [2] J. Fietzke, V. Liebetrau, D. Guenther, K. Guers, K. Hametner, K. Zumholz, T.H. Hansteen and A. Eisenhauer, J. Anal. At. Spectrom., 2008, 23, 955-961, doi:10.1039/B717706B [3] J. Fietzke, M. Frische, T.H. Hansteen and A. Eisenhauer, J. Anal. At. Spectrom., 2008, 23, 769-772, doi:10.1039/B718597A
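    One way to realize the stated idea of using the simultaneous responses of all isotopes without a conventional background correction is to regress the intensity of one isotope against the other over the whole ablation interval: the slope estimates the isotope ratio, while a constant background only shifts the intercept. The sketch below illustrates that principle on synthetic data; it is not the authors' reduction code, and their interference handling is omitted.

```python
import numpy as np

def ratio_from_slope(intensity_86, intensity_87):
    """Estimate the 87Sr/86Sr ratio as the slope of a linear fit of the
    simultaneously recorded 87 signal against the 86 signal.
    A constant background on either channel moves the intercept,
    not the slope, so no separate background subtraction is needed."""
    slope, intercept = np.polyfit(intensity_86, intensity_87, 1)
    return slope

# Synthetic ablation transient: true ratio 0.7092, constant backgrounds added.
rng = np.random.default_rng(1)
i86 = np.linspace(0.1, 2.0, 500) + 0.02            # signal plus its background
i87 = 0.7092 * (i86 - 0.02) + 0.015                # scaled signal plus its own background
i87 += rng.normal(0, 1e-4, i86.size)               # small random scatter
print(round(ratio_from_slope(i86, i87), 4))        # ~0.7092 despite the backgrounds
```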

  12. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes ... one-dimensional data in terms of the indirect Fourier transform using the objective Bayesian approach to obtain the pair-distance distribution function (PDDF), and is thereby a free and open-source alternative to existing PDDF estimation software. Apart from the TIFF input format, the program also accepts ASCII ...

  13. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    Tools for design management are on the agenda in building projects in order to set targets, to choose and prioritise between alternative environmental solutions, to involve stakeholders and to document, evaluate and benchmark. Different types of tools are available, but what can we learn from the use or lack of use of current tools in the development of future design tools for sustainable buildings? Why are some used while others are not? Who is using them? The paper deals with design management, with special focus on sustainable building in Denmark, and the challenge of turning the generally vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...

  14. Operational Data Reduction Procedure for Determining Density and Vertical Structure of the Martian Upper Atmosphere from Mars Global Surveyor Accelerometer Measurements

    Science.gov (United States)

    Cancro, George J.; Tolson, Robert H.; Keating, Gerald M.

    1998-01-01

    The success of aerobraking by the Mars Global Surveyor (MGS) spacecraft was partly due to the analysis of MGS accelerometer data. Accelerometer data was used to determine the effect of the atmosphere on each orbit, to characterize the nature of the atmosphere, and to predict the atmosphere for future orbits. To interpret the accelerometer data, a data reduction procedure was developed to produce density estimations utilizing inputs from the spacecraft, the Navigation Team, and pre-mission aerothermodynamic studies. This data reduction procedure was based on the calculation of aerodynamic forces from the accelerometer data by considering acceleration due to gravity gradient, solar pressure, angular motion of the MGS, instrument bias, thruster activity, and a vibration component due to the motion of the damaged solar array. Methods were developed to calculate all of the acceleration components including a 4 degree of freedom dynamics model used to gain a greater understanding of the damaged solar array. The total error inherent to the data reduction procedure was calculated as a function of altitude and density considering contributions from ephemeris errors, errors in force coefficient, and instrument errors due to bias and digitization. Comparing the results from this procedure to the data of other MGS Teams has demonstrated that this procedure can quickly and accurately describe the density and vertical structure of the Martian upper atmosphere.
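    The last step of such a reduction rests on the drag equation: once all non-aerodynamic accelerations have been removed, the remaining aerodynamic acceleration yields the density directly. The sketch below shows only that final conversion, with invented spacecraft values; the gravity-gradient, solar-pressure, thruster and solar-array-vibration corrections described above are assumed to have been applied already.

```python
def density_from_drag(a_drag, velocity, mass, c_d, area):
    """Atmospheric density from the measured aerodynamic acceleration,
    via the drag equation  a = 0.5 * rho * v^2 * C_d * A / m.

    a_drag   : aerodynamic (drag) acceleration, m/s^2
    velocity : spacecraft speed relative to the atmosphere, m/s
    mass     : spacecraft mass, kg
    c_d      : drag (force) coefficient, dimensionless
    area     : reference area, m^2
    """
    return 2.0 * mass * a_drag / (c_d * area * velocity ** 2)

# Illustrative numbers only (not actual MGS values).
rho = density_from_drag(a_drag=2.0e-3, velocity=4700.0, mass=760.0, c_d=2.0, area=17.0)
print(f"{rho:.2e} kg/m^3")   # order 1e-9 kg/m^3 for these illustrative inputs
```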

  15. A GIS-based decision support tool for renewable energy management and planning in semi-arid rural environments of northeast of Brazil-general description and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Tiba, Chigueru; Fraidenraich, Naum; Souza Barbosa, Elielza Moura de [Dept. de Energia Nuclear da Univ. Federal de Pernambuco Recife, Pernambuco (Brazil); Bezerra Candeias, Ana Lucia [Dept. de Engenharia Cartografica da Univ. Federal de Pernambuco Recife, Pernambuco (Brazil); Carvalho Neto, Pedro Bezerra de; Melo Filho, Jose Bione de [Companhia Hidro Electrica do Sao Francisco -DTG- CHESF Recife, Pernambuco (Brazil)

    2008-07-01

    This work describes the development of a system of management and planning elaborated on a GIS platform (Geographic Information System) and is directed to decision makers, be they administrators, planners or consultants in renewable energy. It was conceived to deal with the management and planning of photovoltaic solar systems, biomass and aeolic energy in rural regions of the Northeast of Brazil. The prototype of the GIS tool covers an area of 183,500 km² (in a second phase it will be extended to 1,561,178 km²) and it is made up of two main blocks: management of installed renewable energy systems and planning of insertion of new renewable energy systems. The system was mainly developed for PV systems as a possible tool of support for the management and planning of the Program of Energetic Development of the States and Municipalities (PRODEEM), a program of insertion of large scale photovoltaic solar energy (thousands of systems), in the rural environment, directed by the Ministry of Mines and Energy of Brazil. The tool for the management of the photovoltaic systems permits the execution of the technical analyses that are involved in the different combinations of the following layers of information: PV systems installed for the purpose of application (energetic, water pumping and others), identification of systems components, identification of component failures, maintenance and training posts, map of monthly mean solar irradiation, infrastructure access, electric transmission lines and distance from a given municipality to the nearest maintenance post. All the maps above can be combined in such a way that information at state or municipal level can be obtained. Additionally, in all the circumstances above, resulting thematic maps as well as issue reports can be printed. The tool for planning the insertion of new photovoltaic systems permits the execution of the technical analyses that are involved in the different combinations of the following layers

  16. Rasch-based high-dimensionality data reduction and class prediction with applications to microarray gene expression data

    CERN Document Server

    Kastrin, Andrej

    2010-01-01

    Class prediction is an important application of microarray gene expression data analysis. The high dimensionality of microarray data, where the number of genes (variables) is very large compared to the number of samples (observations), makes the application of many prediction techniques (e.g., logistic regression, discriminant analysis) difficult. An efficient way to solve this problem is by using dimension reduction statistical techniques. Increasingly used in psychology-related applications, the Rasch model (RM) provides an appealing framework for handling high-dimensional microarray data. In this paper, we study the potential of RM-based modeling in dimensionality reduction with binarized microarray gene expression data and investigate its prediction accuracy in the context of class prediction using linear discriminant analysis. Two different publicly available microarray data sets are used to illustrate a general framework of the approach. Performance of the proposed method is assessed by re-randomization s...

  17. The Telemedicine benchmark--a general tool to measure and compare the performance of video conferencing equipment in the telemedicine area.

    Science.gov (United States)

    Klutke, P J; Mattioli, P; Baruffaldi, F; Toni, A; Englmeier, K H

    1999-09-01

    In this paper, we describe the 'Telemedicine Benchmark' (TMB), which is a set of standard procedures, protocols and measurements to test reliability and levels of performance of data exchange in a telemedicine session. We have put special emphasis on medical imaging, i.e. digital image transfer, joint viewing and editing and 3D manipulation. With the TMB, we can compare the aptitude of different video conferencing software systems for telemedicine issues and the effect of different network technologies (ISDN, xDSL, ATM, Ethernet). The evaluation criteria used are length of delays and functionality. For the application of the TMB, a data set containing radiological images and medical reports was set up. Considering the Benchmark protocol, this data set has to be exchanged between the partners of the session. The Benchmark covers file transfer, whiteboard usage, application sharing and volume data analysis and compression. The TMB has proven to be a useful tool in several evaluation issues.

  18. Research on a tool for organizational culture assessment of public general hospitals in China

    Institute of Scientific and Technical Information of China (English)

    周萍; 常继乐; 黄金星; 李岚; 唐智柳; 黄葭燕; 伍蓉; 薛迪

    2011-01-01

    Objective: To scientifically evaluate hospital culture using the Denison organizational culture model, in view of the features of China's public general hospitals. Methods: Based on the Denison model and the characteristics of public general hospitals in China, the authors developed a tool for organizational culture assessment (TOCA) using survey data from 87 hospitals in three provinces in the East, Central and West areas of China. Results: The tool, an evaluation scale, comprises the four cultural traits of direction, consistency, participation and adaptability, as well as 13 cultural dimensions, including social responsibility and competitive consciousness. The tool was tested as having good internal reliability and validity. Conclusion: The TOCA provides hospital administrators with a tool for hospital culture evaluation, diagnosis and improvement.

  19. Nutritional screening tools application in a general hospital: a comparative study

    Directory of Open Access Journals (Sweden)

    Janaína Damasceno Bezerra

    2012-05-01

    Introduction: There are many nutritional screening tools, and it is difficult to choose which one is best for clinical nutrition practice. Objective: To compare five nutritional screening tools (MST, NRS-2002, MUST, MNA and MNA-SF) in hospitalized adults and elderly patients. Materials and Methods: A cross-sectional study was performed in which the nutritional screening tools were applied to adult and elderly patients within the first 48 hours of hospitalization. The occurrence of nutritional risk in adult and elderly patients was compared. Statistical analyses were performed using descriptive data and a non-parametric test (Mann-Whitney). Results: We evaluated 77 patients, 51 (66.2%) adults and 26 (33.8%) elderly, aged 53.6 (standard deviation 17.9) years, with female predominance (53.2%). The main reasons for hospitalization were neoplasia and nephrolithotripsy. Overall, one quarter of the patients were at nutritional risk. Nutritional risk in adults was detected with similar frequency by MUST and MST, but was underestimated by NRS-2002. The MNA and MNA-SF, used exclusively for the elderly, also gave similar results in detecting nutritional risk. Regarding application time, the MNA was the instrument with the longest application time. Conclusion: Considering the higher detection of patients at nutritional risk, the ease of use and the shorter application time, we suggest MUST and MNA-SF, respectively, for adult and elderly patients admitted to this hospital.

  20. The perception of the clinical relevance of the MDS-Home Care(C) tool by trainers in general practice in Belgium.

    Science.gov (United States)

    Duyver, Corentin; Van Houdt, Sabine; De Lepeleire, Jan; Dory, Valerie; Degryse, Jean-Marie

    2010-12-01

    Comprehensive geriatric assessment has been advocated as an effective way to first identify multidimensional needs and second to establish priorities for organizing an individual health care plan for community-dwelling elderly. This paper reports on the perception of an internationally evaluated assessment system for use in community care programmes, the Minimal Data Set-Home Care (MDS-HC), by a group of experienced GP trainers. The primary study aim was to determine the perception of a standardized home care assessment system (MDS-HC) by GP trainers in terms of acceptability, perceived clinical relevance, care planning empowerment and valorization of the GP. Sixty-five first-year GP trainees were educated about the MDS-HC and the use of a first version of an electronic interface. Each trainee included two elderly patients, based on strict inclusion criteria. Prior to the assessment, GP trainers and trainees were invited to complete together a basic medical record on the basis of their knowledge of the included patients. Next, the collected data, covering the multiple domains of the MDS-HC, were entered into the electronic interface by the trainee. Based on the collected data for each patient, a series of clinical assessment protocols (CAPs) were generated. Afterwards, these CAPs were critically discussed with the trainer. To investigate how the application of the MDS-HC was perceived, a 21-item Likert-type scale was drawn up based on four dimensions regarding the tool. The perception questionnaire had a good internal consistency (Cronbach's alpha 0.93). The first version of the electronic interface was considered not 'user-friendly' enough and the introduction of data time-consuming. The GPs perceived the overall clinical relevance of the MDS-HC as adding little value for the GP in the establishment of a personal management plan. Many developments in health care result in an increasing demand for a standardized home care assessment

  1. Human bones obtained from routine joint replacement surgery as a tool for studies of plutonium, americium and 90Sr body-burden in general public

    Energy Technology Data Exchange (ETDEWEB)

    Mietelski, Jerzy W., E-mail: jerzy.mietelski@ifj.edu.pl [Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences, Radzikowskiego 152, 31-342 Cracow (Poland); Golec, Edward B. [Traumatology and Orthopaedic Clinic, 5th Military Clinical Hospital and Polyclinic, Independent Public Healthcare Facility, Wroclawska 1-3, 30-901 Cracow (Poland); Orthopaedic Rehabilitation Department, Chair of Clinical Rehabilitation, Faculty of Motor of the Bronislaw Czech's Academy of Physical Education, Cracow (Poland); Department of Physical Therapy Basics, Faculty of Physical Therapy, Administration College, Bielsko-Biala (Poland); Tomankiewicz, Ewa [Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences, Radzikowskiego 152, 31-342 Cracow (Poland); Golec, Joanna [Orthopaedic Rehabilitation Department, Chair of Clinical Rehabilitation, Faculty of Motor of the Bronislaw Czech's Academy of Physical Education, Cracow (Poland); Physical Therapy Department, Institute of Physical Therapy, Faculty of Health Science, Jagiellonian University, Medical College, Cracow (Poland); Nowak, Sebastian [Traumatology and Orthopaedic Clinic, 5th Military Clinical Hospital and Polyclinic, Independent Public Healthcare Facility, Wroclawska 1-3, 30-901 Cracow (Poland); Orthopaedic Rehabilitation Department, Chair of Clinical Rehabilitation, Faculty of Motor of the Bronislaw Czech's Academy of Physical Education, Cracow (Poland); Szczygiel, Elzbieta [Physical Therapy Department, Institute of Physical Therapy, Faculty of Health Science, Jagiellonian University, Medical College, Cracow (Poland); Brudecki, Kamil [Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences, Radzikowskiego 152, 31-342 Cracow (Poland)

    2011-06-15

    The paper presents a new sampling method for studying in-body radioactive contamination by bone-seeking radionuclides such as 90Sr, 239+240Pu, 238Pu, 241Am and selected gamma-emitters in human bones. The presented results were obtained for samples retrieved from routine surgeries, namely knee or hip joint replacements with implants, performed on individuals from Southern Poland. This allowed representative sets of general-public samples to be collected. The applied analytical radiochemical procedure for the bone matrix is described in detail. Due to the low concentrations of 238Pu, the Pu isotope ratio that might be used for identifying the Pu source was obtained only as upper limits, so an origin of the Pu other than global fallout (for example Chernobyl) could not be assessed. Calculated concentrations of radioisotopes are comparable to the existing data from post-mortem studies on human bones retrieved from autopsy or exhumations. Human bones removed during knee or hip joint surgery provide a simple and ethical way of obtaining samples for plutonium, americium and 90Sr in-body contamination studies in the general public. - Highlights: > Surgery for joint replacement as a novel sampling method for studying in-body radioactive contamination. > The proposed way of sampling does not raise ethical doubts. > It is a convenient way of collecting human bone samples from the global population. > The applied analytical radiochemical procedure for the bone matrix is described in detail. > Opposite patient-age correlation trends were found for 90Sr (negative) and for Pu and Am (positive).

  2. [Leather bags production: organization study, general identification of hazards, biomechanical overload risk pre-evaluation using an easily applied evaluation tool].

    Science.gov (United States)

    Montomoli, Loretta; Coppola, Giuseppina; Sarrini, Daniela; Sartorelli, P

    2011-01-01

    Craft industries are the backbone of the Italian manufacturing system, and in this sector the leather trade plays a crucial role. The aim of the study was to experiment with a risk pre-mapping data sheet in leather bag manufacture by analyzing the production cycle. The prevalence of biomechanical, organizational and physical factors was demonstrated in tanneries. With regard to chemical agents, the lack of any priority of intervention could be due to the lack of information on the chemicals used. In the 2 enterprises that used mechanical processes, the results showed different priorities for intervention and different extents of the intervention required. In particular, in the first enterprise biomechanical overload was a top priority, while in the second the results were very similar to those of the tannery. The analysis showed in both companies that there was a high prevalence of risk of upper limb biomechanical overload in leather bag manufacture. Chemical risk assessment was not shown as a priority because the list of chemicals used was neither complete nor sufficient. The risk pre-mapping data sheet allowed us to obtain a preliminary overview of all the major existing risks in the leather industry. Therefore the method can prove a useful tool for employers as it permits instant identification of priorities for intervention for the different risks.

  3. Tools and their uses

    CERN Document Server

    1973-01-01

    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  4. Case and Administrative Support Tools

    Data.gov (United States)

    U.S. Environmental Protection Agency — Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool used to help...

  5. The potential of Virtual Reality as anxiety management tool: a randomized controlled study in a sample of patients affected by Generalized Anxiety Disorder

    Directory of Open Access Journals (Sweden)

    Gorini Alessandra

    2008-05-01

    Background: Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by a constant and unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point out the necessity to find new efficient strategies to treat it. Together with the cognitive-behavioural treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation that it is hard to learn. To overcome this limitation we propose the use of virtual reality (VR) to facilitate the relaxation process by visually presenting key relaxing images to the subjects. The visual presentation of a virtual calm scenario can facilitate patients' practice and mastery of relaxation, making the experience more vivid and real than the one that most subjects can create using their own imagination and memory, and triggering a broad empowerment process within the experience induced by a high sense of presence. According to these premises, the aim of the present study is to investigate the advantages of using a VR-based relaxation protocol in reducing anxiety in patients affected by GAD. Methods/Design: The trial is based on a randomized controlled study, including three groups of 25 patients each (for a total of 75 patients): (1) the VR group, (2) the non-VR group and (3) the waiting list (WL) group. Patients in the VR group will be taught to relax using a VR relaxing environment and audio-visual mobile narratives; patients in the non-VR group will be taught to relax using the same relaxing narratives proposed to the VR group, but without the VR support; and patients in the WL group will not receive any kind of relaxation training. Psychometric and psychophysiological outcomes will serve as quantitative dependent variables, while subjective reports of participants will be used as qualitative dependent variables. Conclusion: We argue that the use of VR for relaxation

  6. Research on General Error Modeling and Instruction Correction Method for Multi-axis CNC Machine Tools

    Institute of Scientific and Technical Information of China (English)

    范晋伟; 王晓峰; 李云

    2013-01-01

    How to improve the machining accuracy of multi-axis CNC machine tools with the least investment is a focus of current research. To address this problem, modifying the NC instructions of the machine to improve its accuracy is the most effective method. The multi-axis CNC machine tool was analyzed with multi-body theory and a general error model of multi-axis CNC machine tools was established. On this basis, methods for solving the precise NC instructions for both the translational and the rotary axes were given, with a detailed study of the rotary error. Taking a C-A type five-axis CNC machine tool as an example, an iterative solution method was used to obtain the precise NC instruction values for the rotary angles.
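    A common way to realize a multi-body error model of this kind is to chain 4x4 homogeneous transforms, one nominal motion and one small-error matrix per axis, and then iterate the commanded positions until the modelled tool point matches the target. The two-axis chain below is a deliberately simplified sketch of that idea, with invented error values, not the model from the paper.

```python
import numpy as np

def translation(x, y, z):
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t

def small_error(dx, dy, dz, ex, ey, ez):
    """First-order homogeneous error matrix: small translations dx..dz
    and small rotations ex..ez (radians) about X, Y, Z."""
    e = np.eye(4)
    e[:3, :3] += np.array([[0, -ez, ey], [ez, 0, -ex], [-ey, ex, 0]])
    e[:3, 3] = [dx, dy, dz]
    return e

def actual_tool_point(cmd_x, cmd_y, err_x, err_y, tool=np.array([0, 0, 0, 1.0])):
    """Chain nominal motions and their error matrices along the kinematic chain."""
    chain = translation(cmd_x, 0, 0) @ err_x @ translation(0, cmd_y, 0) @ err_y
    return (chain @ tool)[:3]

# Invented geometric errors of the X and Y axes.
err_x = small_error(5e-3, 1e-3, 0, 0, 0, 2e-5)
err_y = small_error(0, -2e-3, 0, 0, 0, 0)
target = np.array([100.0, 50.0, 0.0])

# Naive correction loop: adjust the commands until the modelled tool point hits the target.
cmd = target[:2].copy()
for _ in range(10):
    residual = target - actual_tool_point(*cmd, err_x, err_y)
    cmd += residual[:2]
print(cmd)   # corrected X/Y commands compensating the modelled errors
```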

  7. Quick Prostate Test (QPT): Motion for a tool for the active contribution of the general practitioner to the diagnosis and follow up of benign prostatic hyperplasia

    Directory of Open Access Journals (Sweden)

    Giuseppe Albino

    2014-12-01

    Introduction: Less than 40% of men with LUTS consult their doctor. Patients consider LUTS physiological and are resigned to enduring them. It is necessary to foster awareness of micturition disorders, to monitor their development and to assess the effectiveness of therapies. At present the only validated test is the IPSS-Q8, but in Italy it is used by only 4% of General Practitioners (GPs). Because the IPSS is complex and not easy to handle, a simpler but still efficient test is needed. The Italian Society of Urology (SIU) and the Italian Society for Interdisciplinary Primary Care (SIICP) presented the "Quick Prostate Test" (QPT) in November 2012. We aimed to evaluate the efficiency of the QPT versus the IPSS-Q8 and its suitability in primary care. Materials and Methods: The QPT is composed of 3 questions to be answered "yes" or "no"; a "yes" answer to even one question makes the test positive. We enrolled 64 men, ≥ 50 years old, suffering from BPH, extracted from the database of five GPs. The patients were randomized into two arms: in arm 1 only the QPT was administered, to verify the efficiency of the test; in arm 2 both the QPT and the IPSS-Q8 were administered. Results: In arm 1, 96.4% tested positive on the QPT. In arm 2, 89% of patients with one or two positive responses to the QPT showed a moderate IPSS-Q8 score; 75% of the patients with three positive responses to the QPT had a severe IPSS-Q8 score. The GPs (80%) expressed the highest level of satisfaction with the QPT for its "time of administration" and "simplicity". Conclusions: The experience with the QPT has shown that the test is efficient and suitable in the primary care setting. We want to encourage GPs to use the QPT for the monitoring of patients with lower urinary tract symptoms (LUTS) and to contribute to the validation of the test.

  8. Generalized Free-Surface Effect and Random Vibration Theory: a new tool for computing moment magnitudes of small earthquakes using borehole data

    Science.gov (United States)

    Malagnini, Luca; Dreger, Douglas S.

    2016-07-01

    Although optimal, computing the moment tensor solution is not always a viable option for the calculation of the size of an earthquake, especially for small events (say, below Mw 2.0). Here we show an alternative approach to the calculation of the moment-rate spectra of small earthquakes, and thus of their scalar moments, that uses a network-based calibration of crustal wave propagation. The method works best when applied to a relatively small crustal volume containing both the seismic sources and the recording sites. In this study we present the calibration of the crustal volume monitored by the High-Resolution Seismic Network (HRSN), along the San Andreas Fault (SAF) at Parkfield. After the quantification of the attenuation parameters within the crustal volume under investigation, we proceed to the spectral correction of the observed Fourier amplitude spectra for the 100 largest events in our data set. Multiple estimates of seismic moment for all events (1811 events in total) are obtained by calculating the ratio of rms-averaged spectral quantities based on the peak values of the ground velocity in the time domain, as they are observed in narrowband-filtered time series. The mathematical operations allowing the described spectral ratios are obtained from Random Vibration Theory (RVT). Due to the optimal conditions of the HRSN, in terms of signal-to-noise ratios, our network-based calibration allows the accurate calculation of seismic moments down to Mw < 0. However, because the HRSN is equipped only with borehole instruments, we define a frequency-dependent Generalized Free-Surface Effect (GFSE), to be used instead of the usual free-surface constant F = 2. Our spectral corrections at Parkfield need a different GFSE for each side of the SAF, which can be quantified by means of the analysis of synthetic seismograms. The importance of the GFSE of borehole instruments increases with decreasing earthquake size because for smaller earthquakes the bandwidth available
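    At the core of the RVT step is Parseval's relation: the rms of a narrowband-filtered record follows directly from its Fourier amplitude spectrum, and the expected time-domain peak is that rms times a duration-dependent peak factor. The sketch below shows only this elementary conversion with a made-up spectrum; the network calibration, spectral corrections and GFSE discussed above are not included.

```python
import numpy as np

def band_rms_from_spectrum(freqs, fourier_amp, f_lo, f_hi, duration):
    """RMS of the record restricted to [f_lo, f_hi], computed from its
    one-sided Fourier amplitude spectrum via Parseval's theorem:
        rms^2 = (2 / T) * integral_{f_lo}^{f_hi} |A(f)|^2 df
    (freqs assumed uniformly spaced)."""
    df = freqs[1] - freqs[0]
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sqrt(2.0 * np.sum(fourier_amp[band] ** 2) * df / duration)

# Made-up smooth ground-velocity spectrum and a 2 s effective signal duration.
f = np.linspace(0.5, 50.0, 1000)
amp = 1e-4 * f / (1.0 + (f / 8.0) ** 2)        # illustrative spectral shape
rms = band_rms_from_spectrum(f, amp, 4.0, 8.0, duration=2.0)

# RVT relates this rms to the expected time-domain peak of the
# narrowband-filtered trace through a peak factor (order 1.5-3).
peak_factor = 2.0                               # illustrative value only
print(f"band rms = {rms:.3e}, expected peak ~ {peak_factor * rms:.3e}")
```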

  9. H2S data reduction

    Science.gov (United States)

    Deboer, David

    1993-01-01

    Calculating microwave opacity from a weakly absorbing gas mixture using a resonator requires measuring the quality factor of that resonator, which necessitates accurately determining the center frequency (f0) and the half-power bandwidth (Δf) of a noisy resonant line. The center frequency can be determined very accurately and varies very little over many measurements (a few kHz at GHz frequencies or a few hundredths of a percent). The greater source of error in estimating the Q of a resonator comes from the bandwidth measurements. The half-power bandwidth is determined essentially by eye-fitting a curve over a noisy resonant line and measuring with a spectrum analyzer.
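    The quantities described above combine in a few lines: the quality factor is the centre frequency divided by the half-power bandwidth, and the gas contribution follows from the difference of inverse Q values measured with and without the gas. The conversion to dB/km below uses a common low-loss approximation, and all numbers are invented; the calibration details of the actual experiment are not reproduced.

```python
import math

C = 299792458.0   # speed of light, m/s

def quality_factor(f_center_hz, half_power_bw_hz):
    """Q of a resonance: centre frequency divided by half-power bandwidth."""
    return f_center_hz / half_power_bw_hz

def opacity_db_per_km(f_center_hz, q_with_gas, q_vacuum):
    """Gas opacity from the change in resonator Q, using the low-loss
    approximation alpha [Np/m] ~ (pi / lambda) * (1/Q_gas - 1/Q_vacuum)."""
    wavelength_m = C / f_center_hz
    alpha_np_per_m = (math.pi / wavelength_m) * (1.0 / q_with_gas - 1.0 / q_vacuum)
    return alpha_np_per_m * 8.686 * 1000.0   # Np/m -> dB/km

# Invented example: a 10 GHz resonance broadened slightly by the absorbing gas.
q_vac = quality_factor(10.0e9, 250.0e3)   # reference measurement without the gas
q_gas = quality_factor(10.0e9, 252.0e3)   # slightly wider line with the gas mixture
print(round(opacity_db_per_km(10.0e9, q_gas, q_vac), 3), "dB/km")
```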

  10. Improved Data Reduction Algorithm for the Needle Probe Method Applied to In-Situ Thermal Conductivity Measurements of Lunar and Planetary Regoliths

    Science.gov (United States)

    Nagihara, S.; Hedlund, M.; Zacny, K.; Taylor, P. T.

    2013-01-01

    The needle probe method (also known as the 'hot wire' or 'line heat source' method) is widely used for in-situ thermal conductivity measurements on soils and marine sediments on the earth. Variants of this method have also been used (or planned) for measuring regolith on the surfaces of extra-terrestrial bodies (e.g., the Moon, Mars, and comets). In the near-vacuum condition on the lunar and planetary surfaces, the measurement method used on the earth cannot be simply duplicated, because thermal conductivity of the regolith can be approximately 2 orders of magnitude lower. In addition, the planetary probes have much greater diameters, due to engineering requirements associated with the robotic deployment on extra-terrestrial bodies. All of these factors contribute to the planetary probes requiring a much longer measurement time, several tens of (if not over a hundred) hours, while a conventional terrestrial needle probe needs only 1 to 2 minutes. The long measurement time complicates the surface operation logistics of the lander. It also negatively affects the accuracy of the thermal conductivity measurement, because the cumulative heat loss along the probe is no longer negligible. The present study improves the data reduction algorithm of the needle probe method by shortening the measurement time on planetary surfaces by an order of magnitude. The main difference between the new scheme and the conventional one is that the former uses the exact mathematical solution to the thermal model on which the needle probe measurement theory is based, while the latter uses an approximate solution that is valid only for large times. The present study demonstrates the benefit of the new data reduction technique by applying it to data from a series of needle probe experiments carried out in a vacuum chamber on JSC-1A lunar regolith simulant. The use of the exact solution has some disadvantage, however, in requiring three additional parameters, but two of them (the diameter and the
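    The conventional reduction that the new scheme improves on is the large-time approximation of the line-heat-source solution, T(t) ≈ (q / 4πk)·ln(t) + const, so conductivity follows from the slope of temperature against ln(time). The sketch below implements only that conventional step on synthetic data; the improved algorithm instead fits the exact solution (involving the exponential integral), which is not reproduced here.

```python
import numpy as np

def conductivity_from_late_time(t_s, temp_c, q_w_per_m, t_min_s):
    """Thermal conductivity from the large-time ('conventional') needle-probe
    approximation T(t) ~ (q / (4*pi*k)) * ln(t) + const.
    Only samples with t >= t_min_s are used, where the approximation holds."""
    late = t_s >= t_min_s
    slope, _ = np.polyfit(np.log(t_s[late]), temp_c[late], 1)
    return q_w_per_m / (4.0 * np.pi * slope)

# Synthetic heating curve: a probe dissipating 0.5 W/m in a k = 0.02 W/(m K) medium
# (roughly the low conductivity expected for regolith under vacuum).
k_true, q = 0.02, 0.5
t = np.linspace(10.0, 3600.0, 500)                     # s
temp = 20.0 + q / (4.0 * np.pi * k_true) * np.log(t)   # idealised large-time behaviour
print(conductivity_from_late_time(t, temp, q, t_min_s=600.0))   # ~0.02
```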

  11. Statistical Tools for Forensic Analysis of Toolmarks

    Energy Technology Data Exchange (ETDEWEB)

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.
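    One generic way to turn an image-based toolmark comparison into a numeric "degree of association" is a normalized cross-correlation of two extracted profiles, maximized over their relative shift. The sketch below uses that standard similarity measure purely as a stand-in; it is not the statistical methodology developed in the project.

```python
import numpy as np

def degree_of_association(profile_a, profile_b, max_shift=50):
    """Maximum normalized cross-correlation between two 1-D toolmark profiles
    over a range of relative shifts; values near 1.0 indicate the same striation pattern."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    best = -1.0
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            x, y = a[shift:], b[:len(b) - shift]
        else:
            x, y = a[:len(a) + shift], b[-shift:]
        n = min(len(x), len(y))
        best = max(best, float(np.mean(x[:n] * y[:n])))
    return best

# Two profiles from the same simulated tool (shifted copy plus noise) score high;
# a profile from an unrelated surface scores much lower.
rng = np.random.default_rng(0)
base = np.convolve(rng.normal(size=650), np.ones(8) / 8, mode="valid")
same_tool = base[20:520] + 0.1 * rng.normal(size=500)
other_tool = np.convolve(rng.normal(size=550), np.ones(8) / 8, mode="valid")[:500]
print(degree_of_association(base[:500], same_tool))
print(degree_of_association(base[:500], other_tool))
```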

  12. Comparison of Four Kinds of Nutritional Screening Tools in Clinical Application in General Surgery Patients

    Institute of Scientific and Technical Information of China (English)

    包学智; 田伟军

    2015-01-01

    Objective: To use NRS-2002, SGA, MNA and NRI to screen the nutritional status of patients admitted to a general surgery department, to compare the applicability and consistency of the four tools, and to analyse the relationship between the screening results of each tool and clinical outcomes. Methods: 125 patients admitted to the general surgery department of the General Hospital of Tianjin Medical University from June to September 2014 were selected by consecutive sampling. Nutritional risk was screened and evaluated with NRS-2002, SGA, MNA and NRI on the second day of admission, and the relationship between the screening results of each tool and clinical outcomes was compared. Results: The applicability of NRS-2002, SGA and MNA was 91.91% and that of NRI was 91.18%; the four methods were consistent in their evaluation of patients' nutritional risk. Taking BMI ≤ 18.5 or ALB ≤ 30 g/L as a criterion of malnutrition, the consistency of the four tools with this criterion was poor. The screening results of NRS-2002 and SGA were most closely related to clinical outcomes. Conclusion: All four nutritional screening tools are applicable to screening for undernutrition in general surgery patients; the screening results of NRS-2002 and SGA are most closely related to clinical outcomes.

  13. Radio-selected Galaxies in Very Rich Clusters at z < 0.25 I. Multi-wavelength Observations and Data Reduction Techniques

    CERN Document Server

    Morrison, G E; Ledlow, M J; Keel, W C; Hill, J M; Voges, W; Herter, T L

    2002-01-01

    Radio observations were used to detect the `active' galaxy population within rich clusters of galaxies in a non-biased manner that is not plagued by dust extinction or the K-correction. We present wide-field radio, optical (imaging and spectroscopy), and ROSAT All-Sky Survey (RASS) X-ray data for a sample of 30 very rich Abell (R > 2) clusters with z < 0.25. We detect the radio-luminous (L > 2E22 W/Hz) galaxy population within these extremely rich clusters for galaxies with M_R ... star-forming (SFR > 5 M_sun/yr) and active galactic nuclei (AGN) populations contained within each cluster. Archival and newly acquired redshifts were used to verify cluster membership for most (~95%) of the optical identifications. Thus we can identify all the starbursting galaxies within these clusters, regardless of the level of dust obscuration that would affect these galaxies being identified from their optical signature. Cluster sample selection, observations, and data reduction techniques for all wavelengths are discussed.

  14. GEKATOO: A General Knowledge Acquisition Tool

    Science.gov (United States)

    1989-03-01

    ... basic competences needed to cope with the corresponding problems. Competences are defined in terms of action schemata at different levels of abstraction.

  15. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools made a significant contribution to the great progress in development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available. The users have to choose the best suitable for their application. Here a simple rule applies: The best available simulation tool is the tool the user is already used to (provided, it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved—even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  16. The General Comments on HIV adopted by the African Commission on Human and Peoples' Rights as a tool to advance the sexual and reproductive rights of women in Africa.

    Science.gov (United States)

    Durojaye, Ebenezer

    2014-12-01

    The present article examines the contents and importance of the General Comments adopted by the African Commission on Human and Peoples' Rights on Article 14 (1) (d) and (e) of the Protocol to the African Charter on the Rights of Women in Africa as a tool for advancing women's rights in the context of HIV. Given that discriminatory practices in all facets of life have continued to limit African women's enjoyment of their sexual and reproductive rights and render them susceptible to HIV infection, it becomes vital that African governments adopt appropriate measures to address this challenge. The provisions of the Protocol on the Rights of Women in Africa present great opportunities for this to be realized. The radical and progressive provisions of the Protocol will be of no use to women unless policymakers and other stakeholders have a clear understanding of them and are able to implement them effectively. The adoption of the General Comments is a welcome development, and states and civil society groups must maximize it to advance women's rights.

  17. Exploring students' perceptions on the use of significant event analysis, as part of a portfolio assessment process in general practice, as a tool for learning how to use reflection in learning

    Directory of Open Access Journals (Sweden)

    Houston Helen

    2007-03-01

    Full Text Available Abstract Background Portfolio learning enables students to collect evidence of their learning. Component tasks making up a portfolio can be devised that relate directly to intended learning outcomes. Reflective tasks can stimulate students to recognise their own learning needs. Assessment of portfolios using a rating scale relating to intended learning outcomes offers high content validity. This study evaluated a reflective portfolio used during a final-year attachment in general practice (family medicine). Students were asked to evaluate the portfolio (which used significant event analysis as a basis for reflection) as a learning tool. The validity and reliability of the portfolio as an assessment tool were also measured. Methods 81 final-year medical students completed reflective significant event analyses as part of a portfolio created during a three-week attachment (clerkship) in general practice (family medicine). As well as two reflective significant event analyses, each portfolio contained an audit and a health needs assessment. Portfolios were marked three times: by the student's GP teacher, the course organiser and by another teacher in the university department of general practice. Inter-rater reliability between pairs of markers was calculated. A questionnaire enabled the students' experience of portfolio learning to be determined. Results Benefits to learning from reflective learning were limited. Students said that they thought more about the patients they wrote up in significant event analyses, but information as to the nature and effect of this was not forthcoming. Moderate inter-rater reliability (Spearman's Rho 0.65) was found between pairs of departmental raters dealing with larger numbers (20–60) of portfolios. Inter-rater reliability of marking involving GP tutors who only marked 1–3 portfolios was very low. Students rated highly their mentoring relationship with their GP teacher but found the portfolio tasks time ...

  18. 29 CFR 1918.69 - Tools.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Tools. 1918.69 Section 1918.69 Labor Regulations Relating... § 1918.69 Tools. (a) General. Employers shall not issue or permit the use of visibly unsafe tools. (b) Portable electric tools. (1) Portable hand-held electric tools shall be equipped with switches of a...

  19. Tools of radio astronomy

    CERN Document Server

    Wilson, Thomas L; Hüttemeister, Susanne

    2013-01-01

    This 6th edition of “Tools of Radio Astronomy”, the most used introductory text in radio astronomy, has been revised to reflect the current state of this important branch of astronomy. This includes the use of satellites, low radio frequencies, the millimeter/sub-mm universe, the Cosmic Microwave Background and the increased importance of mm/sub-mm dust emission. Several derivations and presentations of technical aspects of radio astronomy and receivers, such as receiver noise, the Hertz dipole and  beam forming have been updated, expanded, re-worked or complemented by alternative derivations. These reflect advances in technology. The wider bandwidths of the Jansky-VLA and long wave arrays such as LOFAR and mm/sub-mm arrays such as ALMA required an expansion of the discussion of interferometers and aperture synthesis. Developments in data reduction algorithms have been included. As a result of the large amount of data collected in the past 20 years, the discussion of solar system radio astronomy, dust em...

  20. General Motors Goes Metric

    Science.gov (United States)

    Webb, Ted

    1976-01-01

    Describes the program to convert to the metric system all of General Motors Corporation products. Steps include establishing policy regarding employee-owned tools, setting up training plans, and making arrangements with suppliers. (MF)

  1. Science in General Education

    Science.gov (United States)

    Read, Andrew F.

    2013-01-01

    General education must develop in students an appreciation of the power of science, how it works, why it is an effective knowledge generation tool, and what it can deliver. Knowing what science has discovered is desirable but less important.

  2. CRIRES-POP: a library of high resolution spectra from 1 to 5 microns II. Data reduction and the spectrum of the K giant 10 Leo

    CERN Document Server

    Nicholls, C P; Smette, A; Wolff, B; Hartman, H; Käufl, H -U; Przybilla, N; Ramsay, S; Uttenthaler, S; Wahlgren, G M; Bagnulo, S; Hussain, G A J; Nieva, M -F; Seeman, U; Seifahrt, A

    2016-01-01

    Context. High resolution stellar spectral atlases are valuable resources to astronomy. They are rare in the $1 - 5\\,\\mu$m region for historical reasons, but once available, high resolution atlases in this part of the spectrum will aid the study of a wide range of astrophysical phenomena. Aims. The aim of the CRIRES-POP project is to produce a high resolution near-infrared spectral library of stars across the H-R diagram. The aim of this paper is to present the fully reduced spectrum of the K giant 10 Leo that will form the basis of the first atlas within the CRIRES-POP library, to provide a full description of the data reduction processes involved, and to provide an update on the CRIRES-POP project. Methods. All CRIRES-POP targets were observed with almost 200 different observational settings of CRIRES on the ESO Very Large Telescope, resulting in a basically complete coverage of its spectral range as accessible from the ground. We reduced the spectra of 10 Leo with the CRIRES pipeline, corrected the waveleng...

  3. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements...

  4. A survey of parallel programming tools

    Science.gov (United States)

    Cheng, Doreen Y.

    1991-01-01

    This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with current and future needs of the Numerical Aerodynamic Simulator (NAS) in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.

  5. CRIRES-POP: a library of high resolution spectra in the near-infrared. II. Data reduction and the spectrum of the K giant 10 Leonis

    Science.gov (United States)

    Nicholls, C. P.; Lebzelter, T.; Smette, A.; Wolff, B.; Hartman, H.; Käufl, H.-U.; Przybilla, N.; Ramsay, S.; Uttenthaler, S.; Wahlgren, G. M.; Bagnulo, S.; Hussain, G. A. J.; Nieva, M.-F.; Seemann, U.; Seifahrt, A.

    2017-02-01

    Context. High resolution stellar spectral atlases are valuable resources to astronomy. They are rare in the 1-5 μm region for historical reasons, but once available, high resolution atlases in this part of the spectrum will aid the study of a wide range of astrophysical phenomena. Aims: The aim of the CRIRES-POP project is to produce a high resolution near-infrared spectral library of stars across the H-R diagram. The aim of this paper is to present the fully reduced spectrum of the K giant 10 Leo that will form the basis of the first atlas within the CRIRES-POP library, to provide a full description of the data reduction processes involved, and to provide an update on the CRIRES-POP project. Methods: All CRIRES-POP targets were observed with almost 200 different observational settings of CRIRES on the ESO Very Large Telescope, resulting in a basically complete coverage of its spectral range as accessible from the ground. We reduced the spectra of 10 Leo with the CRIRES pipeline, corrected the wavelength solution and removed telluric absorption with Molecfit, then resampled the spectra to a common wavelength scale, shifted them to rest wavelengths, flux normalised, and median combined them into one final data product. Results: We present the fully reduced, high resolution, near-infrared spectrum of 10 Leo. This is also the first complete spectrum from the CRIRES instrument. The spectrum is available online. Conclusions: The first CRIRES-POP spectrum has exceeded our quality expectations and will form the centre of a state-of-the-art stellar atlas. This first CRIRES-POP atlas will soon be available, and further atlases will follow. All CRIRES-POP data products will be freely and publicly available online. The spectrum is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A79

  6. Geometric reasoning about assembly tools

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, R.H.

    1997-01-01

    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
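
    The use-volume idea above lends itself to a very small computational sketch. The Python fragment below is illustrative only: the names ToolApplication, overlaps and tool_applicable are hypothetical, and the use volume is simplified to an axis-aligned box, whereas the paper's framework handles general geometry and placement constraints. It shows how checking tool applicability in a given assembly state reduces to a free-space query against the parts already placed.

    # Minimal sketch (not the paper's implementation): a tool's "use volume" is
    # approximated by an axis-aligned box that must be free of parts, plus a
    # placement constraint tying the box to the part the tool acts on.
    from dataclasses import dataclass
    from typing import Tuple

    Box = Tuple[float, float, float, float, float, float]  # xmin, ymin, zmin, xmax, ymax, zmax

    def overlaps(a: Box, b: Box) -> bool:
        """True if two axis-aligned boxes intersect."""
        return all(a[i] < b[i + 3] and b[i] < a[i + 3] for i in range(3))

    @dataclass
    class ToolApplication:
        use_volume: Box          # space that must be free to apply the tool
        target_part: str         # part the tool acts on (anchor of the placement constraint)

    def tool_applicable(app: ToolApplication, assembly_state: dict) -> bool:
        """Check whether the tool can be applied in a given assembly state:
        its use volume must not collide with any part already placed,
        except the target part itself."""
        for name, part_box in assembly_state.items():
            if name == app.target_part:
                continue
            if overlaps(app.use_volume, part_box):
                return False
        return True

    # Example: a screwdriver needs a clear column above the screw it drives.
    state = {"base": (0, 0, 0, 10, 10, 2), "cover": (0, 0, 2, 10, 10, 3)}
    screwdriver = ToolApplication(use_volume=(4, 4, 3, 6, 6, 12), target_part="cover")
    print(tool_applicable(screwdriver, state))  # True: nothing blocks the access column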

  7. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle, that can be used in such applications as scheduling, resource allocation project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage and professional services.

  8. The ALMA Observation Support Tool

    CERN Document Server

    Heywood, I; Williams, C J

    2011-01-01

    The ALMA Observation Support Tool (OST) is an ALMA simulator which is interacted with solely via a standard web browser. It is aimed at users who may or may not be experts in interferometry, or those that do not wish to familiarise themselves with the simulation components of a data reduction package. It has been designed to offer full imaging simulation capability for an arbitrary ALMA observation while maintaining the accessibility of other online tools such as the ALMA Sensitivity Calculator. Simulation jobs are defined by selecting and entering options on a standard web form. The user can specify the standard parameters that would need to be considered for an ALMA observation (e.g. pointing direction, frequency set up, duration), and there is also the option to upload arbitrary sky models in FITS format. Once submitted, jobs are sequentially processed by a remote server running a CASA-based back-end system. The user is notified by email when the job is complete, and directed to a standard web page which co...

  9. Mathematical tools

    Science.gov (United States)

    Capozziello, Salvatore; Faraoni, Valerio

    In this chapter we discuss certain mathematical tools which are used extensively in the following chapters. Some of these concepts and methods are part of the standard baggage taught in undergraduate and graduate courses, while others enter the tool-box of more advanced researchers. These mathematical methods are very useful in formulating ETGs and in finding analytical solutions. We begin by studying conformal transformations, which allow for different representations of scalar-tensor and f(R) theories of gravity, in addition to being useful in GR. We continue by discussing variational principles in GR, which are the basis for presenting ETGs in the following chapters. We close the chapter with a discussion of Noether symmetries, which are used elsewhere in this book to obtain analytical solutions.
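
    As a quick reminder of the first of these tools (a standard textbook relation, not a quotation from the chapter), a conformal transformation rescales the metric pointwise by a smooth positive function:

    % conformal rescaling by a smooth positive function \Omega(x)
    \tilde{g}_{\mu\nu} = \Omega^{2}(x)\, g_{\mu\nu}, \qquad
    \tilde{g}^{\mu\nu} = \Omega^{-2}(x)\, g^{\mu\nu}, \qquad
    \sqrt{-\tilde{g}} = \Omega^{4}(x)\,\sqrt{-g} \quad \text{(in four spacetime dimensions)},

    which is what allows scalar-tensor and f(R) theories of gravity to be recast in different conformal frames.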

  10. General aviation air traffic pattern safety analysis

    Science.gov (United States)

    Parker, L. C.

    1973-01-01

    A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage retrieval and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that pattern procedures observed can affect the ability of pilots to see and avoid each other.

  11. Udder Hygiene Analysis tool

    OpenAIRE

    2013-01-01

    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to increase udder health in dairy herds. This is achieved by changing management aspects related to hygiene. The report firstly provides general information about antibiotics and the processes that influence udder health. Secondly, six subjects related to udder health are described. Thirdly, the tools (checklists and roadmap) are shown and, fourthly, the advice written by UH...

  12. General Dentist

    Science.gov (United States)

    ... Some general dentists work in government health services, research programs, higher education, corporations and even the military. What kind of procedures do general dentists provide? Many general dentists are ...

  13. Process for selecting engineering tools : applied to selecting a SysML tool.

    Energy Technology Data Exchange (ETDEWEB)

    De Spain, Mark J.; Post, Debra S. (Sandia National Laboratories, Livermore, CA); Taylor, Jeffrey L.; De Jong, Kent

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and users could use the process to select most engineering tools and software applications.

  15. C 3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs

    Science.gov (United States)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-02-01

    Modern Astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the era of the petabyte scale. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C 3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing the maximum flexibility to the end-user, in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data, extracted from public surveys, we discuss the cross-matching capabilities and computing-time efficiency, also through a direct comparison with some publicly available tools, chosen among the most used within the community, and representative of different interface paradigms. We verified that the C 3 tool has excellent capabilities to perform an efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C 3 competitive in the context of public astronomical tools.
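
    C 3 itself is a stand-alone multi-core application, but the basic operation it accelerates, nearest-neighbour positional matching within a radius, can be sketched in a few lines with astropy. This is a single-threaded toy example with made-up coordinates, not the C 3 code.

    # Minimal positional cross-match sketch using astropy (not the C 3 code itself):
    # match each source in catalogue A to its nearest neighbour in catalogue B
    # and keep pairs closer than a chosen matching radius.
    import numpy as np
    import astropy.units as u
    from astropy.coordinates import SkyCoord

    ra_a, dec_a = np.array([10.68, 56.75]), np.array([41.27, 24.12])      # degrees (toy data)
    ra_b, dec_b = np.array([10.6847, 83.82]), np.array([41.2692, -5.39])

    cat_a = SkyCoord(ra=ra_a * u.deg, dec=dec_a * u.deg)
    cat_b = SkyCoord(ra=ra_b * u.deg, dec=dec_b * u.deg)

    idx, sep2d, _ = cat_a.match_to_catalog_sky(cat_b)   # nearest neighbour in B for each A source
    radius = 20 * u.arcsec
    matched = sep2d < radius
    for i in np.where(matched)[0]:
        print(f"A[{i}] -> B[{idx[i]}]  separation = {sep2d[i].to(u.arcsec):.2f}")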

  16. Tool Gear: Infrastructure for Parallel Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  17. Tool for Military Logistics Planning of Peace Support Operations: The OTAS Planning Tool

    NARCIS (Netherlands)

    Merrienboer, S.A. van

    1998-01-01

    Within the research group Operations Research Studies Army of the TNO Physics and Electronics Laboratory the OTAS planning tool is developed for the Royal Netherlands Armed Forces. This paper gives a general and brief description of the tool.

  18. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

    RSP Tooling{trademark} is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  20. The Proteogenomic Mapping Tool

    Directory of Open Access Journals (Sweden)

    Dandass Yoginder S

    2011-04-01

    Full Text Available Abstract Background High-throughput mass spectrometry (MS) proteomics data is increasingly being used to complement traditional structural genome annotation methods. To keep pace with the high speed of experimental data generation and to aid in structural genome annotation, experimentally observed peptides need to be mapped back to their source genome location quickly and exactly. Previously, the tools to do this have been limited to custom scripts designed by individual research groups to analyze their own data, which are generally not widely available and do not scale well with large eukaryotic genomes. Results The Proteogenomic Mapping Tool includes a Java implementation of the Aho-Corasick string searching algorithm which takes as input standardized file types and rapidly searches experimentally observed peptides against a given genome translated in all 6 reading frames for exact matches. The Java implementation allows the application to scale well with larger eukaryotic genomes while providing cross-platform functionality. Conclusions The Proteogenomic Mapping Tool provides a standalone application for mapping peptides back to their source genome on a number of operating system platforms with standard desktop computer hardware and executes very rapidly for a variety of datasets. Allowing the selection of different genetic codes for different organisms allows researchers to easily customize the tool to their own research interests and is recommended for anyone working to structurally annotate genomes using MS derived proteomics data.
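
    The core idea, translating the genome in all six reading frames and locating observed peptides exactly, can be sketched in a few lines of Python. This is illustrative only: the real tool uses the Aho-Corasick automaton to search many peptides simultaneously, whereas the toy code below uses plain substring search and assumes Biopython is available for translation.

    # Simplified sketch of the core idea behind the Proteogenomic Mapping Tool:
    # six-frame translation plus exact peptide matching.
    from Bio.Seq import Seq   # assumes Biopython is installed

    def six_frame_translations(dna: str):
        """Yield (frame_label, protein_string) for all six reading frames."""
        seq = Seq(dna)
        rev = seq.reverse_complement()
        for offset in range(3):
            for label, s in ((f"+{offset + 1}", seq), (f"-{offset + 1}", rev)):
                sub = s[offset:]
                sub = sub[: len(sub) - len(sub) % 3]     # trim any partial codon
                yield label, str(sub.translate())

    def map_peptides(dna: str, peptides):
        """Return exact hits as (peptide, frame, position-in-translated-frame) tuples."""
        hits = []
        for frame, protein in six_frame_translations(dna):
            for pep in peptides:
                pos = protein.find(pep)
                while pos != -1:
                    hits.append((pep, frame, pos))
                    pos = protein.find(pep, pos + 1)
        return hits

    genome_fragment = "ATGGCTGCTAAAGGTTTCGAA" * 3        # toy sequence (codes for MAAKGFE three times)
    print(map_peptides(genome_fragment, ["AAKGFE", "MAAK"]))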

  1. Fluid blade disablement tool

    Energy Technology Data Exchange (ETDEWEB)

    Jakaboski, Juan-Carlos [Albuquerque, NM; Hughs, Chance G [Albuquerque, NM; Todd, Steven N [Rio Rancho, NM

    2012-01-10

    A fluid blade disablement (FBD) tool that forms both a focused fluid projectile that resembles a blade, which can provide precision penetration of a barrier wall, and a broad fluid projectile that functions substantially like a hammer, which can produce general disruption of structures behind the barrier wall. Embodiments of the FBD tool comprise a container capable of holding fluid, an explosive assembly which is positioned within the container and which comprises an explosive holder and explosive, and a means for detonating. The container has a concavity on the side adjacent to the exposed surface of the explosive. The position of the concavity relative to the explosive and its construction of materials with thicknesses that facilitate inversion and/or rupture of the concavity wall enable the formation of a sharp and coherent blade of fluid advancing ahead of the detonation gases.

  2. Case and Administrative Support Tools

    Science.gov (United States)

    Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool used to help reduce office administrative labor costs while increasing employee effectiveness. CAST supports business functions which rely on and store Privacy Act sensitive data (PII). Specific business processes included in CAST (and respective PII) are: -Civil Rights Cast Tracking (name, partial medical history, summary of case, and case correspondence). -Employment Law Case Tracking (name, summary of case). -Federal Tort Claims Act Incident Tracking (name, summary of incidents). -Ethics Program Support Tools and Tracking (name, partial financial history). -Summer Honors Application Tracking (name, home address, telephone number, employment history). -Workforce Flexibility Initiative Support Tools (name, alternative workplace phone number). -Resource and Personnel Management Support Tools (name, partial employment and financial history).

  3. Methods to improve computer-assisted seismic interpretation using seismic attributes: Multiattribute display, spectral data reduction, and attributes to quantify structural deformation and velocity anisotropy

    Science.gov (United States)

    Guo, Hao

    Computer-assisted seismic interpretation gained such widespread acceptance in the mid-1980s that no 3D survey and few 2D surveys are interpreted without the aid of an interpretation workstation. Geoscientists routinely quantify features of geologic interest and enhance their interpretation through the use of seismic attributes. Typically these attributes are examined sequentially, or within different interpretation windows. In this dissertation, I present two novel means of presenting the information content of multiple attributes by a single image. In the first approach, I show how two, three, or four attributes can be displayed by an appropriate use of color. I use a colorstack model of Red, Green, and Blue (RGB) to map attributes of similar type such as volumes of near-, mid-, and far-angle amplitude or low-, moderate-, high-frequency spectral components. I use an HLS model to display a theme attribute modulated by another secondary attribute, such as dip magnitude modulating dip azimuth, or amplitude of the peak spectral frequency modulating the phase measured at the peak frequency. Transparency/opacity provides a 4th color dimension and provides additional attribute modulation capabilities. In the second approach I use principal component analysis to reduce the multiplicity of redundant data into a smaller, more manageable number of components. The importance of each principal component is proportional to its corresponding eigenvalue. By mapping the three largest principal components against red, green, and blue, we can represent more than 80% of the original information with a single colored image. I then use these tools to help quantify and correlate structural deformation with velocity anisotropy. I develop an innovative algorithm that automatically counts the azimuth distribution of the fast P-wave velocity (or alternatively, the strike of the structural lineaments) weighted by the amount of anisotropy (or the intensity of the lineaments) at any point in the ...
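
    The spectral data-reduction step described above can be sketched with a few lines of numpy. This is a toy example on synthetic attributes, not the dissertation's code; the function and variable names are hypothetical.

    # Sketch of spectral data reduction via principal component analysis (PCA),
    # with the three largest components mapped to an RGB image.
    import numpy as np

    def pca_rgb(cube):
        """cube: (n_traces, n_samples, n_attributes) array of seismic attributes.
        Returns an (n_traces, n_samples, 3) RGB array built from the first
        three principal components, each normalised to [0, 1]."""
        nt, ns, na = cube.shape
        X = cube.reshape(-1, na)
        X = X - X.mean(axis=0)                       # centre each attribute
        _, s, vt = np.linalg.svd(X, full_matrices=False)   # principal axes via SVD
        explained = (s ** 2) / np.sum(s ** 2)
        scores = X @ vt[:3].T                        # project onto the 3 largest PCs
        lo, hi = scores.min(axis=0), scores.max(axis=0)
        rgb = (scores - lo) / (hi - lo + 1e-12)      # normalise each colour channel
        print(f"variance captured by 3 PCs: {explained[:3].sum():.1%}")
        return rgb.reshape(nt, ns, 3)

    rng = np.random.default_rng(0)
    attributes = rng.normal(size=(50, 200, 8))       # e.g. 8 spectral components (synthetic)
    image = pca_rgb(attributes)
    print(image.shape)                               # (50, 200, 3)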

  4. Meeting Generalized Others

    DEFF Research Database (Denmark)

    Strøbæk, Pernille Solveig; Willert, Søren

    2014-01-01

    Following George Herbert Mead, we contend that work-related organizational behavior requires continued negotiation of meaning – using linguistic, behavioral, and social tools. The meaning structures of the Generalized Other(s) of a particular employing organization provide the regulatory framework ... with different pre-merger backgrounds. Our findings suggest that analyzing dominant material discourse themes may throw light on the way work teams define and regulate their social practice, and, hence, that such analyses may be useful tools for studying the social dynamics of materiality and agency.

  5. A Data Simulator Tool for NIRCam

    Science.gov (United States)

    Hilbert, Bryan; Canipe, Alicia Michelle; Robberto, Massimo; NIRCam Team at STScI

    2017-06-01

    We present a new data simulator tool capable of producing high fidelity simulated data for NIRCam. This simulator produces “raw” multiaccum integrations, each composed of multiple non-destructive detector readouts. This is equivalent to data from real observations prior to the application of any calibration steps. Our primary motivation for creating this tool is to produce realistic data with which to test the JWST calibration pipeline steps, from basic calibration through the production of mosaic images and associated source catalogs. However, data created from this tool may also be useful to observers wishing to have example data to test custom data reduction or analysis software. The simulator begins with a real NIRCam dark current integration and adds synthetic astronomical sources. In this way, the simulated integration is guaranteed to contain all of the same noise characteristics and detector effects that will be present in real NIRCam observations. The output format of the simulated data is such that the files can be immediately run through the standard JWST calibration pipelines. Currently the tool supports the creation of NIRCam imaging and dispersed (wide field slitless) observations, including moving target (non-sidereal tracking) and time series observation data.
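
    The simulator's basic principle, injecting idealised synthetic sources on top of a real dark-current ramp so that the noise properties come from the detector itself, can be illustrated with a toy numpy sketch. All numbers below are invented for the example; this is not the STScI code.

    # Toy illustration: start from a dark-current ramp (groups of non-destructive
    # reads) and add signal from a synthetic point source that accumulates
    # linearly up the ramp, so detector noise properties are inherited.
    import numpy as np

    rng = np.random.default_rng(1)
    ngroups, ny, nx = 10, 64, 64
    dark_ramp = np.cumsum(rng.poisson(0.02, size=(ngroups, ny, nx)), axis=0).astype(float)

    def add_point_source(ramp, x0, y0, flux_per_group, fwhm=2.0):
        """Add a Gaussian point source whose signal grows with accumulated exposure."""
        ngroups, ny, nx = ramp.shape
        y, x = np.mgrid[0:ny, 0:nx]
        sigma = fwhm / 2.355
        psf = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
        psf *= flux_per_group / psf.sum()
        for g in range(ngroups):
            ramp[g] += psf * (g + 1)          # counts accumulate from read to read
        return ramp

    simulated = add_point_source(dark_ramp.copy(), x0=32.0, y0=20.0, flux_per_group=500.0)
    print(simulated.shape)                    # (10, 64, 64): a raw-like multiaccum ramp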

  6. Downhole tool with replaceable tool sleeve sections

    Energy Technology Data Exchange (ETDEWEB)

    Case, W. A.

    1985-10-29

    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  7. Generalized product

    OpenAIRE

    Greco, Salvatore; Mesiar, Radko; Rindone, Fabio

    2014-01-01

    Aggregation functions on [0,1] with annihilator 0 can be seen as a generalized product on [0,1]. We study the generalized product on the bipolar scale [–1,1], stressing the axiomatic point of view. Based on newly introduced bipolar properties, such as the bipolar increasingness, bipolar unit element, bipolar idempotent element, several kinds of generalized bipolar product are introduced and studied. A special stress is put on bipolar semicopulas, bipolar quasi-copulas and bipolar copulas.

  9. REPLICATION TOOL AND METHOD OF PROVIDING A REPLICATION TOOL

    DEFF Research Database (Denmark)

    2016-01-01

    The invention relates to a replication tool (1, 1a, 1b) for producing a part (4) with a microscale textured replica surface (5a, 5b, 5c, 5d). The replication tool (1, 1a, 1b) comprises a tool surface (2a, 2b) defining a general shape of the item. The tool surface (2a, 2b) comprises a microscale structured master surface (3a, 3b, 3c, 3d) having a lateral master pattern and a vertical master profile. The microscale structured master surface (3a, 3b, 3c, 3d) has been provided by localized pulsed laser treatment to generate microscale phase explosions. A method for producing a part with microscale energy directors on flange portions thereof uses the replication tool (1, 1a, 1b) to form an item (4) with a general shape as defined by the tool surface (2a, 2b). The formed item (4) comprises a microscale textured replica surface (5a, 5b, 5c, 5d) with a lateral arrangement of polydisperse microscale ...

  10. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    Science.gov (United States)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to and communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation.
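
    The data-reduction step can be pictured with a small single-machine sketch of the same map/reduce-by-key pattern. This is plain Python rather than actual Hadoop or Spark code, and the cell summary used here (mean with standard deviation and sample count) is only one possible uncertainty proxy.

    # Minimal sketch: bin sparse point data onto grid cells and keep, for each cell,
    # both the aggregated value and an uncertainty proxy.  In the real system this
    # runs as a MapReduce job on Hadoop; here it is a plain reduce-by-key.
    from collections import defaultdict
    import math

    def reduce_to_grid(points, cell_size):
        """points: iterable of (x, y, value).  Returns {cell: (mean, stddev, n)}."""
        cells = defaultdict(list)
        for x, y, v in points:                                   # "map": assign each point to a cell key
            key = (int(x // cell_size), int(y // cell_size))
            cells[key].append(v)
        summary = {}
        for key, vals in cells.items():                          # "reduce": aggregate per cell
            n = len(vals)
            mean = sum(vals) / n
            var = sum((v - mean) ** 2 for v in vals) / n
            summary[key] = (mean, math.sqrt(var), n)             # stddev and n convey uncertainty
        return summary

    samples = [(0.2, 0.3, 5.0), (0.8, 0.1, 7.0), (3.4, 2.2, 2.0)]
    print(reduce_to_grid(samples, cell_size=1.0))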

  11. Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  12. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...... of a new sustainable settlement. The use of design tools is discussed in relation to innovation and stakeholder participation, and it is stressed that the usefulness of design tools is context dependent....

  13. General Conformity

    Science.gov (United States)

    The General Conformity requirements ensure that the actions taken by federal agencies in nonattainment and maintenance areas do not interfere with a state’s plans to meet national standards for air quality.

  14. General Anesthesia

    Science.gov (United States)

    ... unconscious and unable to feel pain during medical procedures. General anesthesia usually uses a combination of intravenous drugs ...

  15. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  16. 29 CFR 1926.300 - General requirements.

    Science.gov (United States)

    2010-07-01

    ... Institute, B15.1-1953 (R1958), Safety Code for Mechanical Power-Transmission Apparatus. (3) Types of... (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Tools-Hand and Power § 1926.300 General requirements. (a) Condition of tools. All hand and power tools and similar equipment, whether furnished by...

  17. Data Reduction Method for Categorical Data Clustering

    OpenAIRE

    Sánchez Garreta, José Salvador; Rendón, Eréndira; García, Rene A.; Abundez, Itzel; Gutiérrez, Citlalih; Gasca, Eduardo

    2008-01-01

    Categorical data clustering constitutes an important part of data mining; its relevance has recently drawn attention from several researchers. As a step in data mining, however, clustering encounters the problem of the large amount of data to be processed. This article offers a solution for categorical clustering algorithms when working with high volumes of data by means of a method that summarizes the database. This is done using a structure called CM-tree. In order to test our metho...

  18. AMPERE Science Data Reduction and Processing

    Science.gov (United States)

    Korth, H.; Dyrud, L.; Anderson, B.; Waters, C. L.; Barnes, R. J.

    2010-12-01

    The Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) uses the constellation of Iridium Communications satellites in 780-km-altitude, circular, near-polar orbits to monitor the electro-dynamic coupling of the ionosphere to the surrounding space environment in real time. The constellation consists of 66 satellites plus on-orbit spares, and each satellite carries a magnetometer for attitude determination. The magnetometer data are continuously sent from Iridium Satellite Network Operations Center to the AMPERE Science Data Center, where they are processed to extract the magnetic perturbation signatures associated with the Birkeland currents. This is accomplished by first merging real-time telemetry packets from each satellite into time-ordered sets of records, formatting and compiling a database. Subsequent processing automatically evaluates baselines, inter-calibrates magnetic field data between satellites, and quantifies the magnetic field residuals with the goal to reduce errors to the 30-nT digitization resolution of the magnetometers. The magnetic field residuals are then used to rank the quality of the data from the individual satellites and weight the data in subsequent science processing. Because magnetic fields generated by the Birkeland currents represent typically less than one percent of the total magnetic field, numerous challenges must be overcome to derive reliable magnetic perturbation signals. For example, corrections to the IGRF magnetic field model must be applied and adverse effects due to missing data must be mitigated. In the final processing step the Birkeland currents are derived by applying Ampere's law to the spherical harmonic fit of the perturbation data. We present the processing methodology, discuss the sensitivity of the Birkeland currents on the accuracy of the derived magnetic perturbations, and show a preliminary analysis of the 3-5 August 2010 geomagnetic storm.

  19. Advanced Data Reduction Techniques for MUSE

    CERN Document Server

    Weilbacher, Peter M; Roth, Martin M; Boehm, Petra; Pecontal-Rousset, Arlette

    2009-01-01

    MUSE, a 2nd generation VLT instrument, will become the world's largest integral field spectrograph. It will be an AO assisted instrument which, in a single exposure, covers the wavelength range from 465 to 930 nm with an average resolution of 3000 over a field of view of 1'x1' with 0.2'' spatial sampling. Both the complexity and the rate of the data are a challenge for the data processing of this instrument. We will give an overview of the data processing scheme that has been designed for MUSE. Specifically, we will use only a single resampling step from the raw data to the reduced data product. This allows us to improve data quality, accurately propagate variance, and minimize spreading of artifacts and correlated noise. This approach necessitates changes to the standard way in which reduction steps like wavelength calibration and sky subtraction are carried out, but can be expanded to include combination of multiple exposures.
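
    The benefit of a single resampling step for variance propagation can be seen in a minimal numpy sketch (illustrative only, not the MUSE pipeline code): when independent samples are combined once with inverse-variance weights, the output variance is known exactly rather than being distorted by repeated resampling.

    # Inverse-variance weighted combination of overlapping samples, with exact
    # propagation of the variance of the combined value.
    import numpy as np

    def combine(values, variances):
        """Weighted mean of independent samples and its propagated variance."""
        values = np.asarray(values, dtype=float)
        variances = np.asarray(variances, dtype=float)
        weights = 1.0 / variances
        mean = np.sum(weights * values) / np.sum(weights)
        var = 1.0 / np.sum(weights)        # exact result for independent samples
        return mean, var

    flux, var = combine([1.02, 0.97, 1.10], [0.04, 0.02, 0.05])
    print(f"combined flux = {flux:.3f} +/- {np.sqrt(var):.3f}")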

  20. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy and perhaps will not be satisfiable at all since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
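
    The general mechanism, sending each distinct block of data only once and restoring the duplicates at the destination, can be sketched in a few lines of Python. This is illustrative only; Delta's actual block selection and encoding are more sophisticated.

    # Sketch of block-level deduplication: transmit each distinct block once and
    # reconstruct identical copies at the destination from block references.
    import hashlib

    BLOCK = 4096                                     # fixed block size for this sketch

    def encode(data: bytes):
        """Split data into blocks; return (unique_blocks, reference_stream)."""
        store, refs = {}, []
        for i in range(0, len(data), BLOCK):
            chunk = data[i:i + BLOCK]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in store:
                store[digest] = chunk                # each distinct block is kept/sent once
            refs.append(digest)                      # every occurrence becomes a cheap reference
        return store, refs

    def decode(store, refs) -> bytes:
        """Reconstruct the original byte stream at the destination."""
        return b"".join(store[d] for d in refs)

    # Highly repetitive simulation output: one padded record per block, repeated 500 times.
    record = b"timestep=0011 rho=1.000 p=2.500 ".ljust(BLOCK, b" ")
    payload = record * 500
    store, refs = encode(payload)
    assert decode(store, refs) == payload
    print(len(payload), "bytes reduced to", sum(len(c) for c in store.values()), "unique block bytes")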

  1. Fetal magnetocardiography: Methods for rapid data reduction

    Science.gov (United States)

    Mosher, John C.; Flynn, Edward R.; Quinn, A.; Weir, A.; Shahani, U.; Bain, R. J. P.; Maas, P.; Donaldson, G. B.

    1997-03-01

    Fetal magnetocardiography (fMCG) provides a unique method for noninvasive observations of the fetal heart. Electrical currents generated by excitable tissues within the fetal heart yield measurable external magnetic fields. Measurements are performed with superconducting quantum interference devices inductively coupled to magnetometer or gradiometer coils, and the resulting signals are converted to digital form in the data acquisition system. The measured fields are usually contaminated by fetal and maternal movements (usually respiration), other physiological fields such as skeletal muscle contraction, the maternal cardiac signal, and environmental electromagnetic fields. Sensitivity to relatively distant sources, both physiological and environmental, is substantially reduced by the use of magnetic gradiometers. Other contaminants may be removed by proper signal conditioning which may be automatically applied using "black box" algorithms that are transparent to the user and highly efficient. These procedures can rapidly reduce the complex signal plus noise waveforms to the desired fMCG with minimal operator interference.

  2. Data Reduction and Related Software for Photographic Observations of Sunspots in the Yunnan Observatories%云南天文台太阳黑子照相观测资料处理和相关软件

    Institute of Scientific and Technical Information of China (English)

    顾啸马; 刘艳霄; 叶惠莲; 林隽

    2015-01-01

    In this paper we present a data-reduction approach and related software for processing data of sunspots obtained from a photographic full-disk solar observation system established by us. When applied to data from our system, the approach and software yield fundamental data and parameter values of sunspots, which can serve as the first-hand data to be accumulated and used by solar physicists to study underlying mechanisms of solar activities. The results of our data reduction include the relative sunspot numbers, sunspot-group numbers, sunspot locations, sunspot circular areas, sunspot surface areas, sunspot total areas, sunspot classifications, and distances from sunspots (or sunspot groups) to the solar center. As part of our data reduction we have incorporated a calculation program, which makes it possible to rapidly process daily sunspot data and give values of the mentioned parameters. Our approach may thus completely change the traditional method of manually handling photographic sunspot data, and greatly improve the efficiency of sunspot observation and data processing. It is still necessary for an observer to have certain observational experience and technical skill to use our approach, especially in grouping sunspots and evaluating group numbers.

  3. Generalized polygons

    CERN Document Server

    Van Maldeghem, Hendrik

    1998-01-01

    Generalized Polygons is the first book to cover, in a coherent manner, the theory of polygons from scratch. In particular, it fills elementary gaps in the literature and gives an up-to-date account of current research in this area, including most proofs, which are often unified and streamlined in comparison to the versions generally known. Generalized Polygons will be welcomed both by the student seeking an introduction to the subject as well as the researcher who will value the work as a reference. In particular, it will be of great value for specialists working in the field of generalized polygons (which are, incidentally, the rank 2 Tits-buildings) or in fields directly related to Tits-buildings, incidence geometry and finite geometry. The approach taken in the book is of geometric nature, but algebraic results are included and proven (in a geometric way!). A noteworthy feature is that the book unifies and generalizes notions, definitions and results that exist for quadrangles, hexagons, octagons - in the ...

  4. Stable generalized complex structures

    CERN Document Server

    Cavalcanti, Gil R

    2015-01-01

    A stable generalized complex structure is one that is generically symplectic but degenerates along a real codimension two submanifold, where it defines a generalized Calabi-Yau structure. We introduce a Lie algebroid which allows us to view such structures as symplectic forms. This allows us to construct new examples of stable structures, and also to define period maps for their deformations in which the background three-form flux is either fixed or not, proving the unobstructedness of both deformation problems. We then use the same tools to establish local normal forms for the degeneracy locus and for Lagrangian branes. Applying our normal forms to the four-dimensional case, we prove that any compact stable generalized complex 4-manifold has a symplectic completion, in the sense that it can be modified near its degeneracy locus to produce a compact symplectic 4-manifold.

  5. Generalized Multidimensional Association Rules

    Institute of Scientific and Technical Information of China (English)

    周傲英; 周水庚; 金文; 田增平

    2000-01-01

    The problem of association rule mining has gained considerable prominence in the data mining community for its use as an important tool of knowledge discovery from large-scale databases. And there has been a spurt of research activities around this problem. Traditional association rule mining is limited to intra-transaction. Only recently the concept of N-dimensional inter-transaction association rule (NDITAR) was proposed by H.J. Lu. This paper modifies and extends Lu's definition of NDITAR based on the analysis of its limitations, and the generalized multidimensional association rule (GMDAR) is subsequently introduced, which is more general, flexible and reasonable than NDITAR.

  6. Generalized derivations and general relativity

    CERN Document Server

    Heller, M; Pysiak, L; Sasin, W

    2013-01-01

    We construct differential geometry (connection, curvature, etc.) based on generalized derivations of an algebra A. Such a derivation, introduced by Bresar in 1991, is given by a linear mapping u: A -> A such that there exists a usual derivation d of A satisfying the generalized Leibniz rule u(a b) = u(a) b + a d(b) for all a,b in A. The generalized geometry "is tested" in the case of the algebra of smooth functions on a manifold. We then apply this machinery to study the generalized general relativity. We define the Einstein-Hilbert action and deduce from it Einstein's field equations. We show that for a special class of metrics containing, besides the usual metric components, only one non-zero term, the action reduces to O'Hanlon action that is a Brans-Dicke action with potential and with the parameter \\omega equal to zero. We also show that the generalized Einstein equations (with zero energy-stress tensor) are equivalent to those of the Kaluza-Klein theory satisfying a "modified cylinder condition" and hav...

  7. GENERAL EQUILIBRIUM

    Directory of Open Access Journals (Sweden)

    Monique Florenzano

    2008-09-01

    Full Text Available General equilibrium is a central concept of economic theory. Unlike partial equilibrium analysis, which studies the equilibrium of a particular market under the “ceteris paribus” clause that revenues and prices in the other markets stay approximately unaffected, the ambition of a general equilibrium model is to analyze the simultaneous equilibrium in all markets of a competitive economy. The definition of the abstract model and some of its basic results and insights are presented. The important issues of uniqueness and local uniqueness of equilibrium are sketched; they are the condition for the predictive power of the theory and its ability to allow for comparative-statics exercises. Finally, we review the main extensions of the general equilibrium model. Besides the natural extensions to infinitely many commodities and to a continuum of agents, some examples show how economic theory can accommodate the main ideas in order to study some contexts which were not thought of by the initial model.

  8. General Relativity

    CERN Document Server

    Canuto, V

    2015-01-01

    This is an English translation of the Italian version of an encyclopedia chapter that appeared in the Italian Encyclopedia of the Physical Sciences, edited by Bruno Bertotti (1994). Following requests from colleagues we have decided to make it available to a more general readership. We present the motivation for constructing General Relativity, provide a short discussion of tensor algebra, and follow the setup of the Einstein equations. We discuss briefly the initial value problem, the linear approximation, and how non-gravitational physics should be described in curved spacetime.

  9. Tool Changer For Robot

    Science.gov (United States)

    Voellmer, George M.

    1992-01-01

    Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.

  10. Applying generalized linear models as an explanatory tool of sex steroids, thyroid hormones and their relationships with environmental and physiologic factors in immature East Pacific green sea turtles (Chelonia mydas).

    Science.gov (United States)

    Labrada-Martagón, Vanessa; Méndez-Rodríguez, Lia C; Mangel, Marc; Zenteno-Savín, Tania

    2013-09-01

    Generalized linear models were fitted to evaluate the relationship between 17β-estradiol (E2), testosterone (T) and thyroxine (T4) levels in immature East Pacific green sea turtles (Chelonia mydas) and their body condition, size, mass, blood biochemistry parameters, handling time, year, season and site of capture. According to external (tail size) and morphological (<77.3 cm straight carapace length) characteristics, 95% of the individuals were juveniles. Hormone levels, assessed on sea turtles subjected to a capture stress protocol, were <34.7 nmol T L(-1), <532.3 pmol E2 L(-1) and <43.8 nmol T4 L(-1). The statistical model explained biologically plausible metabolic relationships between hormone concentrations and blood biochemistry parameters (e.g. glucose, cholesterol) and the potential effect of environmental variables (season and study site). The variables handling time and year did not contribute significantly to explaining hormone levels. Differences in sex steroids between seasons and study sites found by the models coincided with specific nutritional, physiological and body condition differences related to the specific habitat conditions. The models correctly predicted the median levels of the measured hormones in green sea turtles, which confirms the fitted models' utility. It is suggested that quantitative predictions could be possible when the model is tested with additional data.
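
    As a purely illustrative sketch of the kind of generalized linear model described above (not the authors' analysis; all variable names and data below are synthetic assumptions), such a fit could be set up in Python with statsmodels:

```python
# Illustrative GLM fit in the spirit of the study above; synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "glucose": rng.normal(5.0, 1.0, n),         # hypothetical blood glucose
    "cholesterol": rng.normal(4.0, 0.8, n),     # hypothetical cholesterol
    "season": rng.choice(["warm", "cold"], n),  # categorical covariate
    "site": rng.choice(["A", "B"], n),          # capture site
})
# Synthetic positive hormone response loosely tied to the covariates.
df["hormone"] = np.exp(0.2 * df["glucose"] + 0.1 * df["cholesterol"]
                       + rng.normal(0.0, 0.2, n))

# A Gamma family is a common choice for positive, right-skewed hormone data
# (default link used here to keep the sketch minimal).
model = smf.glm("hormone ~ glucose + cholesterol + C(season) + C(site)",
                data=df, family=sm.families.Gamma())
print(model.fit().summary())
```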

  11. FASTBUS simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Dean, T.D. (Stanford Linear Accelerator Center, Menlo Park, CA (United States)); Haney, M.J. (Illinois Univ., Urbana, IL (United States))

    1991-10-01

    A generalized model of a FASTBUS master is presented. The model is used with simulation tools to aid in the specification, design, and production of FASTBUS slave modules. The model provides a mechanism to interact with the electrical schematics and software models to predict performance. The model is written in the IEEE std 1076-1987 hardware description language VHDL. A model of the ATC logic is also presented. VHDL was chosen to provide portability to various platforms and simulation tools. The models, in conjunction with most commercially available simulators, will perform all of the transactions specified in IEEE std 960-1989. The models may be used to study the behavior of electrical schematics and other software models and detect violations of the FASTBUS protocol. For example, a hardware design of a slave module could be studied, protocol violations detected and corrected before committing money to prototype development. The master model accepts a stream of high level commands from an ASCII file to initiate FASTBUS transactions. The high level command language is based on the FASTBUS standard routines listed in IEEE std 1177-1989. Using this standard-based command language to direct the model of the master, hardware engineers can simulate FASTBUS transactions in the language used by physicists and programmers to operate FASTBUS systems. 15 refs., 6 figs.

  12. Route Availability Planning Tool -

    Data.gov (United States)

    Department of Transportation — The Route Availability Planning Tool (RAPT) is a weather-assimilated decision support tool (DST) that supports the development and execution of departure management...

  13. General Assembly

    CERN Multimedia

    Staff Association

    2016-01-01

    5th April, 2016 – Ordinary General Assembly of the Staff Association! In the first semester of each year, the Staff Association (SA) invites its members to attend and participate in the Ordinary General Assembly (OGA). This year the OGA will be held on Tuesday, April 5th 2016 from 11:00 to 12:00 in BE Auditorium, Meyrin (6-2-024). During the Ordinary General Assembly, the activity and financial reports of the SA are presented and submitted for approval to the members. This is the occasion to get a global view on the activities of the SA, its financial management, and an opportunity to express one’s opinion, including taking part in the votes. Other points are listed on the agenda, as proposed by the Staff Council. Who can vote? Only “ordinary” members (MPE) of the SA can vote. Associated members (MPA) of the SA and/or affiliated pensioners have a right to vote on those topics that are of direct interest to them. Who can give his/her opinion? The Ordinary General Asse...

  14. Generalized Parabolas

    Science.gov (United States)

    Joseph, Dan; Hartman, Gregory; Gibson, Caleb

    2011-01-01

    In this article we explore the consequences of modifying the common definition of a parabola by considering the locus of all points equidistant from a focus and (not necessarily linear) directrix. The resulting derived curves, which we call "generalized parabolas," are often quite beautiful and possess many interesting properties. We show that…

  15. General discussion

    NARCIS (Netherlands)

    Jagers op Akkerhuis, Gerard A.J.M.

    2016-01-01

    The general discussion focuses on some aspects that are of overarching relevance for all the preceding chapters. The first subject that is discussed is the relationship between systems theory and the philosophy of science. After a short summary of the principles of system science and the

  16. Generale preventie

    NARCIS (Netherlands)

    1949-01-01

    In part I of this study a survey has been given of what Dutch authors have written since 1870, when capital punishment was abolished, on subjects concerning the general preventive effect of punishment. This historical survey ends where, during the years 1940-1945, under the stress of the occupation

  17. 33 CFR 101.510 - Assessment tools.

    Science.gov (United States)

    2010-07-01

    ... MARITIME SECURITY: GENERAL Other Provisions § 101.510 Assessment tools. Ports, vessels, and facilities... may include: (a) DHS/TSA's vulnerability self-assessment tool located at http://www.tsa.gov/risk; and...); and (3) Navigation and Vessel Inspection Circular titled, “Security Guidelines for Facilities”, (NVIC...

  18. Program plan recognition for year 2000 tools

    NARCIS (Netherlands)

    Deursen, A. van; Woods, S.; Quilici, A.

    1997-01-01

    There are many commercial tools that address various aspects of the Year 2000 problem. None of these tools, however, makes any documented use of plan-based techniques for automated concept recovery. This implies a general perception that plan-based techniques are not useful for this problem. This pap

  19. [General psychotherapy].

    Science.gov (United States)

    Vymetal, J

    2003-01-01

    Contemporary theoretical psychotherapeutic thinking develops from eclectic practice and draws in particular on research into the effective factors of therapy. It can best be characterized as differentiated, synthetic, integrative and exceeding individual approaches. This development continues with attempts to create a general model of psychotherapy that could serve as a basis for models of special psychotherapies. The aim of such a model is to describe everything that is present as an important factor for inducing a desirable change in a human being across all psychotherapeutic approaches. Among general models we can mention the generic model of D. E. Orlinski and K. I. Howard, Grawe's cube (the author is K. Grawe) and the equation of psychotherapy.

  20. Generalized polygons

    CERN Document Server

    Maldeghem, Hendrik

    1998-01-01

    This book is intended to be an introduction to the fascinating theory of generalized polygons for both the graduate student and the specialized researcher in the field. It gathers together a lot of basic properties (some of which are usually referred to in research papers as belonging to folklore) and very recent and sometimes deep results. I have chosen a fairly strict geometrical approach, which requires some knowledge of basic projective geometry. Yet, it enables one to prove some typically group-theoretical results such as the determination of the automorphism groups of certain Moufang polygons. As such, some basic group-theoretical knowledge is required of the reader. The notion of a generalized polygon is a relatively recent one. But it is one of the most important concepts in incidence geometry. Generalized polygons are the building bricks of Tits buildings. They are the prototypes and precursors of more general geometries such as partial geometries, partial quadrangles, semi-partial geometries, near...

  1. Ootw Tool Requirements in Relation to JWARS

    Energy Technology Data Exchange (ETDEWEB)

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical modeling and simulation (M&S) tools, and which should be left for independent development.

  2. Generalized Lotka stability.

    Science.gov (United States)

    Smith, J D H; Zhang, C

    2015-08-01

    The recently developed macroscopic approach to demography describes the age distribution of mothers and the net maternity function for a given human population entirely in terms of five parameters. Tracking of these parameters provides a number of new tools for analyzing populations and predicting their future states. Within the macroscopic approach, the new concept of generalized Lotka stability is presented in this paper, as an extension of a strong version of classic Lotka stability. The two leading parameters of the macroscopic approach, the Malthusian parameter r and the perturbation s, are computed from population data and plotted in two-dimensional parameter space. Generalized Lotka stability is then defined in terms of the movement of the (r,s)-vector over time. It may be observed in a number of human populations at specific periods of their history.

  3. General Relativity

    CERN Document Server

    Khriplovich, I. B

    2005-01-01

    This book offers an alternative to other textbooks on the subject, providing a more specific discussion of numerous general relativistic effects for readers who have knowledge of classical mechanics and electrodynamics, including special relativity. Coverage includes gravitational lensing, signal retardation in the gravitational field of the Sun, the Reissner-Nordström solution, selected spin effects, the resonance transformation of an electromagnetic wave into a gravitational one, and the entropy and temperature of black holes. The book includes numerous problems at various levels of difficulty, making it ideal also for independent study by a broad readership of advanced students and researchers. I.B. Khriplovich is Chief Researcher, Budker Institute of Nuclear Physics, Novosibirsk, and Chair of Theoretical Physics at Novosibirsk University. Dr. Khriplovich is a Corresponding Member of the Russian Academy of Sciences. He has been awarded the Dirac Medal ``For the advancement of theoretical physics'' by Univ...

  4. [General anesthesia].

    Science.gov (United States)

    Feiss, P

    2001-04-30

    General anaesthesia is a reversible loss of consciousness induced and maintained with a hypnotic drug given either by venous injection and infusion, or by inhalation. A potent opioid is usually added to inhibit the transmission of pain and thus to lessen sympathetic and endocrine reactions to nociceptive stimuli. Myorelaxation is used to facilitate tracheal intubation and surgery. Whatever anaesthetic protocol is used, the patient and anaesthesia machine require close monitoring. In addition to vital signs, the depth of anaesthesia may be monitored using automated electroencephalographic analysis, and myorelaxation should always be monitored using a nerve stimulator, but pain or analgesia evaluation is only based on clinical signs of sympathetic stimulation. Because anaesthesia-related death and morbidity have decreased considerably, future improvements in outcome should concern perioperative comfort, i.e. prevention of cognitive disturbances, nausea, vomiting and pain.

  5. Fetal onset of general movements

    NARCIS (Netherlands)

    Luechinger, Annemarie B.; Hadders-Algra, Mijna; Van Kan, Colette M.; de Vries, JIP

    2008-01-01

    Perinatal qualitative assessment of general movements (GMs) is a tool to evaluate the integrity of the young nervous system. The aim of this investigation was to study the emergence of GMs. Fetal onset of GMs was studied sonographically in 18 fetuses during the first trimester of uncomplicated in vi

  6. LensTools: Weak Lensing computing tools

    Science.gov (United States)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
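
    A minimal usage sketch (hedged: the class and method names below follow the package's documented interface as commonly shown, and the input file name is hypothetical) might look like:

```python
# Hedged sketch of measuring a convergence-map power spectrum with LensTools;
# check names against the installed version before relying on them.
import numpy as np
from lenstools import ConvergenceMap

conv = ConvergenceMap.load("convergence_map.fits")  # hypothetical input map
l_edges = np.linspace(200.0, 10000.0, 30)           # multipole bin edges
l, power = conv.powerSpectrum(l_edges)               # binned convergence power spectrum
print(l[:5], power[:5])
```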

  7. General Mission Analysis Tool (GMAT): Mission, Vision, and Business Case

    Science.gov (United States)

    Hughes, Steven P.

    2007-01-01

    The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system: free for anyone to use in development of new mission concepts or to improve current missions, and freely available in source code form for enhancement or future technology development.

  8. Skycorr: A general tool for spectroscopic sky subtraction

    CERN Document Server

    Noll, S; Kimeswenger, S; Barden, M; Jones, A M; Modigliani, A; Szyszka, C; Taylor, J

    2014-01-01

    Airglow emission lines, which dominate the optical-to-near-IR sky radiation, show strong, line-dependent variability on various time scales. Therefore, the subtraction of the sky background in the affected wavelength regime becomes a problem if plain sky spectra have to be taken at a different time than the astronomical data. A solution to this issue is the physically motivated scaling of the airglow lines in the plain sky data to fit the sky lines in the object spectrum. We have developed a corresponding instrument-independent approach based on one-dimensional spectra. Our code skycorr separates sky lines and sky/object continuum by an iterative approach involving a line finder and airglow line data. The sky lines are grouped according to their expected variability. The line groups in the sky data are then scaled to fit the sky in the science data. Required pixel-specific weights for overlapping groups are taken from a comprehensive airglow model. Deviations in the wavelength calibration are corrected by fitti...
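
    The central idea — scaling a group of airglow lines in the reference sky spectrum so that it matches the sky contribution in the science spectrum — can be illustrated with a toy least-squares fit (a generic sketch, not skycorr's actual implementation):

```python
# Toy illustration of airglow-line scaling: one scale factor per line group,
# obtained by least squares. Generic sketch only; skycorr itself iterates over
# many line groups, separates continuum, and weights overlapping groups.
import numpy as np

def scale_sky_group(science, sky, line_mask):
    """Least-squares scale factor for the sky spectrum over `line_mask`
    (boolean array selecting pixels dominated by one airglow line group)."""
    s, o = sky[line_mask], science[line_mask]
    return float(np.dot(s, o) / np.dot(s, s))

# Synthetic example: the sky lines are 20% weaker in the science exposure.
rng = np.random.default_rng(1)
sky = np.zeros(1000)
sky[[100, 400, 750]] = [5.0, 3.0, 8.0]                 # reference airglow lines
science = 0.8 * sky + rng.normal(0.0, 0.01, sky.size)  # science-frame sky contribution
print(scale_sky_group(science, sky, sky > 0))          # ~0.8
```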

  9. gatspy: General tools for Astronomical Time Series in Python

    Science.gov (United States)

    VanderPlas, Jake

    2016-10-01

    Gatspy contains efficient, well-documented implementations of several common routines for Astronomical time series analysis, including the Lomb-Scargle periodogram, the Supersmoother method, and others.
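
    A minimal sketch of the Lomb-Scargle usage pattern (hedged: the interface below follows the package's documented examples as I recall them; the light curve is synthetic):

```python
# Hedged sketch of a Lomb-Scargle periodogram with gatspy; synthetic light curve.
import numpy as np
from gatspy.periodic import LombScargleFast

rng = np.random.default_rng(2)
t = 100.0 * rng.random(200)                       # irregular observation times (days)
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5)    # 2.5-day periodic signal
mag += rng.normal(0.0, 0.05, t.size)
dmag = np.full_like(mag, 0.05)                    # photometric uncertainties

model = LombScargleFast().fit(t, mag, dmag)
periods, power = model.periodogram_auto(nyquist_factor=10)
print(periods[np.argmax(power)])                  # expected to be close to 2.5
```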

  10. Plaster core washout tool

    Science.gov (United States)

    Heisman, R. M.; Keir, A. R.; Teramura, K.

    1977-01-01

    Tool powered by pressurized water or air removes water soluble plaster lining from Kevlar/epoxy duct. Rotating plastic cutterhead with sealed end fitting connects flexible shaft that allows tool to be used with curved ducts.

  11. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  12. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  13. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  14. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  15. Perception as a Tool

    Directory of Open Access Journals (Sweden)

    Jovana Komnenič

    2014-03-01

    Full Text Available The article presents a project of providing guidelines on art education for the blind and visually impaired, which was entitled Perception as a Tool and presented at the Berlin Biennale on 6 October 2010. It focuses on potential aspects of art education with regard to people with special needs and seeks to discover what happens with art if we cannot see it. This approach to art education combines elements of conventional tours of exhibitions and involves the participants through play. The methods that were used in our work included establishing dramatic tension and insecurity in the group as well as mutual trust by relying on different resources, including sensory perception, personal biography and different forms of knowledge and skills. A major part of the project is finding hidden, invisible or forgotten stories that are not directly linked to the exhibition and the aspects directly related to the exhibition. Such a generally inclusive approach enabled us to formulate political questions on the issue of ’invisibility’.

  16. MOD Tool (Microwave Optics Design Tool)

    Science.gov (United States)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed-up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is in the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. The MOD Tool client is being developed using Tcl

  17. MOD Tool (Microwave Optics Design Tool)

    Science.gov (United States)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed-up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is in the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. The MOD Tool client is being developed using Tcl

  18. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  19. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  20. A Generalization of Generalized Fibonacci and Generalized Pell Numbers

    Science.gov (United States)

    Abd-Elhameed, W. M.; Zeyada, N. A.

    2017-01-01

    This paper is concerned with developing a new class of generalized numbers. The main advantage of this class is that it generalizes the two classes of generalized Fibonacci numbers and generalized Pell numbers. Some new identities involving these generalized numbers are obtained. In addition, the two well-known identities of Sury and Marques which…

  1. A Quadtree InSAR Data Reduction Method Based on Covariance Function (顾及协方差函数的自适应四叉树InSAR数据压缩算法)

    Institute of Scientific and Technical Information of China (English)

    张静; 张勤; 赵超英; 张菊清

    2014-01-01

    A major problem in the inversion of deformation mechanisms using InSAR data is that InSAR results often contain thousands to millions of data points. Furthermore, there are always errors and even some blunders, which make the inversion less efficient and less reliable. Thus, an adaptive quadtree decomposition method for InSAR data reduction is proposed in order to reduce the number of data points without losing significant information about the deformation. The two key parameters of the quadtree decomposition, the quadrant decomposition threshold and the maximum quadrant size, are determined from a covariance function established by taking account of the physical spatial correlation of the InSAR data. The algorithm can preserve the details of the deformation signal as far as possible while greatly reducing the InSAR data volume. The method is evaluated with InSAR data over the Xi'an land subsidence area. The results indicate that the proposed algorithm can not only reduce the number of InSAR data points efficiently while preserving the deformation signal very well, but can also eliminate the noise in the deformation results efficiently.
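
    A generic quadtree reduction of this kind — recursively splitting a quadrant while it is too large or too heterogeneous, with a simple variance criterion standing in for the covariance-function threshold of the paper — might look like the following sketch (illustrative only, not the authors' algorithm):

```python
# Generic quadtree data-reduction sketch: split a block while it is too large or
# too heterogeneous; otherwise summarise it by its mean. Illustrative stand-in
# for the covariance-based thresholding described in the abstract above.
import numpy as np

def quadtree_reduce(z, threshold, max_size, r0=0, c0=0, out=None):
    """Return a list of (row, col, n_rows, n_cols, mean) blocks summarising `z`."""
    if out is None:
        out = []
    rows, cols = z.shape
    vals = z[np.isfinite(z)]
    homogeneous = vals.size > 0 and vals.std() <= threshold
    if (rows <= max_size and cols <= max_size and homogeneous) or rows <= 1 or cols <= 1:
        if vals.size:
            out.append((r0, c0, rows, cols, float(vals.mean())))
        return out
    rm, cm = rows // 2, cols // 2
    quadtree_reduce(z[:rm, :cm], threshold, max_size, r0, c0, out)
    quadtree_reduce(z[:rm, cm:], threshold, max_size, r0, c0 + cm, out)
    quadtree_reduce(z[rm:, :cm], threshold, max_size, r0 + rm, c0, out)
    quadtree_reduce(z[rm:, cm:], threshold, max_size, r0 + rm, c0 + cm, out)
    return out

# Synthetic deformation field: smooth ramp plus a localised subsidence bowl.
y, x = np.mgrid[0:128, 0:128]
defo = 0.002 * x + 0.05 * np.exp(-((x - 90) ** 2 + (y - 40) ** 2) / 200.0)
blocks = quadtree_reduce(defo, threshold=0.002, max_size=32)
print(len(blocks), "blocks summarise", defo.size, "pixels")
```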

  2. Enterprise integration: A tool's perspective

    Energy Technology Data Exchange (ETDEWEB)

    Polito, J. [Sandia National Labs., Albuquerque, NM (United States); Jones, A. [National Inst. of Standards and Technology, Gaithersburg, MD (United States); Grant, H. [National Science Foundation, Washington, DC (United States)

    1993-06-01

    The advent of sophisticated automation equipment and computer hardware and software is changing the way manufacturing is carried out. To compete in the global marketplace, manufacturing companies must integrate these new technologies into their factories. In addition, they must integrate the planning, control, and data management methodologies needed to make effective use of these technologies. This paper provides an overview of recent approaches to achieving this enterprise integration. It then describes, using simulation as a particular example, a new tool's perspective of enterprise integration.

  3. Rapid tooling for functional prototyping of metal mold processes: Literature review on cast tooling

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, M.D. [Sandia National Labs., Albuquerque, NM (United States); Hochanadel, P.W. [Colorado School of Mines, Golden, CO (United States). Dept. of Metallurgical and Materials Engineering

    1995-11-01

    This report is a literature review on cast tooling with the general focus on AISI H13 tool steel. The review includes processing of both wrought and cast H13 steel along with the accompanying microstructures. Also included is the incorporation of new rapid prototyping technologies, such as Stereolithography and Selective Laser Sintering, into the investment casting of tool steel. The limiting property of using wrought or cast tool steel for die casting is heat checking. Heat checking is addressed in terms of testing procedures, theories regarding the mechanism, and microstructural aspects related to the cracking.

  4. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  5. Lunar hand tools

    Science.gov (United States)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  6. A Generalized Rough Set Approach to Attribute Generalization in Data Mining

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper presents a generalized method for updating approximations of a concept incrementally, which can be used as an effective tool to deal with dynamic attribute generalization. By combining this method with the LERS inductive learning algorithm, it also introduces a generalized quasi-incremental algorithm for learning classification rules from databases.

  7. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will “raise the interoperability bar” as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase “code is king” underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  8. Advances of Bioinformatics Tools Applied in Virus Epitopes Prediction

    Institute of Scientific and Technical Information of China (English)

    Ping Chen; Simon Rayner; Kang-hong Hu

    2011-01-01

    In recent years, in silico epitope prediction tools have significantly facilitated the progress of vaccine development, and many have been applied successfully to predict epitopes in viruses. Herein, a general overview of the different tools currently available, including T cell and B cell epitope prediction tools, is presented, and the principles of the different prediction algorithms are reviewed briefly. Finally, several examples are presented to illustrate the application of the prediction tools.

  9. Wound assessment tools and nurses’ needs: an evaluation study

    OpenAIRE

    Greatrex-White, Sheila; Moxey, Helen

    2013-01-01

    The purpose of this study was to ascertain how well different wound assessment tools meet the needs of nurses in carrying out general wound assessment and whether current tools are fit for purpose. The methodology employed was evaluation research. In order to conduct the evaluation, a literature review was undertaken to identify the criteria of an optimal wound assessment tool which would meet nurses’ needs. Several freely available wound assessment tools were selected based on predetermin...

  10. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    Full Text Available The article describes the general situation on the power tools market, both in Russia and in the world. It provides a comparative analysis of competitors, an analysis of the structure of the power tools market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools and offers a competitive analysis of the product range of Bosch, the leader in its segment of the power tools available on the Russian market.

  11. Authoring tool evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.

    1994-09-15

    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  12. Population Density Modeling Tool

    Science.gov (United States)

    2014-02-05

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, report NAWCADPAX/TR-2012/194, 26 June 2012.

  13. CMS offline web tools

    CERN Document Server

    Metson, S; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Evans, D; Fanfani, A; Feichtinger, D; Kavka, C; Kuznetsov, V; Van Lingen, F; Newbold, D; Tuura, L; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web based tools, using state of the art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments.

  14. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  15. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  16. Qlikview Audit Tool (QLIKVIEW) -

    Data.gov (United States)

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  17. Double diameter boring tool

    Science.gov (United States)

    Ashbaugh, Fred N.; Murry, Kenneth R.

    1988-12-27

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
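
    With hypothetical symbols r1 and r2 for the distances of the first and second cutting edge tips from the axis of rotation, the two dimensional relationships stated above can be summarized as:

```latex
% r_1, r_2: distances of the first and second cutting-edge tips from the axis of
% rotation (symbols introduced here only to restate the relationships above).
% D: tool-body diameter; e: offset of the axis of rotation from the tool centerline.
D \approx r_1 + r_2, \qquad e \approx \tfrac{1}{2}\,r_2 - \tfrac{1}{2}\,r_1 .
```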

  18. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite using well defined sections for the different parts of Spring.Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head-start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  19. TFV as a strategic tool

    DEFF Research Database (Denmark)

    Bonke, Sten; Bertelsen, Sven

    2011-01-01

    The paper investigates the use of the Transformation-Flow-Value theory as a strategic tool in the development of the project production firm. When producing products such as ships, focus on value more than on cost may be the best approach, but in service industries such as construction, focus on flow may often be a far better approach than just looking at the costs. The paper presents a simple, general financial model to support this argument and not least to assist the reader in conducting similar analyses in his own company.

  20. An Interactive Visual Analytics Tool for NASA's General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of any spacecraft trajectory design process is to identify a path that transfers a vehicle from its point of origin to some specific destination in the...

  1. Generalized periodic and generalized Boolean rings

    Directory of Open Access Journals (Sweden)

    Howard E. Bell

    2001-01-01

    Full Text Available We prove that a generalized periodic, as well as a generalized Boolean, ring is either commutative or periodic. We also prove that a generalized Boolean ring with central idempotents must be nil or commutative. We further consider conditions which imply the commutativity of a generalized periodic, or a generalized Boolean, ring.

  2. Adaptive Control of Machine-Tool Vibration Based on an Active Tool Holder Shank with an Embedded Piezo Ceramic Actuator

    OpenAIRE

    Pettersson, Linus; Håkansson, Lars; Claesson, Ingvar; Olsson, Sven

    2001-01-01

    In the turning operation chatter or vibration is a common problem affecting the result of the machining, and, in particular, the surface finish. Tool life is also influenced by vibration. Severe acoustic noise in the working environment frequently occurs as a result of dynamic motion between the cutting tool and the workpiece. These problems can be reduced by active control of machine-tool vibration. However, machine-tool vibration control systems are usually not applicable to a general lathe...

  3. General(es) remisión de V. escuelas general(es).

    OpenAIRE

    2011-01-01

    Definition of the term "General(es) remisión de V. escuelas general(es)." in the Dicter dictionary.

  4. General Relativity in (1 + 1) Dimensions

    Science.gov (United States)

    Boozer, A. D.

    2008-01-01

    We describe a theory of gravity in (1 + 1) dimensions that can be thought of as a toy model of general relativity. The theory should be a useful pedagogical tool, because it is mathematically much simpler than general relativity but shares much of the same conceptual structure; in particular, it gives a simple illustration of how gravity arises…

  5. General Relativity in (1 + 1) Dimensions

    Science.gov (United States)

    Boozer, A. D.

    2008-01-01

    We describe a theory of gravity in (1 + 1) dimensions that can be thought of as a toy model of general relativity. The theory should be a useful pedagogical tool, because it is mathematically much simpler than general relativity but shares much of the same conceptual structure; in particular, it gives a simple illustration of how gravity arises…

  6. Generalized geometry lectures on type II backgrounds

    CERN Document Server

    Tsimpis, Dimitrios

    2016-01-01

    The first part of these notes is a self-contained introduction to generalized complex geometry. It is intended as a `user manual' for tools used in the study of supersymmetric backgrounds of supergravity. In the second part we review some past and recent results on the generalized complex structure of supersymmetric type II vacua in various dimensions.

  7. Pneumatically actuated hand tool

    NARCIS (Netherlands)

    Cool, J.C.; Rijnsaardt, K.A.

    1996-01-01

    Abstract of NL 9401195 (A) Pneumatically actuated hand tool for carrying out a mechanical operation, provided with an exchangeable gas cartridge in which the gas which is required for pneumatic actuation is stored. More particularly, the hand tool is provided with at least one pneumatic motor, at

  8. Coring Sample Acquisition Tool

    Science.gov (United States)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock cores. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  9. WATERS Expert Query Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Expert Query Tool is a web-based reporting tool using the EPA’s WATERS database.There are just three steps to using Expert Query:1. View Selection – Choose what...

  10. Study of Tools Interoperability

    NARCIS (Netherlands)

    Krilavičius, T.

    2007-01-01

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools to work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out relation

  11. Maailma suurim tool

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu Näitused, the Tartu Art School and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in pavilion I of the Tartu exhibition centre on 9-11 March. 2,000 chairs will be exhibited, from which a TOP 12 will be selected. The world's largest chair is planned to be erected on the grounds of the exhibition centre. At the same time, pavilion II hosts the twin fairs 'Sisustus 2000' and 'Büroo 2000'.

  12. General Relativity in Electrical Engineering

    OpenAIRE

    Leonhardt, Ulf; Philbin, Thomas G.

    2006-01-01

    In electrical engineering metamaterials have been developed that offer unprecedented control over electromagnetic fields. Here we show that general relativity lends the theoretical tools for designing devices made of such versatile materials. Given a desired device function, the theory describes the electromagnetic properties that turn this function into fact. We consider media that facilitate space-time transformations and include negative refraction. Our theory unifies the concepts operatin...

  13. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  14. Software Tool Issues

    Science.gov (United States)

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  15. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  16. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  17. New tools for learning.

    Science.gov (United States)

    Dickinson, D

    1999-01-01

    more often to collaborate on creating new knowledge as well as mastering the basics. As technology becomes more ubiquitous, there is growing recognition of the importance of the arts in humanizing the curriculum. "More high-tech, more need for high-touch" is becoming the by-word of many schools. They recognize that the arts are not only culturally important and civilizing influences, but they can facilitate the learning of almost any subject. I believe that these four concepts--the plasticity of the brain, the modifiability of intelligence, the use of technology as a powerful new tool for learning, and the renaissance of the arts in education--have major implications specifically for educational systems and generally for the future of our world. In this time of rapid change, leading-edge educational systems are equipping people with the ability to learn, unlearn, and relearn continually. They are giving students meaningful opportunities to apply what they have learned in order to turn information into knowledge. And--of critical importance if any of this is to lead to a healthy future--they are helping students to learn to use knowledge responsibly, ethically, and with integrity. Furthermore, they are involving students in experiences that develop compassion and altruism in the process of their education. Our complex world urgently needs more people who have developed their fullest potential in mind, body, and spirit.

  18. Tool Gear: Infrastructure for Building Parallel Programming Tools

    Energy Technology Data Exchange (ETDEWEB)

    May, J M; Gyllenhaal, J

    2002-12-09

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  19. Graphitic packing removal tool

    Energy Technology Data Exchange (ETDEWEB)

    Meyers, K.E.; Kolsun, G.J.

    1996-12-31

    Graphitic packing removal tools are described for removal of the seal rings in one piece from valves and pumps. The packing removal tool has a cylindrical base ring the same size as the packing ring with a surface finish, perforations, knurling or threads for adhesion to the seal ring. Elongated leg shanks are mounted axially along the circumferential center. A slit or slits permit insertion around shafts. A removal tool follower stabilizes the upper portion of the legs to allow a spanner wrench to be used for insertion and removal.

  20. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his Computer Aided Design and Manufacturing (CAD/CAM) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  1. Benchmarking expert system tools

    Science.gov (United States)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  2. Trajectories in Operating a Handheld Tool

    Science.gov (United States)

    Heuer, Herbert; Sulzenbruck, Sandra

    2009-01-01

    The authors studied the trajectories of the hand and of the tip of a handheld sliding first-order lever in aiming movements. With this kind of tool, straight trajectories of the hand are generally associated with curved trajectories of the tip of the lever and vice versa. Trajectories of the tip of the lever exhibited smaller deviations from…

  4. Smart Growth Tools

    Science.gov (United States)

    This page describes a variety of tools useful to federal, state, tribal, regional, and local government staff and elected officials; community leaders; developers; and others interested in smart growth development.

  5. Recovery Action Mapping Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  6. Cash Reconciliation Tool

    Data.gov (United States)

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create an...

  7. Mapping Medicare Disparities Tool

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Minority Health has designed an interactive map, the Mapping Medicare Disparities Tool, to identify areas of disparities between subgroups of...

  8. Friction stir welding tool

    Science.gov (United States)

    Tolle, Charles R.; Clark, Denis E.; Barnes, Timothy A.

    2008-04-15

    A friction stir welding tool is described and which includes a shank portion; a shoulder portion which is releasably engageable with the shank portion; and a pin which is releasably engageable with the shoulder portion.

  9. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  10. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  11. Clean Energy Finance Tool

    Science.gov (United States)

    This tool is for state and local governments interested in developing a financing program to support energy efficiency and clean energy improvements for large numbers of buildings within their jurisdiction.

  12. Neighborhood Mapping Tool

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  13. NWRS Survey Prioritization Tool

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — A SMART Tool and User's Guide for aiding NWRS Station staff when prioritizing their surveys for an Inventory and Monitoring Plan. This guide describes a process and...

  14. Chemical Data Access Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  15. Smart tool holder

    Science.gov (United States)

    Day, Robert Dean; Foreman, Larry R.; Hatch, Douglas J.; Meadows, Mark S.

    1998-01-01

    There is provided an apparatus for machining surfaces to accuracies within the nanometer range by use of electrical current flow through the contact of the cutting tool with the workpiece as a feedback signal to control depth of cut.

  16. TENCompetence tool demonstration

    NARCIS (Netherlands)

    Kluijfhout, Eric

    2010-01-01

    Kluijfhout, E. (2009). TENCompetence tool demonstration. Presented at Zorgacademie Parkstad (Health Academy Parkstad), Limburg Leisure Academy, Life Long Learning Limburg and a number of regional educational institutions. May, 18, 2009, Heerlen, The Netherlands: Open University of the Netherlands, T

  17. ATO Resource Tool -

    Data.gov (United States)

    Department of Transportation — Cru-X/ART is a shift management tool designed for use by operational employees in Air Traffic Facilities. Cru-X/ART is used for shift scheduling, shift sign in/out,...

  18. Autism Teaching Tool

    CERN Multimedia

    2014-01-01

    CERN pattern recognition technologies transferred to a learning tool for autistic children. The state of the art of pattern recognition technology developed at CERN for High Energy Physics is transferred to the Computer Vision domain and is used to develop a new

  19. Learning Design Tools

    NARCIS (Netherlands)

    Griffiths, David; Blat, Josep; Garcia, Rocío; Vogten, Hubert; Kwong, KL

    2005-01-01

    Griffiths, D., Blat, J., Garcia, R., Vogten, H. & Kwong, KL. (2005). Learning Design Tools. In: Koper, R. & Tattersall, C., Learning Design: A Handbook on Modelling and Delivering Networked Education and Training (pp. 109-136). Berlin-Heidelberg: Springer Verlag.

  20. Tools used for hand deburring

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, L.K.

    1981-03-01

    This guide is designed to help in quick identification of those tools most commonly used to deburr hand size or smaller parts. Photographs and textual descriptions are used to provide rapid yet detailed information. The data presented include the Bendix Kansas City Division coded tool number, tool description, tool crib in which the tool can be found, the maximum and minimum inventory requirements, the cost of each tool, and the number of the illustration that shows the tool.

  1. VDC Monitoring Tools

    CERN Document Server

    Porat, Itay Gershon

    2017-01-01

    Electron drift velocity is an important parameter for muon path reconstruction capabilities of CMS drift tube chambers. The parameter is monitored independently by six dedicated drift velocity chambers (VDC). This report presents monitoring tools developed to study key VDC parameters such as anode voltage and current, as well as gas source. These graphical tools can be used to learn about VDC operation and performance, and contribute to understanding anode wire aging in the systems.

  2. Manual bamboo cutting tool.

    Science.gov (United States)

    Bezerra, Mariana Pereira; Correia, Walter Franklin Marques; da Costa Campos, Fabio Ferreira

    2012-01-01

    The paper presents the development of a cutting tool guide, specifically for the harvest of bamboo. The development was based on precepts of eco-design and ergonomics, prioritizing the physical health of the operator and the preservation of the environment, as well as meeting the specific requirements of bamboo. The main goal is to spread the use of bamboo as a material for construction, handicrafts and other uses, by means of a handy, easily assembled tool made from readily available materials.

  3. Stochastic tools in turbulence

    CERN Document Server

    Lumley, John L

    2012-01-01

    Stochastic Tools in Turbulence discusses the available mathematical tools to describe stochastic vector fields to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly, on three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, lack of correlation. The book also explains the significance of the moments, the properties of the

  4. Wound assessment tools and nurses' needs: an evaluation study.

    Science.gov (United States)

    Greatrex-White, Sheila; Moxey, Helen

    2015-06-01

    The purpose of this study was to ascertain how well different wound assessment tools meet the needs of nurses in carrying out general wound assessment and whether current tools are fit for purpose. The methodology employed was evaluation research. In order to conduct the evaluation, a literature review was undertaken to identify the criteria of an optimal wound assessment tool which would meet nurses' needs. Several freely available wound assessment tools were selected based on predetermined inclusion and exclusion criteria and an audit tool was developed to evaluate the selected tools based on how well they met the criteria of the optimal wound assessment tool. The results provide a measure of how well the selected wound assessment tools meet the criteria of the optimal wound assessment tool. No tool was identified which fulfilled all the criteria, but two (the Applied Wound Management tool and the National Wound Assessment Form) met the most criteria of the optimal tool and were therefore considered to best meet nurses' needs in wound assessment. The study provides a mechanism for the appraisal of wound assessment tools using a set of optimal criteria which could aid practitioners in their search for the best wound assessment tool.

  5. Visual illusion of tool use recalibrates tactile perception.

    Science.gov (United States)

    Miller, Luke E; Longo, Matthew R; Saygin, Ayse P

    2017-02-11

    Brief use of a tool recalibrates multisensory representations of the user's body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and with equal magnitude as physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest visual tool-use signals play a critical role in driving tool embodiment.

  6. Modern Canonical Quantum General Relativity

    Science.gov (United States)

    Thiemann, Thomas

    2008-11-01

    Preface; Notation and conventions; Introduction; Part I. Classical Foundations, Interpretation and the Canonical Quantisation Programme: 1. Classical Hamiltonian formulation of general relativity; 2. The problem of time, locality and the interpretation of quantum mechanics; 3. The programme of canonical quantisation; 4. The new canonical variables of Ashtekar for general relativity; Part II. Foundations of Modern Canonical Quantum General Relativity: 5. Introduction; 6. Step I: the holonomy-flux algebra [P]; 7. Step II: quantum-algebra; 8. Step III: representation theory of [A]; 9. Step IV: 1. Implementation and solution of the kinematical constraints; 10. Step V: 2. Implementation and solution of the Hamiltonian constraint; 11. Step VI: semiclassical analysis; Part III. Physical Applications: 12. Extension to standard matter; 13. Kinematical geometrical operators; 14. Spin foam models; 15. Quantum black hole physics; 16. Applications to particle physics and quantum cosmology; 17. Loop quantum gravity phenomenology; Part IV. Mathematical Tools and their Connection to Physics: 18. Tools from general topology; 19. Differential, Riemannian, symplectic and complex geometry; 20. Semianalytical category; 21. Elements of fibre bundle theory; 22. Holonomies on non-trivial fibre bundles; 23. Geometric quantisation; 24. The Dirac algorithm for field theories with constraints; 25. Tools from measure theory; 26. Elementary introduction to Gel'fand theory for Abelian C* algebras; 27. Bohr compactification of the real line; 28. Operator *-algebras and spectral theorem; 29. Refined algebraic quantisation (RAQ) and direct integral decomposition (DID); 30. Basics of harmonic analysis on compact Lie groups; 31. Spin network functions for SU(2); 32. Functional analytical description of classical connection dynamics; Bibliography; Index.

  7. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for Data Mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop Genetic Algorithm based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
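
    The project's GA programs themselves are not reproduced in this record, but the idea they describe — a population-based search that tolerates noisy objective values — can be illustrated with a short sketch. The Python fragment below is only a schematic stand-in: the objective function, operators and parameter values are assumptions chosen for illustration, not the project's C++ implementation.

        import numpy as np

        # Schematic GA search for the minimum of a noisy objective, in the spirit of
        # the GA-based search programs described above.
        rng = np.random.default_rng(0)

        def noisy_objective(pop):
            """Toy stand-in for a figure of merit computed from noisy data."""
            return np.sum((pop - 0.3) ** 2, axis=-1) + 0.05 * rng.standard_normal(pop.shape[0])

        def ga_search(pop_size=60, n_genes=4, generations=100, mutation_scale=0.1):
            pop = rng.uniform(0.0, 1.0, size=(pop_size, n_genes))
            for _ in range(generations):
                fitness = noisy_objective(pop)
                # Tournament selection: keep the better of two randomly drawn individuals.
                i, j = rng.integers(pop_size, size=(2, pop_size))
                parents = np.where((fitness[i] < fitness[j])[:, None], pop[i], pop[j])
                # Uniform crossover between consecutive parents, then Gaussian mutation.
                mask = rng.random(parents.shape) < 0.5
                children = np.where(mask, parents, np.roll(parents, 1, axis=0))
                children += mutation_scale * rng.standard_normal(children.shape)
                pop = np.clip(children, 0.0, 1.0)
            return pop[np.argmin(noisy_objective(pop))]

        print("best candidate found:", ga_search())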

  9. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    Science.gov (United States)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly devoid of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. Algorithmic Bricks: A Tangible Robot Programming Tool for Elementary School Students

    Science.gov (United States)

    Kwon, D.-Y.; Kim, H.-S.; Shim, J.-K.; Lee, W.-G.

    2012-01-01

    Tangible programming tools enable children to easily learn the programming process, previously considered to be difficult for them. While various tangible programming tools have been developed, there is still a lack of available tools to help students experience the general programming process. This study therefore developed a tool called…

  11. Tool Gear Documentation

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Gyllenhaal, J

    2002-04-03

    Tool Gear is designed to allow tool developers to insert instrumentation code into target programs using the DPCL library. This code can gather data and send it back to the Client for display or analysis. Tools can use the Tool Gear client without using the DPCL Collector. Any collector using the right protocols can send data to the Client for display and analysis. However, this document will focus on how to gather data with the DPCL Collector. There are three parts to the task of using Tool Gear to gather data through DPCL: (1) Write the instrumentation code that will be loaded and run in the target program. The code should be in the form of one or more functions, which can pass data structures back to the Client by way of DPCL. The collection of functions is compiled into a library, as described in this report. (2) Write the code that tells the DPCL Collector about the instrumentation and how to forward data back to the Client. (3) Extend the client to accept data from the Collector and display it in a useful way. The rest of this report describes how to carry out each of these steps.

  12. Large Crater Clustering tool

    Science.gov (United States)

    Laura, Jason; Skinner, James A.; Hunter, Marc A.

    2017-08-01

    In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long-axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the source of secondary impact craters can be estimated from the secondary craters themselves. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set to support: (1) processing individually digitized point observations (craters), (2) estimating the directional distribution of a clustered set of craters, back projecting the potential flight paths (crater clusters or linearly approximated catenae or lineaments), (3) intersecting projected paths, and (4) intersecting back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool is able to quantitatively identify primary impacts and supports the improved qualitative interpretation of potential secondary crater flight trajectories.
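
    The core geometric step — back-projecting each secondary-crater cluster along its inferred trajectory and intersecting the projections — can be sketched outside ArcGIS as a least-squares intersection of 2-D lines. The fragment below illustrates that idea only; the coordinates, directions and function names are hypothetical and it is not part of the LCC tool set.

        import numpy as np

        # Least-squares intersection of back-projected 2-D trajectories
        # (one point plus one direction per secondary-crater cluster).

        def intersect_trajectories(points, directions):
            """Point closest, in the least-squares sense, to all lines p_i + t * d_i."""
            A = np.zeros((2, 2))
            b = np.zeros(2)
            for p, d in zip(points, directions):
                d = np.asarray(d, dtype=float)
                d /= np.linalg.norm(d)
                P = np.eye(2) - np.outer(d, d)   # projector onto the line's normal space
                A += P
                b += P @ np.asarray(p, dtype=float)
            return np.linalg.solve(A, b)

        # Two hypothetical secondary-crater clusters (map coordinates in km) and the
        # back-projected directions of their long axes toward the candidate source.
        points = [(10.0, 2.0), (1.0, 12.0)]
        directions = [(-1.0, 1.0), (1.0, -0.2)]
        print("estimated primary location:", intersect_trajectories(points, directions))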

  13. Climate Change and Water Tools

    Science.gov (United States)

    EPA tools and workbooks guide users to mitigate and adapt to climate change impacts. Various tools can help manage risks, others can visualize climate projections in maps. Included are comprehensive tool kits hosted by other federal agencies.

  14. New Conceptual Design Tools

    DEFF Research Database (Denmark)

    Pugnale, Alberto; Holst, Malene; Kirkegaard, Poul Henning

    2010-01-01

    This paper aims to discuss recent approaches in using more and more frequently computer tools as supports for the conceptual design phase of the architectural project. The present state-of-the-art about software as conceptual design tool could be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications in the world of building designers, that are more and more able to fit their specific requirements; on the other hand, some groups of expert users with a basic programming knowledge seem to deal with the problem of software as conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well defined design problems. Starting with a brief historical recall and the discussion of relevant researches and practical experiences, this paper investigates...

  15. Cataract Surgery Tool

    Science.gov (United States)

    1977-01-01

    The NASA-McGannon cataract surgery tool is a tiny cutter-pump which liquefies and pumps the cataract lens material from the eye. Inserted through a small incision in the cornea, the tool can be used on the hardest cataract lens. The cutter is driven by a turbine which operates at about 200,000 revolutions per minute. Incorporated in the mechanism are two passages for saline solutions, one to maintain constant pressure within the eye, the other for removal of the fragmented lens material and fluids. Three years of effort have produced a design, now being clinically evaluated, with excellent potential for improved cataract surgery. The use of this tool is expected to reduce the patient's hospital stay and recovery period significantly.

  16. C-TOOL

    DEFF Research Database (Denmark)

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John

    2014-01-01

    Soil organic carbon (SOC) is a significant component of the global carbon (C) cycle. Changes in SOC storage affect atmospheric CO2 concentrations on decadal to centennial timescales. The C-TOOL model was developed to simulate farm- and regional-scale effects of management on medium- to long-term SOC storage in the profile of well-drained agricultural mineral soils. C-TOOL uses three SOC pools for both the topsoil (0–25 cm) and the subsoil (25–100 cm), and applies temperature-dependent first order kinetics to regulate C turnover. C-TOOL also enables the simulation of 14C turnover. The simple model structure facilitates calibration and requires few inputs (mean monthly air temperature, soil clay content, soil C/N ratio and C in organic inputs). The model was parameterised using data from 19 treatments drawn from seven long-term field experiments in the United Kingdom, Sweden and Denmark...
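
    As a rough illustration of the temperature-dependent first-order turnover the abstract describes, the sketch below advances three carbon pools (labelled here FOM, HUM and ROM for fresh, humified and resistant organic matter) through a year of monthly steps. The rate constants, transfer fractions and Q10-style temperature response are placeholders chosen for illustration and are not the calibrated C-TOOL parameterisation.

        import numpy as np

        # Rough illustration of temperature-dependent first-order turnover among
        # three carbon pools; all parameter values are placeholders.

        def temperature_factor(t_air, q10=2.0, t_ref=10.0):
            """Assumed Q10-style modifier of decomposition rates."""
            return q10 ** ((t_air - t_ref) / 10.0)

        def step_month(pools, t_air, c_input):
            """Advance the pools by one month of first-order decomposition."""
            k = {"FOM": 0.12, "HUM": 0.01, "ROM": 0.0001}     # month^-1, placeholders
            f = temperature_factor(t_air)
            new = dict(pools)
            new["FOM"] += c_input                              # fresh organic matter input
            for donor, receiver, frac in (("FOM", "HUM", 0.3), ("HUM", "ROM", 0.05)):
                decomposed = pools[donor] * (1.0 - np.exp(-k[donor] * f))
                new[donor] -= decomposed
                new[receiver] += frac * decomposed             # remainder respired as CO2
            return new

        pools = {"FOM": 2.0, "HUM": 40.0, "ROM": 20.0}         # Mg C / ha, placeholders
        monthly_temp = [1, 2, 5, 9, 13, 16, 18, 17, 13, 9, 5, 2]  # mean monthly air temp, deg C
        for t_air in monthly_temp:
            pools = step_month(pools, t_air, c_input=0.1)
        print({name: round(value, 2) for name, value in pools.items()})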

  17. Free Access Does Not Necessarily Encourage Practitioners to Use Online Evidence Based Information Tools. A Review of: Buchan, H., Lourey, E., D’Este, C., & Sanson-Fisher, R. (2009). Effectiveness of strategies to encourage general practitioners to accept an offer of free access to online evidence-based information: A randomised controlled trial. Implementation Science, 4, article 68.

    Directory of Open Access Journals (Sweden)

    Heather Ganshorn

    2010-12-01

    Full Text Available Objectives – To determine which strategies were most effective for encouraging general practitioners (GPs) to sign up for free access to an online evidence based information resource; and to determine whether those who accepted the offer differed in their sociodemographic characteristics from those who did not. Design – Descriptive marketing research study. Setting – Australia’s public healthcare system. Subjects – 14,000 general practitioners (GPs) from all regions of Australia. Methods – Subjects were randomly selected by Medicare Australia from its list of GPs that bill it for services. Medicare Australia had 18,262 doctors it deemed eligible; 14,000 of these were selected for a stratified random sample. Subjects were randomized to one of 7 groups of 2,000 each. Each group received a different letter offering two years of free access to BMJ Clinical Evidence, an evidence based online information tool. Randomization was done electronically, and the seven groups were stratified by age group, gender, and location. The interventions given to each group differed as follows: • Group 1: Received a letter offering 2 years of free access, with no further demands on the recipient. • Group 2: Received a letter offering 2 years of free access, but on the condition that they complete an initial questionnaire and another one at 12 months, as well as allowing the publisher to provide de-personalized usage data to the researchers. • Group 3: Same as Group 2, but with the additional offer of an online tutorial to assist them with using the resource. • Group 4: Same as Group 2, but with an additional pamphlet with positive testimonials about the resource from Australian medical opinion leaders. • Group 5: Same as Group 2, but with an additional offer of professional development credits towards their required annual totals. • Group 6: Same as Group 2, but with an additional offer to be entered to win a prize of $500 towards registration at a

  18. Rapid SAW Sensor Development Tools

    Science.gov (United States)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  19. Phyx: phylogenetic tools for unix.

    Science.gov (United States)

    Brown, Joseph W; Walker, Joseph F; Smith, Stephen A

    2017-02-08

    The ease with which phylogenomic data can be generated has drastically escalated the computational burden for even routine phylogenetic investigations. To address this, we present phyx: a collection of programs written in C++ to explore, manipulate, analyze and simulate phylogenetic objects (alignments, trees and MCMC logs). Modelled after Unix/GNU/Linux command line tools, individual programs perform a single task and operate on standard I/O streams that can be piped to quickly and easily form complex analytical pipelines. Because of the stream-centric paradigm, memory requirements are minimized (often only a single tree or sequence in memory at any instance), and hence phyx is capable of efficiently processing very large datasets. phyx runs on POSIX-compliant operating systems. Source code, installation instructions, documentation and example files are freely available under the GNU General Public License at https://github.com/FePhyFoFum/phyx. eebsmith@umich.edu. Supplementary data are available at Bioinformatics online.

  20. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
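
    The similarity measure described above — the peak of the normalized cross-correlation between event waveforms, thresholded to group events into clusters — can be sketched in a few lines. The fragment below is an illustration of the idea on synthetic waveforms, not the Dendro Tool's own implementation; the threshold value and the greedy grouping scheme are assumptions.

        import numpy as np

        # Normalized cross-correlation between waveforms, thresholded to group
        # similar events into clusters (illustration only).

        def max_normalized_xcorr(a, b):
            """Peak of the normalized cross-correlation between two waveforms."""
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return np.max(np.correlate(a, b, mode="full"))

        def cluster_events(waveforms, threshold=0.8):
            """Greedy grouping: join the first cluster containing a similar event."""
            clusters = []
            for idx, w in enumerate(waveforms):
                for cluster in clusters:
                    if any(max_normalized_xcorr(w, waveforms[j]) >= threshold for j in cluster):
                        cluster.append(idx)
                        break
                else:
                    clusters.append([idx])
            return clusters

        # Synthetic demo: two repeating sources plus noise; the three copies of the
        # first source and the two copies of the second should fall in separate clusters.
        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 500)
        src1 = np.sin(40 * t) * np.exp(-3 * t)
        src2 = np.sin(25 * t + 1.0) * np.exp(-2 * t)
        waveforms = [src1 + 0.05 * rng.standard_normal(t.size) for _ in range(3)]
        waveforms += [src2 + 0.05 * rng.standard_normal(t.size) for _ in range(2)]
        print(cluster_events(waveforms))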

  1. Tool nimega Sacco

    Index Scriptorium Estoniae

    1998-01-01

    The Zanotta bag chair "Sacco", designed in 1968 by P. Gatti, C. Paolini and F. Teodoro, has turned thirty. "Sacco" is a bag filled with polystyrene granules. Zanotta's inflatable chair "Blow" (1967, Scholari, D'Urbino, Lomazzi, De Pas) also attracted attention. E. Lucie-Smith comments on them. The exhibition "Legends and Symbols of 1968" at the Kunstmuseum Düsseldorf is dedicated to the year 1968 and displays nearly 500 objects and several reconstructed interiors.

  3. Tool-use frequency by individual sea otters in California

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Sea otters are well-known tool users, employing objects such as rocks or shells to break open invertebrate prey. We used a series of generalized linear mixed effect...

  4. Revolutions in Neuroscience: Tool Development

    Science.gov (United States)

    Bickle, John

    2016-01-01

    Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this article is that, concerning actual revolutions in neuroscience over the past 60 years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of “higher” cognitive functions; and the more recent impact of optogenetics and designer receptors exclusively activated by designer drugs (DREADDs). The two key metascientific concepts, I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of tool development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions. PMID:27013992

  5. Revolutions in Neuroscience: Tool Development

    Directory of Open Access Journals (Sweden)

    John eBickle

    2016-03-01

    Full Text Available Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this paper is that, concerning actual revolutions in neuroscience over the past sixty years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of higher cognitive functions; and the more recent impact of optogenetics (and DREADDs). The two key metascientific concepts I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of tool development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions.

  6. Psychological and aesthetical tools of advertising photography and their effectiveness

    OpenAIRE

    Krulišová, Eliška

    2013-01-01

    This Bachelor's Thesis focuses on the psychological, aesthetic, photographic and graphic tools that are used in advertising photography and in advertising generally. The hypothesis being verified is that these tools influence advertising effectiveness. The theoretical part covers advertising photography, the psychology of advertising, an analysis of the tools used and the theory of measuring advertising effectiveness. The practical part applies the acquired knowledge in an analysis of Research o...

  7. A simulation tool for brassiness studies.

    Science.gov (United States)

    Gilbert, Joël; Menguy, Ludovic; Campbell, Murray

    2008-04-01

    A frequency-domain numerical model of brass instrument sound production is proposed as a tool to predict their brassiness, defined as the rate of spectral enrichment with increasing dynamic level. It is based on generalized Burgers equations dedicated to weakly nonlinear wave propagation in nonuniform ducts, and is an extension of previous work by Menguy and Gilbert [Acta Acustica 86, 798-810 (2000)], initially limited to short cylindrical tubes. The relevance of the present tool is evaluated by carrying out simulations over distances longer than typical shock formation distances, and by doing preliminary simulations of periodic regimes in a typical brass trombone bore geometry.
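
    The paper's frequency-domain model for nonuniform ducts is not reproduced here, but the effect it quantifies — harmonics growing faster at higher dynamic levels as a weakly nonlinear wave steepens — can be illustrated with a much simpler lossless plane-wave calculation in the time domain. The sketch below evaluates the implicit simple-wave (Poisson/Earnshaw) solution by fixed-point iteration and prints the resulting relative harmonic levels; it is not the authors' method, and all parameter values are arbitrary.

        import numpy as np

        # Simplified illustration of spectral enrichment for a lossless plane simple
        # wave; NOT the paper's frequency-domain Burgers model for nonuniform ducts.
        c0, beta, f0 = 343.0, 1.2, 440.0      # sound speed [m/s], nonlinearity, fundamental [Hz]
        x = 1.5                               # propagation distance [m], pre-shock for these amplitudes
        t = np.linspace(0.0, 4.0 / f0, 4096, endpoint=False)   # four periods in the window

        def propagated_waveform(amplitude):
            """Fixed-point evaluation of u = u0(t + beta * x * u / c0**2)."""
            def source(tau):
                return amplitude * np.sin(2.0 * np.pi * f0 * tau)
            u = source(t)
            for _ in range(200):              # converges before shock formation
                u = source(t + beta * x * u / c0**2)
            return u

        for amp in (0.5, 2.0, 8.0):           # increasing dynamic level (particle velocity, m/s)
            spectrum = np.abs(np.fft.rfft(propagated_waveform(amp)))
            spectrum /= spectrum.max()
            harmonics = spectrum[4:24:4]      # bins of harmonics 1..5 (fundamental at bin 4)
            print(f"amp={amp:4.1f} m/s  relative harmonic levels:",
                  np.array2string(harmonics, precision=3))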

  8. Teaching Syllogistics Using E-learning Tools

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Sandborg-Petersen, Ulrik; Thorvaldsen, Steinar

    2016-01-01

    This paper is a study of various strategies for teaching syllogistics as part of a course in basic logic. It is a continuation of earlier studies involving practical experiments with students of Communication using the Syllog system, which makes it possible to develop e-learning tools and to do learning analytics based on log-data. The aim of the present paper is to investigate whether the Syllog e-learning tools can be helpful in logic teaching in order to obtain a better understanding of logic and argumentation in general and syllogisms in particular. Four versions of a course in basic logic involving different teaching methods will be compared.

  9. A Generalization of the Alias Matrix

    DEFF Research Database (Denmark)

    Kulahci, Murat; Bisgaard, S.

    2006-01-01

    The investigation of aliases or biases is important for the interpretation of the results from factorial experiments. For two-level fractional factorials this can be facilitated through their group structure. For more general arrays the alias matrix can be used. This tool is traditionally based on the assumption that the error structure is that associated with ordinary least squares. For situations where that is not the case, we provide in this article a generalization of the alias matrix applicable under the generalized least squares assumptions. We also show that for the special case of split plot error structure, the generalized alias matrix simplifies to the ordinary alias matrix.
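
    For orientation, the standard construction reads as follows (the notation here is generic and may differ from the article's exact expressions): if the fitted model contains only the primary terms, collected in a matrix X1, while the true model also involves potential terms X2, the bias of the least-squares estimate is governed by the alias matrix, and the generalized least squares case replaces the usual cross-products with covariance-weighted ones,

        \mathrm{E}(\hat{\beta}_1) = \beta_1 + A\,\beta_2 ,
        \qquad
        A_{\mathrm{OLS}} = (X_1^{\top} X_1)^{-1} X_1^{\top} X_2 ,
        \qquad
        A_{\mathrm{GLS}} = (X_1^{\top} V^{-1} X_1)^{-1} X_1^{\top} V^{-1} X_2 ,

    where V is the error covariance matrix, known up to a constant, as arises for example under a split-plot error structure.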

  10. RISK COMMUNICATION IN ACTION: THE TOOLS OF MESSAGE MAPPING

    Science.gov (United States)

    Risk Communication in Action: The Tools of Message Mapping is a workbook designed to guide risk communicators in crisis situations. The first part of this workbook will review general guidelines for risk communication. The second part will focus on one of the most robust tools o...

  11. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes by highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  12. Automation tools for flexible aircraft maintenance.

    Energy Technology Data Exchange (ETDEWEB)

    Prentice, William J.; Drotning, William D.; Watterberg, Peter A.; Loucks, Clifford S.; Kozlowski, David M.

    2003-11-01

    This report summarizes the accomplishments of the Laboratory Directed Research and Development (LDRD) project 26546 at Sandia, during the period FY01 through FY03. The project team visited four DoD depots that support extensive aircraft maintenance in order to understand critical needs for automation, and to identify maintenance processes for potential automation or integration opportunities. From the visits, the team identified technology needs and application issues, as well as non-technical drivers that influence the application of automation in depot maintenance of aircraft. Software tools for automation facility design analysis were developed, improved, extended, and integrated to encompass greater breadth for eventual application as a generalized design tool. The design tools for automated path planning and path generation have been enhanced to incorporate those complex robot systems with redundant joint configurations, which are likely candidate designs for a complex aircraft maintenance facility. A prototype force-controlled actively compliant end-effector was designed and developed based on a parallel kinematic mechanism design. This device was developed for demonstration of surface finishing, one of many in-contact operations performed during aircraft maintenance. This end-effector tool was positioned along the workpiece by a robot manipulator, programmed for operation by the automated planning tools integrated for this project. Together, the hardware and software tools demonstrate many of the technologies required for flexible automation in a maintenance facility.

  13. Change Detection Tools

    NARCIS (Netherlands)

    Dekker, R.J.; Kuenzer, C.; Lehner, M.; Reinartz, P.; Niemeyer, I.; Nussbaum, S.; Lacroix, V.; Sequeira, V.; Stringa, E.; Schöpfer, E.

    2009-01-01

    In this chapter a wide range of change detection tools is addressed. They are grouped into methods suitable for optical and multispectral data, synthetic aperture radar (SAR) images, and 3D data. Optical and multispectral methods include unsupervised approaches, supervised and knowledge-based approa

  14. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and social...

  15. Healthy Homes Tools

    Science.gov (United States)

    Peek, Gina; Lyon, Melinda; Russ, Randall

    2012-01-01

    Extension is focusing on healthy homes programming. Extension educators are not qualified to diagnose consumers' medical problems as they relate to housing. We cannot give medical advice. Instead, we can help educate consumers about home conditions that may affect their well-being. Extension educators need appropriate healthy homes tools to…

  16. The science writing tool

    Science.gov (United States)

    Schuhart, Arthur L.

    This is a two-part dissertation. The primary part is the text of a science-based composition rhetoric and reader called The Science Writing Tool. This textbook has seven chapters dealing with topics in Science Rhetoric. Each chapter includes a variety of examples of science writing, discussion questions, writing assignments, and instructional resources. The purpose of this text is to introduce lower-division college science majors to the role that rhetoric and communication play in the conduct of Science, and how these skills contribute to a successful career in Science. The text is designed as a "tool kit," for use by an instructor constructing a science-based composition course or a writing-intensive Science course. The second part of this dissertation reports on student reactions to draft portions of The Science Writing Tool text. In this report, students of English Composition II at Northern Virginia Community College-Annandale were surveyed about their attitudes toward course materials and topics included. The findings were used to revise and expand The Science Writing Tool.

  17. Incident Information Management Tool

    CERN Document Server

    Pejovic, Vladimir

    2015-01-01

    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  18. Tools in HRD. Symposium.

    Science.gov (United States)

    2002

    This document contains three papers from a symposium on tools in human resource development (HRD). "Game Theory Methodology in HRD" (Thomas J. Chermack, Richard A. Swanson) explores the utility of game theory in helping the HRD profession address the complexity of integrating multiple theories for disciplinary understanding and…

  19. Tools and Concepts.

    Science.gov (United States)

    Artis, Margaret, Ed.; And Others

    This guide provides enrichment for students to develop tools and concepts used in various areas of mathematics. The first part presents arithmetic progressions, geometric progressions, and harmonic progression. In the second section, the concept of mathematic induction is developed from intuitive induction, using concrete activities, to the…

  20. Clean Cities Tools

    Energy Technology Data Exchange (ETDEWEB)

    None

    2014-12-19

    The U.S. Department of Energy's Clean Cities offers a large collection of Web-based tools on the Alternative Fuels Data Center. These calculators, interactive maps, and data searches can assist fleets, fuels providers, and other transportation decision makers in their efforts to reduce petroleum use.

  1. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  2. Photutils: Photometry tools

    Science.gov (United States)

    Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan

    2016-09-01

    Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).
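
    A minimal usage sketch along the lines of the package's documented workflow is shown below (background statistics, source detection, then aperture photometry on a synthetic test image). Module paths and parameter values follow recent photutils releases and may differ between versions; they are given for illustration, not as a definitive recipe.

        import numpy as np
        from astropy.stats import sigma_clipped_stats
        from photutils.datasets import make_100gaussians_image
        from photutils.detection import DAOStarFinder
        from photutils.aperture import CircularAperture, aperture_photometry

        # Sketch of the documented photutils workflow: estimate background statistics,
        # detect sources, then measure them with circular-aperture photometry.
        data = make_100gaussians_image()               # synthetic test image

        mean, median, std = sigma_clipped_stats(data, sigma=3.0)
        finder = DAOStarFinder(fwhm=3.0, threshold=5.0 * std)
        sources = finder(data - median)                # astropy Table of detections

        positions = np.transpose((sources["xcentroid"], sources["ycentroid"]))
        apertures = CircularAperture(positions, r=5.0)
        phot_table = aperture_photometry(data - median, apertures)
        print(phot_table)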

  3. Dynamic multimedia annotation tool

    Science.gov (United States)

    Pfund, Thomas; Marchand-Maillet, Stephane

    2001-12-01

    Annotating image collections is crucial for different multimedia applications. Not only does this provide alternative access to visual information, but it is a critical step in the evaluation of content-based image retrieval systems. Annotation is a tedious task, so there is a real need for developing tools that lighten the work of annotators. The tool should be flexible and offer customization so as to make the annotator as comfortable as possible. It should also automate as many tasks as possible. In this paper, we present a still image annotation tool that has been developed with the aim of being flexible and adaptive. The principle is to create a set of dynamic web pages that are an interface to a SQL database. The keyword set is fixed and every image receives from concurrent annotators a set of keywords along with time stamps and annotator IDs. Each annotator has the possibility of going back and forth within the collection and its previous annotations. He is helped by a number of search services and customization options. An administrative section allows the supervisor to control the parameters of the annotation, including the keyword set, given via an XML structure. The architecture of the tool is made flexible so as to accommodate further options through its development.
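
    The data model the abstract implies — a fixed keyword set and one annotation row per image, keyword and annotator, with a time stamp — can be sketched with a small relational schema. The table and column names below are assumptions for illustration only, not the tool's actual database.

        import sqlite3

        # Sketch of a plausible annotation schema (names are illustrative).
        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE keywords    (keyword_id INTEGER PRIMARY KEY, label TEXT UNIQUE);
            CREATE TABLE images      (image_id   INTEGER PRIMARY KEY, filename TEXT);
            CREATE TABLE annotations (
                image_id     INTEGER REFERENCES images(image_id),
                keyword_id   INTEGER REFERENCES keywords(keyword_id),
                annotator_id TEXT,
                annotated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
            );
        """)
        con.execute("INSERT INTO keywords(label) VALUES ('beach'), ('indoor'), ('people')")
        con.execute("INSERT INTO images(filename) VALUES ('img_0001.jpg')")
        con.execute("INSERT INTO annotations(image_id, keyword_id, annotator_id) "
                    "VALUES (1, 3, 'annotator_A')")
        for row in con.execute(
            "SELECT i.filename, k.label, a.annotator_id, a.annotated_at "
            "FROM annotations a JOIN images i USING(image_id) "
            "JOIN keywords k USING(keyword_id)"
        ):
            print(row)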

  4. Measurement and Research Tools.

    Science.gov (United States)

    1997

    This document contains four papers from a symposium on measurement and research tools for human resource development (HRD). "The 'Best Fit' Training: Measure Employee Learning Style Strengths" (Daniel L. Parry) discusses a study of the physiological aspect of sensory intake known as modality, more specifically, modality as measured by…

  5. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  6. Google - Security Testing Tool

    OpenAIRE

    Staykov, Georgi

    2007-01-01

    Using Google as a security testing tool: basic and advanced search techniques using advanced Google search operators. Examples of obtaining control over security cameras, VoIP systems and web servers, and of collecting valuable information such as credit card details and CVV codes – using only Google.

  7. Organisational skills and tools.

    Science.gov (United States)

    Wicker, Paul

    2009-04-01

    While this article mainly applies to practitioners who have responsibilities for leading teams or supervising practitioners, many of the skills and tools described here may also apply to students or junior practitioners. The purpose of this article is to highlight some of the main points about organisation, some of the organisational skills and tools that are available, and some examples of how these skills and tools can be used to make practitioners more effective at organising their workload. It is important to realise that organising work and doing work are two completely different things and shouldn't be mixed up. For example, it would be very difficult to start organising work in the middle of a busy operating list: the organisation of the work must come before the work starts and therefore preparation is often an important first step in organising work. As such, some of the tools and skills described in this article may need to be used hours or even days prior to the actual work taking place.

  8. Nitrogen Trading Tool (NTT)

    Science.gov (United States)

    The Natural Resources Conservation Service (NRCS) recently developed a prototype web-based nitrogen trading tool to facilitate water quality credit trading. The development team has worked closely with the Agriculture Research Service Soil Plant Nutrient Research Unit (ARS-SPNR) and the Environmenta...

  9. Apple Shuns Tracking Tool

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Apple Inc. is advising software developers to stop using a feature in software for its iPhones and iPads that has been linked to privacy concerns, a move that would also take away a widely used tool for tracking users and their behavior. Developers who write programs for Apple's iOS operating system have been using a unique.

  10. Balancing the tools

    DEFF Research Database (Denmark)

    Leroyer, Patrick

    2009-01-01

    The purpose of this article is to describe the potential of a new combination of functions in lexicographic tools for tourists. So far lexicography has focused on the communicative information needs of tourists, i.e. helping tourists decide what to say in a number of specific tourist situations, ...

  11. Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due

    2005-01-01

    in particular. A model of the aspects in the term tectonics – representation, ontology and culture – will be presented and used to discuss the current digital tools’ ability in tectonics. Furthermore it will be discussed what a digital tectonic tool is and could be and how a connection between the digital...

  12. Risk Management Implementation Tool

    Science.gov (United States)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to assess continually what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach the staff how to implement risk management in their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. These steps, and the various methods and tools that go along with them, make identifying and dealing with risk clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertise and advocate the use of RMIT at each NASA center.

  13. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
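
    Several of the reports listed above (notably detectability and test utilization) amount to forward and inverse views of a failure-mode-to-test mapping. The short Python sketch below illustrates that idea under the assumption of a simple in-memory dictionary; the ETA Tool itself reads TEAMS Designer output files, whose format is not reproduced here, and the failure-mode and test names are invented.

        # Illustrative sketch of two report types named above, computed from a hypothetical
        # failure-mode -> detecting-tests map; names and data are assumptions for the example.
        from collections import defaultdict

        detected_by = {
            "valve_stuck_open":  ["pressure_test", "flow_test"],
            "sensor_bias_drift": ["calibration_check"],
            "pump_cavitation":   [],  # an undetected failure mode
        }

        # Detectability report: how (and whether) each failure mode is detected.
        for mode, tests in detected_by.items():
            print(f"{mode}: {', '.join(tests) if tests else 'NOT DETECTED'}")

        # Test utilization report: all failure modes that each test detects.
        utilization = defaultdict(list)
        for mode, tests in detected_by.items():
            for test in tests:
                utilization[test].append(mode)
        for test, modes in utilization.items():
            print(f"{test}: {', '.join(modes)}")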

  14. Tool handling robot system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-10

    As an example of industrial automation equipment delivered by Meidensha Corp., the paper introduces a tool handling robot system. The system is a tool handling robot for a case-processing FMS (flexible manufacturing system). It exchanges tools automatically according to "From To" orders issued by the managing computer, using a ceiling-running robot that travels between five horizontal machining centers and a collective tool stocker holding more than 800 tools. The structure of the system is as follows: tool handling robot (MHR-400), robot controller (meirocs-F), tool hand, robot running unit, tool stocker (for 844 tools), five life tool exchange trucks, tool truck lifting unit, and system control panel. (NEDO)
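
    As a rough illustration of the "From To" ordering mentioned above, the Python sketch below queues hypothetical tool-move orders and dispatches them in sequence; the Order type, station names, and first-in-first-out dispatch are assumptions made for this example and do not describe the Meidensha system's actual interfaces.

        # Rough sketch of queuing and dispatching "From To" tool-move orders; the Order
        # type, station names, and queue discipline are illustrative assumptions only.
        from collections import deque
        from dataclasses import dataclass

        @dataclass
        class Order:
            tool_id: str
            source: str       # e.g. a slot in the collective tool stocker
            destination: str  # e.g. a machining center's tool magazine

        queue = deque([
            Order("T0481", source="stocker/slot-123", destination="MC-3/magazine"),
            Order("T0112", source="MC-1/magazine", destination="stocker/slot-045"),
        ])

        while queue:
            order = queue.popleft()
            # A real controller would command the ceiling-running robot here; we only log the move.
            print(f"move {order.tool_id}: {order.source} -> {order.destination}")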

  15. Genetics for the general internist.

    Science.gov (United States)

    Laukaitis, Christina M

    2012-01-01

    The internist's goal is to determine a patient's disease risk and to implement preventative interventions. Genetic evaluation is a powerful risk assessment tool, and new interventions target previously untreatable genetic disorders. The purpose of this review is to educate the general internist about common genetic conditions affecting adult patients, with special emphasis on diagnoses with an effective intervention, including hereditary cancer syndromes and cardiovascular disorders. Basic tenets of genetic counseling, complex genetic disease, and management of adults with genetic diagnoses also are discussed. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Keeping you safe by making machine tools safe

    CERN Multimedia

    2012-01-01

    CERN’s third safety objective for 2012 concerns the safety of equipment - and machine tools in particular. There are three prerequisites for ensuring that a machine tool can be used safely: the machine tool must comply with Directive 2009/104/EC, the layout of the workshop must be compliant, and everyone who uses the machine tool must be trained. Provided these conditions are met, the workshop head can grant authorisation to use the machine tool. To fulfil this objective, an inventory of the machine tools must be drawn up and the people responsible for them identified. The HSE Unit's Safety Inspection Service produces compliance reports for the machine tools. In order to meet the third objective set by the Director-General, the section has doubled its capacity to carry out inspections: ...

  17. General Nuclear Medicine

    Science.gov (United States)

    Nuclear medicine imaging uses small amounts of ... What is General Nuclear Medicine? Nuclear medicine is a branch of medical ...

  18. General Ultrasound Imaging

    Science.gov (United States)

    Ultrasound imaging uses sound waves to produce pictures ... What is General Ultrasound Imaging? Ultrasound is safe and painless, and produces ...

  19. California General Plans Rural

    Data.gov (United States)

    California Department of Resources — We undertook the creation of the first-ever seamless statewide General Plan map for California. All county general plans and many city general plans were integrated into 1...

  20. California General Plans

    Data.gov (United States)

    California Department of Resources — We undertook the creation of the first-ever seamless statewide General Plan map for California. All county general plans and many city general plans were integrated into 1...