WorldWideScience

Sample records for general data-reduction tool

  1. Bats: A new tool for AMS data reduction

    International Nuclear Information System (INIS)

    Wacker, L.; Christl, M.; Synal, H.-A.

    2010-01-01

    A data evaluation program was developed at ETH Zurich to meet the requirements of the new compact AMS systems MICADAS and TANDY in addition to the large EN-Tandem accelerator. The program, called 'BATS', is designed to automatically calculate standard- and blank-corrected results for measured samples. After almost one year of routine operation with the MICADAS C-14 system, BATS has proven to be an easy-to-use data reduction tool that requires minimal user input. Here we present the fundamental principle and the algorithms used in BATS for standard-sized radiocarbon measurements.
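
    The normalization step that BATS automates can be illustrated with a short sketch; the function below and the nominal standard value are illustrative assumptions, not the actual BATS implementation.

    ```python
    # Hypothetical sketch of standard and blank correction for AMS
    # radiocarbon data; names and values are illustrative only.
    def normalize_f14c(sample_ratio, blank_ratio, standard_ratio,
                       standard_nominal=1.3407):
        """Return a blank- and standard-corrected F14C value.

        The three ratios are measured 14C/12C ratios for the unknown
        sample, a processing blank, and a known standard (assumed here
        to be OxII-like with a nominal fraction-modern value).
        """
        corrected_sample = sample_ratio - blank_ratio
        corrected_standard = standard_ratio - blank_ratio
        return standard_nominal * corrected_sample / corrected_standard

    print(f"{normalize_f14c(1.02e-12, 0.5e-14, 1.35e-12):.4f}")
    ```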

  2. Bats: A new tool for AMS data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Wacker, L., E-mail: Wacker@phys.ethz.c [Ion Beam Physics, ETH Zurich (Switzerland)]; Christl, M.; Synal, H.-A. [Ion Beam Physics, ETH Zurich (Switzerland)]

    2010-04-15

    A data evaluation program was developed at ETH Zurich to meet the requirements of the new compact AMS systems MICADAS and TANDY in addition to the large EN-Tandem accelerator. The program, called 'BATS', is designed to automatically calculate standard- and blank-corrected results for measured samples. After almost one year of routine operation with the MICADAS C-14 system, BATS has proven to be an easy-to-use data reduction tool that requires minimal user input. Here we present the fundamental principle and the algorithms used in BATS for standard-sized radiocarbon measurements.

  3. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    Science.gov (United States)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular, the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to line databases and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the inter-operability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can directly be used by general VO tools.

  4. sTools - a data reduction pipeline for the GREGOR Fabry-Pérot Interferometer and the High-resolution Fast Imager at the GREGOR solar telescope

    Science.gov (United States)

    Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.

    2017-10-01

    A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and since 2016 with the High-resolution Fast Imager (HiFI). These data are processed in standardized procedures with the aim of providing science-ready data for the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called "sTools", based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.

  5. GumTree: Data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, Hugh [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]. E-mail: hrz@ansto.gov.au; Hathaway, Paul [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]; Hauser, Nick [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]; Fei, Yang [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]; Franceschini, Ferdi [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]; Lam, Tony [Bragg Institute, Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]

    2006-11-15

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation.

  6. GumTree: Data reduction

    International Nuclear Information System (INIS)

    Rayner, Hugh; Hathaway, Paul; Hauser, Nick; Fei, Yang; Franceschini, Ferdi; Lam, Tony

    2006-01-01

    Access to software tools for interactive data reduction, visualisation and analysis during a neutron scattering experiment enables instrument users to make informed decisions regarding the direction and success of their experiment. ANSTO aims to enhance the experiment experience of its facility's users by integrating these data reduction tools with the instrument control interface for immediate feedback. GumTree is a software framework and application designed to support an Integrated Scientific Experimental Environment, for concurrent access to instrument control, data acquisition, visualisation and analysis software. The Data Reduction and Analysis (DRA) module is a component of the GumTree framework that allows users to perform data reduction, correction and basic analysis within GumTree while an experiment is running. It is highly integrated with GumTree, able to pull experiment data and metadata directly from the instrument control and data acquisition components. The DRA itself uses components common to all instruments at the facility, providing a consistent interface. It features familiar ISAW-based 1D and 2D plotting, an OpenGL-based 3D plotter and peak fitting performed by fityk. This paper covers the benefits of integration, the flexibility of the DRA module, ease of use for the interface and audit trail generation.

  7. The Panchromatic High-Resolution Spectroscopic Survey of Local Group Star Clusters. I. General data reduction procedures for the VLT/X-shooter UVB and VIS arm

    Science.gov (United States)

    Schönebeck, Frederik; Puzia, Thomas H.; Pasquali, Anna; Grebel, Eva K.; Kissler-Patig, Markus; Kuntschner, Harald; Lyubenova, Mariya; Perina, Sibilla

    2014-12-01

    Aims: Our dataset contains spectroscopic observations of 29 globular clusters in the Magellanic Clouds and the Milky Way performed with VLT/X-shooter over eight full nights. To derive robust results, instrument and pipeline systematics have to be well understood and properly modeled. We aim at a consistent data reduction procedure with an accurate understanding of the measurement accuracy limitations. Here we present detailed data reduction procedures for the VLT/X-shooter UVB and VIS arm. These are not restricted to our particular dataset, but are generally applicable to different kinds of X-shooter data without major limitation on the astronomical object of interest. Methods: ESO's X-shooter pipeline (v1.5.0) performs well and reliably for the wavelength calibration and the associated rectification procedure, yet we find several weaknesses in the reduction cascade that are addressed with additional calibration steps, such as bad pixel interpolation, flat fielding, and slit illumination corrections. Furthermore, the instrumental PSF is analytically modeled and used to reconstruct flux losses at slit transit. This also forms the basis for an optimal extraction of point sources out of the two-dimensional pipeline product. Regular observations of spectrophotometric standard stars obtained from the X-shooter archive allow us to detect instrumental variability, which needs to be understood if a reliable absolute flux calibration is desired. Results: A cascade of additional custom calibration steps is presented that allows for an absolute flux calibration uncertainty of ≲10% under virtually every observational setup, provided that the signal-to-noise ratio is sufficiently high. The optimal extraction increases the signal-to-noise ratio typically by a factor of 1.5, while simultaneously correcting for resulting flux losses. The wavelength calibration is found to be accurate to an uncertainty level of Δλ ≃ 0.02 Å. Conclusions: We find that most of the X
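
    The optimal extraction step described above follows the general inverse-variance-weighted scheme of Horne (1986); the sketch below is a generic illustration of that technique with synthetic data, not the authors' code.

    ```python
    import numpy as np

    def optimal_extract(frame, var, profile):
        """Horne-style optimal extraction along the slit (axis 0).
        frame, var, profile: 2D arrays (spatial x wavelength);
        profile is the normalized spatial PSF (each column sums to 1)."""
        w = profile / var                                  # inverse-variance weights
        flux = (w * frame).sum(axis=0) / (w * profile).sum(axis=0)
        flux_var = profile.sum(axis=0) / (w * profile).sum(axis=0)
        return flux, flux_var

    # Toy demonstration with a Gaussian spatial profile and flat spectrum.
    nspat, nwave = 21, 100
    x = np.arange(nspat)[:, None]
    profile = np.exp(-0.5 * ((x - 10) / 2.0) ** 2)
    profile /= profile.sum(axis=0)
    profile = np.repeat(profile, nwave, axis=1)
    truth = 100.0 * profile
    frame = truth + np.random.default_rng(0).normal(0, 1.0, truth.shape)
    flux, flux_var = optimal_extract(frame, np.ones_like(frame), profile)
    print(flux.mean())                                     # close to 100
    ```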

  8. LOFT data reduction

    International Nuclear Information System (INIS)

    Norman, N.L.

    1975-08-01

    The Loss-of-Fluid Test (LOFT) Facility is an experimental facility built around a "scaled" version of a large pressurized water reactor (LPWR). LOFT will be used to run loss-of-coolant experiments (LOCEs) and to acquire the necessary data required "to evaluate the adequacy and improve the analytical methods currently used to predict the loss-of-coolant accident (LOCA) response of LPWRs" and "to identify and investigate any unexpected event(s) or threshold(s) in the response of either the plant or the engineered safety features and develop analytical techniques that adequately describe and account for the unexpected behavior(s)". During the LOCE this required data will be acquired and recorded in both analog and digital modes. Subsequent to the test the analog data will also be converted to the raw digital mode. This raw digital data will be converted to the desired engineering units using the LOFT Data Reduction System. This system is implemented on the IBM 360/75 and is a part of a commercially available data processing program called MAC/RAN III. The theory of reducing LOFT data to engineering units and the application of the MAC/RAN III system to accomplish this reduction is given. (auth)
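
    The conversion of raw digital counts to engineering units is typically a per-channel polynomial calibration; the sketch below is a generic stand-in for that step, with invented channel names and coefficients (not the MAC/RAN III tables).

    ```python
    import numpy as np

    # Illustrative per-channel calibration polynomials, coefficients in
    # increasing order (a0 + a1*x + ...); values are invented.
    CALIBRATION = {
        "TE-001": [0.0, 0.25],       # degC = 0.0 + 0.25 * counts
        "PE-003": [-14.7, 0.012],    # psig
    }

    def to_engineering_units(channel, counts):
        """Map raw ADC counts to engineering units for one channel."""
        return np.polynomial.polynomial.polyval(np.asarray(counts),
                                                CALIBRATION[channel])

    print(to_engineering_units("TE-001", [0, 512, 1024]))  # [0. 128. 256.]
    ```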

  9. A business intelligence approach using web search tools and online data reduction techniques to examine the value of product-enabled services

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Liotta, Giacomo; Kleismantas, Andrius

    2015-01-01

    …in Canada and Europe. It adopts an innovative methodology based on online textual data that could be implemented in advanced business intelligence tools aiming at the facilitation of innovation, marketing and business decision making. Combinations of keywords referring to different aspects of service value…-service innovation as a competitive advantage on the marketplace. On the other hand, the focus of EU firms on innovative hybrid offerings is not explicitly related to business differentiation and competitiveness.

  10. Robust methods for data reduction

    CERN Document Server

    Farcomeni, Alessio

    2015-01-01

    Robust Methods for Data Reduction gives a non-technical overview of robust data reduction techniques, encouraging the use of these important and useful methods in practical applications. The main areas covered include principal components analysis, sparse principal component analysis, canonical correlation analysis, factor analysis, clustering, double clustering, and discriminant analysis. The first part of the book illustrates how dimension reduction techniques synthesize available information by reducing the dimensionality of the data. The second part focuses on cluster and discriminant analysis.

  11. ESO Reflex: a graphical workflow engine for data reduction

    Science.gov (United States)

    Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo

    ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services, and experiments using ESO Reflex to access Virtual Observatory web services have been performed successfully. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.

  12. Parallel Enhancements of the General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The General Mission Analysis Tool (GMAT) is a state-of-the-art spacecraft mission design tool under active development at NASA's Goddard Space Flight Center (GSFC)…

  13. Peak Wind Tool for General Forecasting

    Science.gov (United States)

    Barrett, Joe H., III

    2010-01-01

    The expected peak wind speed of the day is an important forecast element in the 45th Weather Squadron's (45 WS) daily 24-Hour and Weekly Planning Forecasts. The forecasts are used for ground and space launch operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45 WS also issues wind advisories for KSC/CCAFS when they expect wind gusts to meet or exceed 25 kt, 35 kt and 50 kt thresholds at any level from the surface to 300 ft. The 45 WS forecasters have indicated peak wind speeds are challenging to forecast, particularly in the cool season months of October - April. In Phase I of this task, the Applied Meteorology Unit (AMU) developed a tool to help the 45 WS forecast non-convective winds at KSC/CCAFS for the 24-hour period of 0800 to 0800 local time. The tool was delivered as a Microsoft Excel graphical user interface (GUI). The GUI displayed the forecast of peak wind speed, 5-minute average wind speed at the time of the peak wind, timing of the peak wind and probability the peak speed would meet or exceed 25 kt, 35 kt and 50 kt. For the current task (Phase II), the 45 WS requested additional observations be used for the creation of the forecast equations by expanding the period of record (POR). Additional parameters were evaluated as predictors, including wind speeds between 500 ft and 3000 ft, static stability classification, Bulk Richardson Number, mixing depth, vertical wind shear, temperature inversion strength and depth and wind direction. Using a verification data set, the AMU compared the performance of the Phase I and II prediction methods. Just as in Phase I, the tool was delivered as a Microsoft Excel GUI. The 45 WS requested the tool also be available in the Meteorological Interactive Data Display System (MIDDS). The AMU first expanded the POR by two years by adding tower observations, surface observations and CCAFS (XMR) soundings for the cool season months of March 2007 to April 2009. The POR was expanded
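
    A minimal sketch of how threshold-exceedance probabilities like those in the GUI could be estimated from sounding-derived predictors; the predictors and the synthetic training data below are assumptions for illustration, not the AMU's actual forecast equations.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Columns stand in for e.g. low-level wind speed, shear, stability.
    X = rng.normal(size=(500, 3))
    peak = 25 + 6 * X[:, 0] + 3 * X[:, 1] + rng.normal(0, 4, 500)
    y = (peak >= 35).astype(int)          # did gusts meet the 35 kt threshold?

    model = LogisticRegression().fit(X, y)
    print(f"P(peak >= 35 kt) = {model.predict_proba(X[:1])[0, 1]:.2f}")
    ```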

  14. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  15. General practice ethnicity data: evaluation of a tool

    Directory of Open Access Journals (Sweden)

    Neuwelt P

    2014-03-01

    INTRODUCTION: There is evidence that the collection of ethnicity data in New Zealand primary care is variable and that data recording in practices does not always align with the procedures outlined in the Ethnicity Data Protocols for the Health and Disability Sector. In 2010, the Ministry of Health funded the development of a tool to audit the collection of ethnicity data in primary care. The aim of this study was to pilot the Ethnicity Data Audit Tool (EAT) in general practice. The goal was to evaluate the tool and identify recommendations for its improvement. METHODS: Eight general practices in the Waitemata District Health Board region participated in the EAT pilot. Feedback about the pilot process was gathered by questionnaires and interviews, to gain an understanding of practices’ experiences in using the tool. Questionnaire and interview data were analysed using a simple analytical framework and a general inductive method. FINDINGS: General practice receptionists, practice managers and general practitioners participated in the pilot. Participants found the pilot process challenging but enlightening. The majority felt that the EAT was a useful quality improvement tool for handling patient ethnicity data. Larger practices were the most positive about the tool. CONCLUSION: The findings suggest that, with minor improvements to the toolkit, the EAT has the potential to lead to significant improvements in the quality of ethnicity data collection and recording in New Zealand general practices. Other system-level factors also need to be addressed.

  16. Training generalized improvisation of tools by preschool children

    Science.gov (United States)

    Parsonson, Barry S.; Baer, Donald M.

    1978-01-01

    The development of new, “creative” behaviors was examined in a problem-solving context. One form of problem solving, improvisation, was defined as finding a substitute to replace the specifically designated, but currently unavailable, tool ordinarily used to solve the problem. The study examined whether preschool children spontaneously displayed generalized improvisation skills, and if not, whether they could be trained to do so within different classes of tools. Generalization across different tool classes was monitored but not specifically trained. Five preschool children participated in individual sessions that first probed their skill at improvising tools, and later trained and probed generalized improvisation in one or more of three tool classes (Hammers, Containers, and Shoelaces), using a multiple-baseline design. All five children were trained with Hammers, two were trained in two classes, and two were trained in all three tool classes. Four of the five children improvised little in Baseline. During Training, all five showed increased generalized improvisation within the trained class, but none across classes. Tools fabricated by item combinations were rare in Baseline, but common in Training. Follow-up probes showed that the training effects were durable. PMID:16795596

  17. Automated data reduction for optical interferometric data

    International Nuclear Information System (INIS)

    Boyd, R.D.; Miller, D.J.; Ghiglia, D.C.

    1983-01-01

    The potential for significant progress in understanding many transport processes exists through the use of a rapid and automated data reduction process of optical interferometric data. An example involving natural convection in a horizontal annulus is used to demonstrate that the accuracy possible in automated techniques is better than 99.0%.

  18. ORAC-DR: A generic data reduction pipeline infrastructure

    Science.gov (United States)

    Jenness, Tim; Economou, Frossie

    2015-03-01

    ORAC-DR is a general purpose data reduction pipeline system designed to be instrument and observatory agnostic. The pipeline works with instruments as varied as infrared integral field units, imaging arrays and spectrographs, and sub-millimeter heterodyne arrays and continuum cameras. This paper describes the architecture of the pipeline system and the implementation of the core infrastructure. We finish by discussing the lessons learned since the initial deployment of the pipeline system in the late 1990s.
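
    The instrument-agnostic design described above pairs generic driver logic with ordered lists of reduction steps; the toy model below illustrates that recipe/primitive pattern in Python (names invented; this is not the ORAC-DR API).

    ```python
    # Minimal recipe/primitive pipeline model: each primitive transforms a
    # frame, and a recipe is just an ordered list of primitives, so the
    # same driver can serve any instrument given a suitable recipe.
    def subtract_dark(frame, cal):
        frame["data"] = frame["data"] - cal["dark"]
        return frame

    def flat_field(frame, cal):
        frame["data"] = frame["data"] / cal["flat"]
        return frame

    RECIPE = [subtract_dark, flat_field]

    def reduce(frame, cal, recipe=RECIPE):
        for primitive in recipe:
            frame = primitive(frame, cal)
        return frame

    print(reduce({"data": 10.0}, {"dark": 2.0, "flat": 2.0})["data"])  # 4.0
    ```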

  19. Automating OSIRIS Data Reduction for the Keck Observatory Archive

    Science.gov (United States)

    Holt, J.; Tran, H. D.; Goodrich, R.; Berriman, G. B.; Gelino, C. R.; KOA Team

    2014-05-01

    By the end of 2013, the Keck Observatory Archive (KOA) will serve data from all active instruments on the Keck Telescopes. OSIRIS (OH-Suppressing Infra-Red Imaging Spectrograph), the last active instrument to be archived in KOA, has been in use behind the adaptive optics (AO) system at Keck since February 2005. It uses an array of tiny lenslets to simultaneously produce spectra at up to 4096 locations. Due to the complicated nature of the OSIRIS raw data, the OSIRIS team developed a comprehensive data reduction program. This data reduction system has an online mode for quick real-time reductions, which are used primarily for basic data visualization and quality assessment done at the telescope while observing. The offline version of the data reduction system includes an expanded reduction method list, does more iterations for a better construction of the data cubes, and is used to produce publication-quality products. It can also use reconstruction matrices that are developed after the observations were taken, and are more refined. The KOA team is currently utilizing the standard offline reduction mode to produce quick-look browse products for the raw data. Users of the offline data reduction system generally use a graphical user interface to manually setup the reduction parameters. However, in order to reduce and serve the 200,000 science files on disk, all of the reduction parameters and steps need to be fully automated. This pipeline will also be used to automatically produce quick-look browse products for future OSIRIS data after each night's observations. Here we discuss the complexities of OSIRIS data, the reduction system in place, methods for automating the system, performance using virtualization, and progress made to date in generating the KOA products.

  20. Time varying, multivariate volume data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [Los Alamos National Laboratory; Fout, Nathaniel [UC DAVIS; Ma, Kwan - Liu [UC DAVIS

    2010-01-01

    Large-scale supercomputing is revolutionizing the way science is conducted. A growing challenge, however, is understanding the massive quantities of data produced by large-scale simulations. The data, typically time-varying, multivariate, and volumetric, can occupy from hundreds of gigabytes to several terabytes of storage space. Transferring and processing volume data of such sizes is prohibitively expensive and resource intensive. Although it may not be possible to entirely alleviate these problems, data compression should be considered as part of a viable solution, especially when the primary means of data analysis is volume rendering. In this paper we present our study of multivariate compression, which exploits correlations among related variables, for volume rendering. Two configurations for multidimensional compression based on vector quantization are examined. We emphasize quality reconstruction and interactive rendering, which leads us to a solution using graphics hardware to perform on-the-fly decompression during rendering. In this paper we present a solution which addresses the need for data reduction in large supercomputing environments where data resulting from simulations occupies tremendous amounts of storage. Our solution employs a lossy encoding scheme to achieve data reduction with several options in terms of rate-distortion behavior. We focus on encoding of multiple variables together, with optional compression in space and time. The compressed volumes can be rendered directly with commodity graphics cards at interactive frame rates and rendering quality similar to that of static volume renderers. Compression results using a multivariate time-varying data set indicate that encoding multiple variables results in acceptable performance in the case of spatial and temporal encoding as compared to independent compression of variables. The relative performance of spatial vs. temporal compression is data dependent, although temporal compression has the
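
    Vector quantization of a multivariate volume can be sketched with k-means: each voxel's variables form one vector, a small codebook replaces the raw values, and only one index per voxel is stored. The block shape and codebook size below are illustrative assumptions, not the paper's configuration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    volume = rng.random((16, 16, 16, 3)).astype(np.float32)  # 3 variables
    vectors = volume.reshape(-1, 3)

    kmeans = KMeans(n_clusters=64, n_init=1, random_state=0).fit(vectors)
    codebook = kmeans.cluster_centers_           # 64 x 3 floats
    indices = kmeans.labels_.astype(np.uint8)    # one byte per voxel

    decoded = codebook[indices].reshape(volume.shape)
    print("RMS error:", np.sqrt(((decoded - volume) ** 2).mean()))
    ```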

  1. QUICKSILVER - A general tool for electromagnetic PIC simulation

    International Nuclear Information System (INIS)

    Seidel, David B.; Coats, Rebecca S.; Johnson, William A.; Kiefer, Mark L.; Mix, L. Paul; Pasik, Michael F.; Pointon, Timothy D.; Quintenz, Jeffrey P.; Riley, Douglas J.; Turner, C. David

    1997-01-01

    The dramatic increase in computational capability that has occurred over the last ten years has allowed fully electromagnetic simulations of large, complex, three-dimensional systems to move progressively from impractical, to expensive, and recently, to routine and widespread. This is particularly true for systems that require the motion of free charge to be self-consistently treated. The QUICKSILVER electromagnetic Particle-In-Cell (EM-PIC) code has been developed at Sandia National Laboratories to provide a general tool to simulate a wide variety of such systems. This tool has found widespread use for many diverse applications, including high-current electron and ion diodes, magnetically insulated power transmission systems, high-power microwave oscillators, high-frequency digital and analog integrated circuit packages, microwave integrated circuit components, antenna systems, radar cross-section applications, and electromagnetic interaction with biological material. This paper will give a brief overview of QUICKSILVER and provide some thoughts on its future development.

  2. Data reduction in cascade impactor and sedimentation battery

    International Nuclear Information System (INIS)

    Boulaud, Denis; Diouri, Mohamed.

    1982-07-01

    The determination of the mass distribution of an aerosol from data collected by a cascade impactor or a sedimentation battery implies the size characterization of each impactor stage or each battery length. In the case of the impactor, four data reduction methods were compared: Preining's and Picknett's methods, a simulation method, and the well-known effective cut-off size method. A theoretical simulation showed that both the simulation and Picknett's methods were the best adapted to recovering a mass distribution, with an uncertainty not exceeding 5% for the mass median diameter and 10% for the standard deviation. In the case of the sedimentation battery, a new method was developed allowing data reduction when the analytical shape of the size distribution is known. A theoretical simulation was carried out in order to test our method. The test showed that this method was also adapted to recovering the distribution shape; however, the size range covered by the sedimentation battery is generally smaller than that of the impactor. [fr]
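
    One common form of this data reduction is fitting a lognormal mass distribution to the cumulative mass fractions collected below each stage's effective cut-off diameter; the sketch below illustrates that generic approach with invented stage data, not the paper's specific methods.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Invented effective cut-off diameters (um) and cumulative mass fractions.
    d50 = np.array([0.4, 0.7, 1.1, 2.1, 3.3, 4.7, 5.8, 9.0])
    cum_frac = np.array([0.05, 0.12, 0.25, 0.48, 0.66, 0.80, 0.88, 0.97])

    def lognormal_cdf(d, mmd, gsd):
        """Cumulative lognormal mass distribution with mass median
        diameter mmd and geometric standard deviation gsd."""
        return norm.cdf(np.log(d / mmd) / np.log(gsd))

    (mmd, gsd), _ = curve_fit(lognormal_cdf, d50, cum_frac, p0=(2.0, 2.0))
    print(f"mass median diameter = {mmd:.2f} um, GSD = {gsd:.2f}")
    ```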

  3. ESO Reflex: A Graphical Workflow Engine for Data Reduction

    Science.gov (United States)

    Hook, R.; Romaniello, M.; Péron, M.; Ballester, P.; Gabasch, A.; Izzo, C.; Ullgrén, M.; Maisala, S.; Oittinen, T.; Solin, O.; Savolainen, V.; Järveläinen, P.; Tyynelä, J.

    2008-08-01

    Sampo {http://www.eso.org/sampo} (Hook et al. 2005) is a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal is to assess the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Those prototypes will not only be used to validate concepts and understand requirements but will also be tools of immediate value for the community. Most of the raw data produced by ESO instruments can be reduced using CPL {http://www.eso.org/cpl} recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library. Currently reduction recipes are run in batch mode as part of the data flow system to generate the input to the ESO VLT/VLTI quality control process and are also made public for external users. Sampo has developed a prototype application called ESO Reflex {http://www.eso.org/sampo/reflex/} that integrates a graphical user interface and existing data reduction algorithms. ESO Reflex can invoke CPL-based recipes in a flexible way through a dedicated interface. ESO Reflex is based on the graphical workflow engine Taverna {http://taverna.sourceforge.net} that was originally developed by the UK eScience community, mostly for work in the life sciences. Workflows have been created so far for three VLT/VLTI instrument modes (VIMOS/IFU {http://www.eso.org/instruments/vimos/}, FORS spectroscopy {http://www.eso.org/instruments/fors/} and AMBER {http://www.eso.org/instruments/amber/}), and the easy-to-use GUI allows the user to make changes to these or create workflows of their own. Python scripts and IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available.

  4. CMS Analysis and Data Reduction with Apache Spark

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Oliver [Fermilab; Canali, Luca [CERN; Cremer, Illia [Magnetic Corp., Waltham; Cremonesi, Matteo [Fermilab; Elmer, Peter [Princeton U.; Fisk, Ian [Flatiron Inst., New York; Girone, Maria [CERN; Jayatilaka, Bo [Fermilab; Kowalkowski, Jim [Fermilab; Khristenko, Viktor [CERN; Motesnitsalis, Evangelos [CERN; Pivarski, Jim [Princeton U.; Sehrish, Saba [Fermilab; Surdy, Kacper [CERN; Svyatkovskiy, Alexey [Princeton U.

    2017-10-31

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping in reducing the cost of ownership for the end-users. In this talk, we are presenting studies of using Apache Spark for end user data analysis. We are studying the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. Studying the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We are presenting the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. Studying the second thrust, we are presenting studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
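
    The reduction thrust amounts to a distributed skim: apply selection cuts and keep only the columns an analysis needs. The PySpark sketch below illustrates the pattern; the input path, schema, and cuts are invented for the example, not the CMS facility's configuration.

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("cms-style-skim").getOrCreate()

    # Hypothetical flat event table; column names are assumptions.
    events = spark.read.parquet("/data/events.parquet")

    skim = (events
            .where(F.col("met") > 200)               # Dark-Matter-style cut
            .select("run", "lumi", "event", "met", "jet_pt"))

    # Write the much smaller ntuple-like output for end analysis.
    skim.write.mode("overwrite").parquet("/data/skim.parquet")
    ```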

  5. Generalized Aliasing as a Basis for Program Analysis Tools

    National Research Council Canada - National Science Library

    O'Callahan, Robert

    2000-01-01

    This dissertation describes the design of a system, Ajax, that addresses this problem by using semantics-based program analysis as the basis for a number of different tools to aid Java programmers…

  6. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    …or linearity. The generalizations do not prescribe a particular smoothing technique. In fact, when the smoother is replaced by a linear regression the generalizations reduce to close approximations of SACF and SPACF. For this reason a smooth transition from the linear to the non-linear case can be obtained…

  7. Exposure Assessment Tools by Lifestages and Populations - General Population

    Science.gov (United States)

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.

  8. Fault-Tolerant NDE Data Reduction Framework, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A distributed fault-tolerant nondestructive evaluation (NDE) data reduction framework is proposed in which large NDE datasets are mapped to thousands to millions of…

  9. DNA – A General Energy System Simulation Tool

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to be able to handle models of any kind of energy system based on the control volume approach, usually systems of lumped parameter components. DNA has proven to be a useful tool in the analysis and optimization of several types of thermal systems: steam turbines, gas turbines, fuel cells, gasification, refrigeration and heat pumps for both conventional fossil fuels and different types of biomass. DNA is applicable for models of both steady state and dynamic operation. The program decides at runtime to apply the DAE solver if the system contains differential equations. This makes it easy to extend an existing steady state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also…

  10. The DEEP-South: Scheduling and Data Reduction Software System

    Science.gov (United States)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective for the DEEP-South is physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data Reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction, and analysis of huge amounts of data with minimal human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using the Database Management System (DBMS). The LDR is designed to detect moving objects from CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on analysis made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at other KMTNet stations. At the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.

  11. Principal Components as a Data Reduction and Noise Reduction Technique

    Science.gov (United States)

    Imhoff, M. L.; Campbell, W. J.

    1982-01-01

    The potential of principal components as a pipeline data reduction technique for thematic mapper data was assessed, and principal components analysis and its transformation were examined as a noise reduction technique. Two primary factors were considered: (1) how data reduction and noise reduction using the principal components transformation might affect the extraction of accurate spectral classifications; and (2) what the real savings are, in terms of computer processing and storage costs, of using reduced data over the full 7-band TM complement. An area in central Pennsylvania was chosen as the study area. The image data for the project were collected using the Earth Resources Laboratory's thematic mapper simulator (TMS) instrument.
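
    Both uses of the transformation can be sketched in a few lines: project the 7-band pixels onto the leading components for data reduction, and reconstruct from those components alone for noise reduction. The synthetic pixel array below is an illustrative assumption, not the study's TMS data.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    pixels = rng.random((10000, 7)).astype(np.float32)   # rows = pixels, 7 bands

    pca = PCA(n_components=3).fit(pixels)
    reduced = pca.transform(pixels)                      # 7 bands -> 3 components
    print("variance retained:", pca.explained_variance_ratio_.sum())

    # Noise reduction variant: reconstruct from the leading components only,
    # discarding the low-variance components that carry mostly noise.
    denoised = pca.inverse_transform(reduced)
    ```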

  12. Adaptive radial basis function mesh deformation using data reduction

    Science.gov (United States)

    Gillebaart, T.; Blom, D. S.; van Zuijlen, A. H.; Bijl, H.

    2016-09-01

    Radial Basis Function (RBF) mesh deformation is one of the most robust mesh deformation methods available. Using the greedy (data reduction) method in combination with an explicit boundary correction results in an efficient method, as shown in literature. However, to ensure the method remains robust, two issues are addressed: 1) how to ensure that the set of control points remains an accurate representation of the geometry in time and 2) how to use/automate the explicit boundary correction, while ensuring a high mesh quality. In this paper, we propose an adaptive RBF mesh deformation method, which ensures the set of control points always represents the geometry/displacement up to a certain (user-specified) criterion, by keeping track of the boundary error throughout the simulation and re-selecting when needed. As opposed to the unit displacement and prescribed displacement selection methods, the adaptive method is more robust, user-independent and efficient for the cases considered. Secondly, the analysis of a single high aspect ratio cell is used to formulate an equation for the correction radius needed, depending on the characteristics of the correction function used, maximum aspect ratio, minimum first cell height and boundary error. Based on the analysis two new radial basis correction functions are derived and proposed. This proposed automated procedure is verified while varying the correction function, Reynolds number (and thus first cell height and aspect ratio) and boundary error. Finally, the parallel efficiency is studied for the two adaptive methods, unit displacement and prescribed displacement for both the CPU and the memory formulations with a 2D oscillating and translating airfoil with oscillating flap, a 3D flexible locally deforming tube and deforming wind turbine blade. Generally, the memory formulation requires less work (due to the large amount of work required for evaluating RBF's), but the parallel efficiency reduces due to the limited
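
    The greedy data reduction idea can be sketched in a few lines: repeatedly add the boundary node with the largest interpolation error until the RBF fit reproduces the boundary displacement within a tolerance. The 2D toy data, kernel choice, and seed points below are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(2)
    boundary = rng.random((200, 2))                   # boundary node coordinates
    displacement = np.sin(boundary[:, :1]) * np.array([[0.10, 0.05]])

    tol = 1e-3
    selected = [0, 1, 2]                              # seed control points
    while True:
        rbf = RBFInterpolator(boundary[selected], displacement[selected],
                              kernel="thin_plate_spline")
        err = np.linalg.norm(rbf(boundary) - displacement, axis=1)
        worst = int(err.argmax())
        if err[worst] < tol:                          # boundary error criterion met
            break
        selected.append(worst)                        # greedy re-selection

    print(f"{len(selected)} of {len(boundary)} nodes kept as control points")
    ```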

  13. My Family Health Portrait, A tool from the Surgeon General | NIH MedlinePlus the Magazine

    Science.gov (United States)

    Why is it important to know my family medical history? Your family medical history is a …

  14. Automated multispectra alpha spectrometer and data reduction system

    International Nuclear Information System (INIS)

    Hochel, R.C.

    1975-12-01

    A complete hardware and software package for the accumulation and rapid analysis of multiple alpha pulse height spectra has been developed. The system utilizes a 4096-channel analyzer capable of accepting up to sixteen inputs from solid-state surface barrier detectors via mixer-router modules. The analyzer is interfaced to a desk-top programmable calculator and thermal line printer. A chained software package including spectrum printout, peak analysis, plutonium-238 and plutonium-239 data reduction, and automatic energy calibration routines was written. With the chained program a complete printout, peak analysis, and plutonium data reduction of a 512-channel alpha spectrum are obtained in about three minutes with an accuracy within five percent of hand analyses.

  15. Individual and social learning processes involved in the acquisition and generalization of tool use in macaques

    Science.gov (United States)

    Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.

    2012-01-01

    Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424

  16. Data reduction for neutron scattering from plutonium samples. Final report

    International Nuclear Information System (INIS)

    Seeger, P.A.

    1997-01-01

    An experiment performed in August, 1993, on the Low-Q Diffractometer (LQD) at the Manuel Lujan Jr. Neutron Scattering Center (MLNSC) was designed to study the formation and annealing of He bubbles in aged ²³⁹Pu metal. Significant complications arise in the reduction of the data because of the very high total neutron cross section of ²³⁹Pu, and also because the samples are difficult to make uniform and to characterize. This report gives the details of the data and the data reduction procedures, presents the resulting scattering patterns in terms of macroscopic cross section as a function of momentum transfer, and suggests improvements for future experiments.

  17. A psychophysically validated metric for bidirectional texture data reduction

    Czech Academy of Sciences Publication Activity Database

    Filip, Jiří; Chantler, M.J.; Green, P.R.; Haindl, Michal

    2008-01-01

    Roč. 27, č. 5 (2008), s. 138:1-138:11 ISSN 0730-0301 R&D Projects: GA AV ČR 1ET400750407; GA ČR GA102/08/0593 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bidirectional Texture Functions * texture compression Subject RIV: BD - Theory of Information Impact factor: 3.383, year: 2008 http://library.utia.cas.cz/separaty/2008/RO/haindl-a psychophysically validated metric for bidirectional texture data reduction.pdf

  18. Physical activity in adolescents – Accelerometer data reduction criteria

    DEFF Research Database (Denmark)

    Toftager, Mette; Breum, Lars; Kristensen, Peter Lund

    …and PA outcomes (mean cpm). The following parameters in the data reduction analyses were fixed: 30sec epoch, 24h duration, first registration accepted after 4h, maximum value 20,000cpm, and two activity epochs permitted in blocks of non-wear. Results: Accelerometer data were obtained from a total of 1… 1 valid day of 6h wear time using a 10min non-wear criterion. The corresponding numbers using a 90min non-wear criterion were 20.6% and 99.4%. Lengthening the non-wear period decreases PA level (mean cpm) substantially, e.g. average PA was 641 cpm (5 days of 10h) using the 10min non-wear criterion compared to 570 cpm using 90min non-wear. No systematic differences in PA outcomes were found when comparing the range of days and hours. Discussion: We used a systematic approach to illustrate that even small inconsistencies in accelerometer data reduction can have substantial impact on compliance and PA…
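
    The non-wear criterion at the heart of this comparison can be sketched as a rolling-window rule: flag any window of zero counts at least as long as the non-wear period, permitting a small number of activity epochs, as in the paper's fixed settings. The implementation below is a simplified illustration, not the authors' exact algorithm.

    ```python
    import numpy as np

    def flag_non_wear(counts, epoch_s=30, window_min=60, allowed_nonzero=2):
        """Flag epochs inside candidate non-wear blocks: windows of
        window_min minutes with at most allowed_nonzero non-zero epochs."""
        window = int(window_min * 60 / epoch_s)
        counts = np.asarray(counts)
        non_wear = np.zeros(counts.size, dtype=bool)
        for start in range(counts.size - window + 1):
            block = counts[start:start + window]
            if np.count_nonzero(block) <= allowed_nonzero:
                non_wear[start:start + window] = True
        return non_wear

    # Two brief movement spikes inside a long zero run are still non-wear.
    day = np.r_[np.zeros(130), [60.0, 0.0, 800.0], np.zeros(100)]
    print(f"{flag_non_wear(day, window_min=60).mean():.0%} flagged as non-wear")
    ```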

  19. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and the ongoing activities of a tool for performance analysis of complex real time systems. The tool named CyNC is based on network calculus allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even to systems with cyclic dependencies. Despite the virtues of the current tool, improvements and extensions still remain, which are in focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, which both depend on the possibility of accounting for propagation delay. Since the current version…
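
    The basic network-calculus bounds that such a tool computes can be shown with the textbook case of a token-bucket arrival curve a(t) = b + r·t served by a rate-latency curve s(t) = R·max(t − T, 0): the delay bound is the horizontal deviation T + b/R and the backlog bound the vertical deviation b + r·T, valid for r ≤ R. This is the standard result, not CyNC's more general cyclic-dependency machinery.

    ```python
    def delay_bound(b, r, R, T):
        """Worst-case delay for token-bucket arrivals (burst b, rate r)
        through a rate-latency server (rate R, latency T), r <= R."""
        if r > R:
            raise ValueError("unstable: arrival rate exceeds service rate")
        return T + b / R

    def backlog_bound(b, r, T):
        """Worst-case backlog: the vertical deviation b + r*T."""
        return b + r * T

    # 4 kB bursts at 1 Mbit/s into a 10 Mbit/s server with 2 ms latency.
    print(delay_bound(b=4000 * 8, r=1e6, R=10e6, T=0.002))  # 0.0052 s
    ```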

  20. Final Report for Geometric Analysis for Data Reduction and Structure Discovery DE-FG02-10ER25983, STRIPES award # DE-SC0004096

    Energy Technology Data Exchange (ETDEWEB)

    Vixie, Kevin R. [Washington State Univ., Pullman, WA (United States)

    2014-11-27

    This is the final report for the project "Geometric Analysis for Data Reduction and Structure Discovery", in which insights and tools from geometric analysis were developed and exploited for their potential to address large-scale data challenges.

  1. Selective data reduction in gas chromatography/infrared spectrometry

    International Nuclear Information System (INIS)

    Pyo, Dong Jin; Shin, Hyun Du

    2001-01-01

    As gas chromatography/infrared spectrometry (GC/IR) becomes routinely available, methods must be developed to deal with the large amount of data produced. We demonstrate computer methods that quickly search through a large data file, locating those spectra that display a spectral feature of interest. Based on a modified library search routine, these selective data reduction methods retrieve all or nearly all of the compounds of interest, while rejecting the vast majority of unrelated compounds. To overcome the shifting problem of IR spectra, a search method based on moving the average pattern was designed. In this moving pattern search, the average pattern of a particular functional group was not held stationary, but was allowed to shift slightly to the left and right.
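
    The moving-pattern idea can be sketched as scoring a library pattern against a spectrum at several small shifts and keeping the best score; the similarity metric and shift range below are illustrative assumptions, not the paper's exact routine.

    ```python
    import numpy as np

    def best_match(spectrum, pattern, max_shift=5):
        """Best cosine similarity over small left/right shifts of the
        library average pattern (np.roll wraps at the ends, acceptable
        for a sketch with a centered pattern)."""
        scores = []
        for shift in range(-max_shift, max_shift + 1):
            shifted = np.roll(pattern, shift)
            scores.append(float(np.dot(spectrum, shifted) /
                                (np.linalg.norm(spectrum) *
                                 np.linalg.norm(shifted))))
        return max(scores)

    x = np.arange(100)
    pattern = np.exp(-0.5 * ((x - 50) / 3.0) ** 2)   # library average pattern
    spectrum = np.roll(pattern, 3) + np.random.default_rng(0).normal(0, 0.01, 100)
    print(f"match score: {best_match(spectrum, pattern):.3f}")   # near 1.0
    ```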

  2. The JCMT Transient Survey: Data Reduction and Calibration Methods

    Energy Technology Data Exchange (ETDEWEB)

    Mairs, Steve; Lane, James [Department of Physics and Astronomy, University of Victoria, Victoria, BC, V8P 1A1 (Canada); Johnstone, Doug; Kirk, Helen [NRC Herzberg Astronomy and Astrophysics, 5071 West Saanich Road, Victoria, BC, V9E 2E7 (Canada); Lacaille, Kevin; Chapman, Scott [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, NS, B3H 4R2 (Canada); Bower, Geoffrey C. [Academia Sinica Institute of Astronomy and Astrophysics, 645 N. A‘ohōkū Place, Hilo, HI 96720 (United States); Bell, Graham S.; Graves, Sarah, E-mail: smairs@uvic.ca [East Asian Observatory, 660 North A‘ohōkū Place, University Park, Hilo, Hawaii 96720 (United States); Collaboration: JCMT Transient Team

    2017-07-01

    Though there has been a significant amount of work investigating the early stages of low-mass star formation in recent years, the evolution of the mass assembly rate onto the central protostar remains largely unconstrained. Examining in depth the variation in this rate is critical to understanding the physics of star formation. Instabilities in the outer and inner circumstellar disk can lead to episodic outbursts. Observing these brightness variations at infrared or submillimeter wavelengths constrains the current accretion models. The JCMT Transient Survey is a three-year project dedicated to studying the continuum variability of deeply embedded protostars in eight nearby star-forming regions at a one-month cadence. We use the SCUBA-2 instrument to simultaneously observe these regions at wavelengths of 450 and 850 μm. In this paper, we present the data reduction techniques, image alignment procedures, and relative flux calibration methods for 850 μm data. We compare the properties and locations of bright, compact emission sources fitted with Gaussians over time. Doing so, we achieve a spatial alignment of better than 1″ between the repeated observations and an uncertainty of 2%–3% in the relative peak brightness of significant, localized emission. This combination of imaging performance is unprecedented in ground-based, single-dish submillimeter observations. Finally, we identify a few sources that show possible and confirmed brightness variations. These sources will be closely monitored and presented in further detail in additional studies throughout the duration of the survey.
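
    Tracking compact sources between epochs, as described above, reduces to fitting a 2D Gaussian to each source and comparing the fitted position and peak over time; the sketch below demonstrates the generic fit on a synthetic map, not the survey's calibrated SCUBA-2 data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss2d(xy, amp, x0, y0, sigma):
        """Circular 2D Gaussian evaluated on flattened pixel grids."""
        x, y = xy
        return amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

    y, x = np.mgrid[0:64, 0:64]
    truth = gauss2d((x, y), 1.0, 30.2, 33.7, 3.0)
    data = truth + np.random.default_rng(3).normal(0, 0.02, truth.shape)

    popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), data.ravel(),
                        p0=(0.5, 32, 32, 2))
    print("peak, x0, y0, sigma:", np.round(popt, 2))
    ```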

  3. THE EFFELSBERG-BONN H I SURVEY: DATA REDUCTION

    International Nuclear Information System (INIS)

    Winkel, B.; Kalberla, P. M. W.; Kerp, J.; Floeer, L.

    2010-01-01

    Starting in winter 2008/2009 an L-band seven-feed-array receiver is used for a 21 cm line survey performed with the 100 m telescope, the Effelsberg-Bonn H I survey (EBHIS). The EBHIS will cover the whole northern hemisphere for decl. > -5°, comprising both the galactic and extragalactic sky out to a distance of about 230 Mpc. Using state-of-the-art FPGA-based digital fast Fourier transform spectrometers, superior in dynamic range and temporal resolution to conventional correlators, allows us to apply sophisticated radio frequency interference (RFI) mitigation schemes. In this paper, the EBHIS data reduction package and first results are presented. The reduction software consists of RFI detection schemes, flux and gain-curve calibration, stray-radiation removal, baseline fitting, and finally the gridding to produce data cubes. The whole software chain is successfully tested using multi-feed data toward many smaller test fields (1-100 deg²) and recently applied for the first time to data of two large sky areas, each covering about 2000 deg². The first large area is toward the northern galactic pole and the second one toward the northern tip of the Magellanic Leading Arm. Here, we demonstrate the data quality of EBHIS Milky Way data and give a first impression on the first data release in 2011.

  4. New software for neutron data reduction and visualization

    International Nuclear Information System (INIS)

    Worlton, T.; Chatterjee, A.; Hammonds, J.; Chen, D.; Loong, C.K.; Mikkelson, D.; Mikkelson, R.

    2001-01-01

    Development of advanced neutron sources and instruments has necessitated corresponding advances in software for neutron scattering data reduction and visualization. New sources produce datasets more rapidly, and new instruments produce large numbers of spectra. Because of the shorter collection times, users are able to make more measurements on a given sample. This rapid production of datasets requires that users be able to reduce and analyze data quickly to prevent a data bottleneck. In addition, the new sources and instruments are accommodating more users with less neutron-scattering specific expertise, which requires software that is easy to use and freely available. We have developed an Integrated Spectral Analysis Workbench (ISAW) software package to permit the rapid reduction and visualization of neutron data. It can handle large numbers of spectra and merge data from separate measurements. The data can be sorted according to any attribute and transformed in numerous ways. ISAW provides several views of the data that enable users to compare spectra and observe trends in the data. A command interpreter, which is now part of ISAW, allows scientists to easily set up a series of instrument-specific operations to reduce and visualize data automatically. ISAW is written entirely in Java to permit portability to different computer platforms and easy distribution of the software. The software was constructed using modern computer design methods to allow easy customization and improvement. ISAW currently only reads data from IPNS 'run' files, but work is underway to provide input of NeXus files. (author)

  5. The JCMT Transient Survey: Data Reduction and Calibration Methods

    International Nuclear Information System (INIS)

    Mairs, Steve; Lane, James; Johnstone, Doug; Kirk, Helen; Lacaille, Kevin; Chapman, Scott; Bower, Geoffrey C.; Bell, Graham S.; Graves, Sarah

    2017-01-01

    Though there has been a significant amount of work investigating the early stages of low-mass star formation in recent years, the evolution of the mass assembly rate onto the central protostar remains largely unconstrained. Examining in depth the variation in this rate is critical to understanding the physics of star formation. Instabilities in the outer and inner circumstellar disk can lead to episodic outbursts. Observing these brightness variations at infrared or submillimeter wavelengths constrains the current accretion models. The JCMT Transient Survey is a three-year project dedicated to studying the continuum variability of deeply embedded protostars in eight nearby star-forming regions at a one-month cadence. We use the SCUBA-2 instrument to simultaneously observe these regions at wavelengths of 450 and 850 μm. In this paper, we present the data reduction techniques, image alignment procedures, and relative flux calibration methods for 850 μm data. We compare the properties and locations of bright, compact emission sources fitted with Gaussians over time. Doing so, we achieve a spatial alignment of better than 1″ between the repeated observations and an uncertainty of 2%–3% in the relative peak brightness of significant, localized emission. This combination of imaging performance is unprecedented in ground-based, single-dish submillimeter observations. Finally, we identify a few sources that show possible and confirmed brightness variations. These sources will be closely monitored and presented in further detail in additional studies throughout the duration of the survey.
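
    The Gaussian fitting of compact sources described above can be sketched in Python with SciPy on synthetic data (an illustration only, not the survey pipeline): fitting a circular 2-D Gaussian to a cutout yields a centroid for image alignment and a peak brightness for relative calibration.

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss2d(xy, amp, x0, y0, sigma, offset):
            """Circular 2-D Gaussian on a constant background, flattened."""
            x, y = xy
            g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
            return (g + offset).ravel()

        # Synthetic 850-um cutout: one compact source on a flat background
        y, x = np.mgrid[0:64, 0:64]
        truth = gauss2d((x, y), 1.0, 31.3, 33.7, 4.0, 0.02)
        image = truth + np.random.normal(0.0, 0.02, truth.size)

        popt, _ = curve_fit(gauss2d, (x, y), image, p0=(1.0, 32, 32, 3.0, 0.0))
        amp, x0, y0 = popt[0], popt[1], popt[2]
        # Comparing (x0, y0) across epochs gives the relative pointing offset;
        # ratios of fitted amplitudes give the relative flux calibration.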

  6. FPGA based algorithms for data reduction at Belle II

    Energy Technology Data Exchange (ETDEWEB)

    Muenchow, David; Gessler, Thomas; Kuehn, Wolfgang; Lange, Jens Soeren; Liu, Ming; Spruck, Bjoern [II. Physikalisches Institut, Universitaet Giessen (Germany)

    2011-07-01

    Belle II, the upgrade of the existing Belle experiment at Super-KEKB in Tsukuba, Japan, is an asymmetric e⁺e⁻ collider with a design luminosity of 8×10³⁵ cm⁻²s⁻¹. At Belle II the estimated event rate is ≤30 kHz, and the resulting data rate at the Pixel Detector (PXD) will be ≤7.2 GB/s. This data rate needs to be reduced to be able to process and store the data. A region of interest (ROI) selection is based upon two mechanisms: (a) a tracklet finder using the silicon strip detector, and (b) the HLT using all other Belle II subdetectors. These ROIs and the pixel data are forwarded to an FPGA-based Compute Node for processing, where a VHDL algorithm benefiting from FPGA pipelining and parallelisation will be implemented. For fast data handling we developed a dedicated memory management system for buffering and storing the data. The status of the implementation and performance tests of the memory manager and data reduction algorithm are presented.

  7. New software for neutron data reduction and visualization

    Energy Technology Data Exchange (ETDEWEB)

    Worlton, T.; Chatterjee, A.; Hammonds, J.; Chen, D.; Loong, C.K. [Argonne National Laboratory, Argonne, IL (United States); Mikkelson, D.; Mikkelson, R. [Univ. of Wisconsin-Stout, Menomonie, WI (United States)

    2001-03-01

    Development of advanced neutron sources and instruments has necessitated corresponding advances in software for neutron scattering data reduction and visualization. New sources produce datasets more rapidly, and new instruments produce large numbers of spectra. Because of the shorter collection times, users are able to make more measurements on a given sample. This rapid production of datasets requires that users be able to reduce and analyze data quickly to prevent a data bottleneck. In addition, the new sources and instruments are accommodating more users with less neutron-scattering specific expertise, which requires software that is easy to use and freely available. We have developed an Integrated Spectral Analysis Workbench (ISAW) software package to permit the rapid reduction and visualization of neutron data. It can handle large numbers of spectra and merge data from separate measurements. The data can be sorted according to any attribute and transformed in numerous ways. ISAW provides several views of the data that enable users to compare spectra and observe trends in the data. A command interpreter, which is now part of ISAW, allows scientists to easily set up a series of instrument-specific operations to reduce and visualize data automatically. ISAW is written entirely in Java to permit portability to different computer platforms and easy distribution of the software. The software was constructed using modern computer design methods to allow easy customization and improvement. ISAW currently only reads data from IPNS 'run' files, but work is underway to provide input of NeXus files. (author)

  8. Measuring general surgery residents' communication skills from the patient's perspective using the Communication Assessment Tool (CAT).

    Science.gov (United States)

    Stausmire, Julie M; Cashen, Constance P; Myerholtz, Linda; Buderer, Nancy

    2015-01-01

    The Communication Assessment Tool (CAT) has been used and validated to assess Family and Emergency Medicine resident communication skills from the patient's perspective. However, it has not been previously reported as an outcome measure for general surgery residents. The purpose of this study is to establish initial benchmarking data for the use of the CAT as an evaluation tool in an osteopathic general surgery residency program. Results are analyzed quarterly and used by the program director to provide meaningful feedback and targeted goal setting for residents to demonstrate progressive achievement of interpersonal and communication skills with patients. The 14-item paper version of the CAT (developed by Makoul et al. for residency programs) asks patients to anonymously rate surgery residents on discrete communication skills using a 5-point rating scale immediately after the clinical encounter. Results are reported as the percentage of items rated as "excellent" (5) by the patient. The setting is a hospital-affiliated ambulatory urban surgery office staffed by the residency program. Participants are representative of adult patients of both sexes across all ages with diverse ethnic backgrounds. They include preoperative and postoperative patients, as well as those needing diagnostic testing and follow-up. Data have been collected on 17 general surgery residents from a single residency program representing 5 postgraduate year levels and 448 patient encounters since March 2012. The reliability (Cronbach α) of the tool for surgery residents was 0.98. The overall mean percentage of items rated as excellent was 70% (standard deviation = 42%), with a median of 100%. The CAT is a useful tool for measuring 1 facet of resident communication skills: the patient's perception of the physician-patient encounter. The tool provides a unique and personalized outcome measure for identifying communication strengths and improvement opportunities, allowing residents to receive...

  9. The electronic patient record as a meaningful audit tool - Accountability and autonomy in general practitioner work

    DEFF Research Database (Denmark)

    Winthereik, Brit Ross; van der Ploeg, I.; Berg, Marc

    2007-01-01

    Health authorities increasingly request that general practitioners (GPs) use information and communication technologies such as electronic patient records (EPR) for accountability purposes. This article deals with the use of EPRs among general practitioners in Britain. It examines two ways in which GPs use the EPR for accountability purposes. One way is to generate audit reports on the basis of the information that has been entered into the record. The other is to let the computer intervene in the clinical process through prompts. The article argues that GPs' ambivalence toward using the EPR makes them active in finding ways that turn the EPR into a meaningful tool for them, that is, a tool that helps them provide what they see as good care. The article's main contribution is to show how accountability and autonomy are coproduced; less professional autonomy does not follow from more accountability.

  10. Temporal rainfall estimation using input data reduction and model inversion

    Science.gov (United States)

    Wright, A. J.; Vrugt, J. A.; Walker, J. P.; Pauwels, V. R. N.

    2016-12-01

    ... demonstration of equifinality. The use of a likelihood function that considers both rainfall and streamflow errors, combined with the use of the discrete wavelet transform (DWT) as a model data reduction technique, allows the joint inference of hydrologic model parameters along with rainfall.
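
    The DWT-based data reduction mentioned above can be sketched in Python with PyWavelets (the simple coefficient thresholding here is a hypothetical stand-in, not the authors' scheme): a rainfall series is transformed, only the largest coefficients are retained as the reduced set of unknowns, and the series is reconstructed from them.

        import numpy as np
        import pywt  # PyWavelets

        rain = np.maximum(0.0, np.random.gamma(0.3, 2.0, 256) - 0.5)  # toy series

        coeffs = pywt.wavedec(rain, "db4", level=4)    # forward DWT
        arr, slices = pywt.coeffs_to_array(coeffs)
        cut = np.quantile(np.abs(arr), 0.90)           # keep top 10% of coefficients
        arr[np.abs(arr) < cut] = 0.0
        reduced = pywt.array_to_coeffs(arr, slices, output_format="wavedec")
        rain_approx = pywt.waverec(reduced, "db4")     # reconstruct from few terms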

  11. Swift UVOT Grism Observations of Nearby Type Ia Supernovae - I. Observations and Data Reduction

    Science.gov (United States)

    Pan, Y.-C.; Foley, R. J.; Filippenko, A. V.; Kuin, N. P. M.

    2018-05-01

    Ultraviolet (UV) observations of Type Ia supernovae (SNe Ia) are useful tools for understanding progenitor systems and explosion physics. In particular, UV spectra of SNe Ia, which probe the outermost layers, are strongly affected by the progenitor metallicity. In this work, we present 120 Neil Gehrels Swift Observatory UV spectra of 39 nearby SNe Ia. This is the largest UV spectroscopic sample of SNe Ia to date, doubling the number of UV spectra and tripling the number of SNe with UV spectra. The sample spans nearly the full range of SN Ia light-curve shapes (Δm15(B) ≈ 0.6-1.8 mag). The fast turnaround of Swift allows us to obtain UV spectra at very early times, with 13 out of 39 SNe having their first spectra observed ≳1 week before peak brightness and the earliest epoch being 16.5 days before peak brightness. The slitless design of the Swift UV grism complicates the data reduction, which requires separating SN light from underlying host-galaxy light and occasional overlapping stellar light. We present a new data-reduction procedure to mitigate these issues, producing spectra that are significantly improved over those of standard methods. For a subset of the spectra we have nearly simultaneous Hubble Space Telescope UV spectra; the Swift spectra are consistent with these comparison data.

  12. TACO: a general-purpose tool for predicting cell-type-specific transcription factor dimers.

    Science.gov (United States)

    Jankowski, Aleksander; Prabhakar, Shyam; Tiuryn, Jerzy

    2014-03-19

    Cooperative binding of transcription factor (TF) dimers to DNA is increasingly recognized as a major contributor to binding specificity. However, it is likely that the set of known TF dimers is highly incomplete, given that they were discovered using ad hoc approaches, or through computational analyses of limited datasets. Here, we present TACO (Transcription factor Association from Complex Overrepresentation), a general-purpose standalone software tool that takes as input any genome-wide set of regulatory elements and predicts cell-type-specific TF dimers based on enrichment of motif complexes. TACO is the first tool that can accommodate motif complexes composed of overlapping motifs, a characteristic feature of many known TF dimers. Our method comprehensively outperforms existing tools when benchmarked on a reference set of 29 known dimers. We demonstrate the utility and consistency of TACO by applying it to 152 DNase-seq datasets and 94 ChIP-seq datasets. Based on these results, we uncover a general principle governing the structure of TF-TF-DNA ternary complexes, namely that the flexibility of the complex is correlated with, and most likely a consequence of, inter-motif spacing.

  13. Survey of Object-Based Data Reduction Techniques in Observational Astronomy

    Directory of Open Access Journals (Sweden)

    Łukasik Szymon

    2016-01-01

    Dealing with astronomical observations represents one of the most challenging areas of big data analytics: besides the huge variety of data types and the dynamics of continuous data flow from multiple sources, handling enormous volumes of data is essential. This paper provides an overview of methods aimed at reducing both the number of features/attributes and the number of data instances. It concentrates on data mining approaches that are not tied to specific instruments or observation tools but instead work on processed, object-based data. The main goal of this article is to describe existing datasets on which algorithms are frequently tested, to characterize and classify available data reduction algorithms, and to identify promising solutions capable of addressing present and future challenges in astronomy.

  14. Simrank: Rapid and sensitive general-purpose k-mer search tool

    Energy Technology Data Exchange (ETDEWEB)

    DeSantis, T.Z.; Keller, K.; Karaoz, U.; Alekseyenko, A.V; Singh, N.N.S.; Brodie, E.L; Pei, Z.; Andersen, G.L; Larsen, N.

    2011-04-01

    Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project (http://nihroadmap.nih.gov/hmp). Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Here we present a stand-alone utility, Simrank, which allows users to rapidly identify database strings the most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-language datasets found Simrank 10X to 928X faster, depending on the dataset. Simrank provides molecular ecologists with a high-throughput, open source choice for comparing large sequence sets to find similarity.
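
    The flavor of such k-mer matching can be conveyed by a toy Python sketch (an illustration of the general idea, not Simrank's implementation): database strings are ranked by the fraction of the query's k-mers they share.

        def kmer_set(seq, k=7):
            """All overlapping k-mers of a string."""
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        def kmer_similarity(query, target, k=7):
            """Fraction of the query's k-mers found in the target."""
            q = kmer_set(query, k)
            return len(q & kmer_set(target, k)) / len(q) if q else 0.0

        db = {"seqA": "ACGTACGTGGCTTACGATCGATCGTTAG",
              "seqB": "TTTTGGGGCCCCAAAATTTTGGGGCCCC"}
        query = "ACGTACGTGGCTTACGATC"
        ranked = sorted(db, key=lambda s: kmer_similarity(query, db[s]),
                        reverse=True)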

  15. CMS Analysis and Data Reduction with Apache Spark

    OpenAIRE

    Gutsche, Oliver; Canali, Luca; Cremer, Illia; Cremonesi, Matteo; Elmer, Peter; Fisk, Ian; Girone, Maria; Jayatilaka, Bo; Kowalkowski, Jim; Khristenko, Viktor; Motesnitsalis, Evangelos; Pivarski, Jim; Sehrish, Saba; Surdy, Kacper; Svyatkovskiy, Alexey

    2017-01-01

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HE...

  16. Patient satisfaction surveys as a market research tool for general practices.

    Science.gov (United States)

    Khayat, K; Salter, B

    1994-05-01

    Recent policy developments, embracing the notions of consumer choice, quality of care, and increased general practitioner control over practice budgets have resulted in a new competitive environment in primary care. General practitioners must now be more aware of how their patients feel about the services they receive, and patient satisfaction surveys can be an effective tool for general practices. A survey was undertaken to investigate the use of a patient satisfaction survey and whether aspects of patient satisfaction varied according to sociodemographic characteristics such as age, sex, social class, housing tenure and length of time in education. A sample of 2173 adults living in Medway District Health Authority were surveyed by postal questionnaire in September 1991 in order to elicit their views on general practice services. Levels of satisfaction varied with age, with younger people being consistently less satisfied with general practice services than older people. Women, those in social classes 1-3N, home owners and those who left school aged 17 years or older were more critical of primary care services than men, those in social classes 3M-5, tenants and those who left school before the age of 17 years. Surveys and analyses of this kind, if conducted for a single practice, can form the basis of a marketing strategy aimed at optimizing list size, list composition, and service quality. Satisfaction surveys can be readily incorporated into medical audit and financial management.

  17. Musite, a tool for global prediction of general and kinase-specific phosphorylation sites.

    Science.gov (United States)

    Gao, Jianjiong; Thelen, Jay J; Dunker, A Keith; Xu, Dong

    2010-12-01

    Reversible protein phosphorylation is one of the most pervasive post-translational modifications, regulating diverse cellular processes in various organisms. High throughput experimental studies using mass spectrometry have identified many phosphorylation sites, primarily from eukaryotes. However, the vast majority of phosphorylation sites remain undiscovered, even in well studied systems. Because mass spectrometry-based experimental approaches for identifying phosphorylation events are costly, time-consuming, and biased toward abundant proteins and proteotypic peptides, in silico prediction of phosphorylation sites is potentially a useful alternative strategy for whole proteome annotation. Because of various limitations, current phosphorylation site prediction tools were not well designed for comprehensive assessment of proteomes. Here, we present a novel software tool, Musite, specifically designed for large scale predictions of both general and kinase-specific phosphorylation sites. We collected phosphoproteomics data in multiple organisms from several reliable sources and used them to train prediction models by a comprehensive machine-learning approach that integrates local sequence similarities to known phosphorylation sites, protein disorder scores, and amino acid frequencies. Application of Musite on several proteomes yielded tens of thousands of phosphorylation site predictions at a high stringency level. Cross-validation tests show that Musite achieves some improvement over existing tools in predicting general phosphorylation sites, and it is at least comparable with those for predicting kinase-specific phosphorylation sites. In Musite V1.0, we have trained general prediction models for six organisms and kinase-specific prediction models for 13 kinases or kinase families. Although the current pretrained models were not correlated with any particular cellular conditions, Musite provides a unique functionality for training customized prediction models...

  18. Detection of adverse events in general surgery using the " Trigger Tool" methodology.

    Science.gov (United States)

    Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro

    2015-02-01

    Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE that are detected by the «Global Trigger Tool» methodology in surgical patients, their characteristics, and the validity of the tool. Retrospective, observational study of patients admitted to a general surgery department who underwent a surgical operation in a third-level hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the «Global Trigger Tool» methodology. Once an AE was identified, a harm category was assigned, including the grade to which the AE could have been avoided and its relation with the surgical procedure. The prevalence of AE was 36.8%, with 0.5 AE per patient; 56.2% were deemed preventable and 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. The prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure, and more than half were also preventable. The adapted «Global Trigger Tool» methodology has demonstrated to be highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.

  19. A Global Multi-Objective Optimization Tool for Design of Mechatronic Components using Generalized Differential Evolution

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Nørgård, Christian; Roemer, Daniel Beck

    2016-01-01

    This paper illustrates how the relatively simple constrained multi-objective optimization algorithm Generalized Differential Evolution 3 (GDE3) can assist with the practical sizing of mechatronic components used in e.g. digital displacement fluid power machinery. The studied bi- and tri-objective problems are solved under different optimization control parameter settings, and it is concluded that GDE3 is a reliable optimization tool that can assist mechatronic engineers in the design and decision-making process.

  20. Toolkit for data reduction to tuples for the ATLAS experiment

    International Nuclear Information System (INIS)

    Snyder, Scott; Krasznahorkay, Attila

    2012-01-01

    The final step in a HEP data-processing chain is usually to reduce the data to a ‘tuple’ form which can be efficiently read by interactive analysis tools such as ROOT. Often, this is implemented independently by each group analyzing the data, leading to duplicated effort and needless divergence in the format of the reduced data. ATLAS has implemented a common toolkit for performing this processing step. By using tools from this package, physics analysis groups can produce tuples customized for a particular analysis but which are still consistent in format and vocabulary with those produced by other physics groups. The package is designed so that almost all the code is independent of the specific form used to store the tuple. The code that does depend on this is grouped into a set of small backend packages. While the ROOT backend is the most used, backends also exist for HDF5 and for specialized databases. By now, the majority of ATLAS analyses rely on this package, and it is an important contributor to the ability of ATLAS to rapidly analyze physics data.
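
    The ATLAS toolkit itself is internal, but the final write-a-tuple step it standardizes can be sketched in Python with the independent uproot library (branch names here are hypothetical, chosen only for illustration):

        import numpy as np
        import uproot  # independent ROOT I/O library, not the ATLAS toolkit

        # Reduced, flat per-event arrays (hypothetical variables)
        n = 1000
        reduced = {
            "el_pt":  np.random.exponential(40.0, n).astype(np.float32),
            "el_eta": np.random.uniform(-2.5, 2.5, n).astype(np.float32),
            "met":    np.random.exponential(25.0, n).astype(np.float32),
        }

        # Write a ROOT TTree that interactive analysis tools can read directly
        with uproot.recreate("reduced_tuple.root") as f:
            f["analysis"] = reduced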

  1. Python tools for rapid development, calibration, and analysis of generalized groundwater-flow models

    Science.gov (United States)

    Starn, J. J.; Belitz, K.

    2014-12-01

    National-scale water-quality data sets for the United States have been available for several decades; however, groundwater models to interpret these data are available for only a small percentage of the country. Generalized models may be adequate to explain and project groundwater-quality trends at the national scale by using regional scale models (defined as watersheds at or between the HUC-6 and HUC-8 levels). Coast-to-coast data such as the National Hydrologic Dataset Plus (NHD+) make it possible to extract the basic building blocks for a model anywhere in the country. IPython notebooks have been developed to automate the creation of generalized groundwater-flow models from the NHD+. The notebook format allows rapid testing of methods for model creation, calibration, and analysis. Capabilities within the Python ecosystem greatly speed up the development and testing of algorithms. GeoPandas is used for very efficient geospatial processing. Raster processing includes the Geospatial Data Abstraction Library and image processing tools. Model creation is made possible through Flopy, a versatile input and output writer for several MODFLOW-based flow and transport model codes. Interpolation, integration, and map plotting included in the standard Python tool stack also are used, making the notebook a comprehensive platform on which to build and evaluate general models. Models with alternative boundary conditions, number of layers, and cell spacing can be tested against one another and evaluated by using water-quality data. Novel calibration criteria were developed by comparing modeled heads to land-surface and surface-water elevations. Information, such as predicted age distributions, can be extracted from general models and tested for its ability to explain water-quality trends. Groundwater ages then can be correlated with horizontal and vertical hydrologic position, a relation that can be used for statistical assessment of likely groundwater-quality conditions.
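
    A minimal sketch of the scripted model construction that such notebooks automate, using Flopy (an idealized single-layer model with made-up parameters, far simpler than an NHD+-derived one):

        import flopy  # MODFLOW input/output writer used by the notebooks

        m = flopy.modflow.Modflow(modelname="general_model", exe_name="mf2005")
        flopy.modflow.ModflowDis(m, nlay=1, nrow=50, ncol=50,
                                 delr=100.0, delc=100.0, top=10.0, botm=-40.0)
        flopy.modflow.ModflowBas(m, ibound=1, strt=10.0)
        flopy.modflow.ModflowLpf(m, hk=10.0)    # hydraulic conductivity
        flopy.modflow.ModflowRch(m, rech=1e-4)  # areal recharge
        flopy.modflow.ModflowPcg(m)             # solver
        flopy.modflow.ModflowOc(m)              # output control
        m.write_input()  # running also requires a MODFLOW executable and
                         # realistic boundary conditions, omitted here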

  2. An Autonomous Data Reduction Pipeline for Wide Angle EO Systems

    Science.gov (United States)

    Privett, G.; George, S.; Feline, W.; Ash, A.; Routledge, G.

    The UK’s National Space and Security Policy states that the identification of potential on-orbit collisions and re-entry warning over the UK is of high importance, and is driving requirements for indigenous Space Situational Awareness (SSA) systems. To meet these requirements options are being examined, including the creation of a distributed network of simple, low-cost, commercial off-the-shelf electro-optical sensors to support survey work and catalogue maintenance. This paper outlines work at Dstl examining whether data obtained using readily-deployable equipment could significantly enhance UK SSA capability and support cross-cueing between multiple deployed systems. To effectively exploit data from this distributed sensor architecture, a data handling system is required to autonomously detect satellite trails in a manner that pragmatically handles highly variable target intensities, periodicity and rates of apparent motion. The processing and collection strategies must be tailored to specific mission sets to ensure effective detections of platforms as diverse as stable geostationary satellites and low altitude CubeSats. Data captured during the Automated Transfer Vehicle-5 (ATV-5) de-orbit trial and images captured of a rocket body break-up and a deployed de-orbit sail have been employed to inform the development of a prototype processing pipeline for autonomous on-site processing. The approach taken employs tools such as Astrometry.Net and DAOPHOT from the astronomical community, together with image processing and orbit determination software developed in-house by Dstl. Interim results from the automated analysis of data collected from wide angle sensors are described, together with the current perceived limitations of the proposed system and our plans for future development.

  3. Interface of the general fitting tool GENFIT2 in PandaRoot

    Science.gov (United States)

    Prencipe, Elisabetta; Spataro, Stefano; Stockmanns, Tobias; PANDA Collaboration

    2017-10-01

    P̄ANDA is a planned experiment at FAIR (Darmstadt) with a cooled antiproton beam in the range [1.5, 15] GeV/c, allowing a wide physics program in nuclear and particle physics. It is the only experiment worldwide that combines a solenoid field (B = 2 T) and a dipole field (B = 2 Tm) in a spectrometer with a fixed-target topology in that energy regime. The tracking system of P̄ANDA involves a high-performance silicon vertex detector, a GEM detector, a straw-tube central tracker, a forward tracking system, and a luminosity monitor. The offline tracking algorithm is developed within the PandaRoot framework, which is part of the FairRoot project. The tool presented here is based on algorithms containing the Kalman filter equations and a deterministic annealing filter. This general fitting tool (GENFIT2) also offers users a Runge-Kutta track representation, and interfaces with Millepede II (useful for alignment) and RAVE (a vertex finder). It is independent of the detector geometry and the magnetic field map, and is written in C++ object-oriented modular code. Several fitting algorithms are available within GENFIT2, with user-adjustable parameters, so the tool is user-friendly. A check on the fit convergence is done by GENFIT2 as well. The Kalman-filter-based algorithms have a wide range of applications; among those in particle physics, they can perform extrapolations of track parameters and covariance matrices. The adaptations of the PandaRoot framework needed to connect to GENFIT2 are described, and the impact of GENFIT2 on the physics simulations of P̄ANDA is shown: significant improvement is reported for those channels where good low-momentum tracking is required (pT < 400 MeV/c).
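
    GENFIT2 itself is C++ and track-specific, but the predict/update cycle at the heart of any Kalman filter can be sketched in a few lines of Python (a scalar toy model, not GENFIT2's API):

        import numpy as np

        def kalman_1d(measurements, meas_var, process_var, x0=0.0, p0=1.0):
            """Scalar Kalman filter for a (nearly) constant state."""
            x, p, out = x0, p0, []
            for z in measurements:
                p = p + process_var        # predict: uncertainty grows
                k = p / (p + meas_var)     # Kalman gain
                x = x + k * (z - x)        # update with measurement z
                p = (1.0 - k) * p
                out.append(x)
            return np.array(out)

        z = 5.0 + np.random.normal(0.0, 0.5, 200)  # noisy readings of a constant
        xhat = kalman_1d(z, meas_var=0.25, process_var=1e-4)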

  4. Mobile task management tool that improves workflow of an acute general surgical service.

    Science.gov (United States)

    Foo, Elizabeth; McDonald, Rod; Savage, Earle; Floyd, Richard; Butler, Anthony; Rumball-Smith, Alistair; Connor, Saxon

    2015-10-01

    Understanding and being able to measure constraints within a health system is crucial if outcomes are to be improved. Current systems lack the ability to capture decision making with regard to tasks performed within a patient journey. The aim of this study was to assess the impact of a mobile task management tool on clinical workflow within an acute general surgical service by analysing data capture and usability of the application tool. The Cortex iOS application was developed to digitize patient flow and provide real-time visibility over clinical decision making and task performance. Study outcomes measured were workflow data capture for patient and staff events. Usability was assessed using an electronic survey. There were 449 unique patient journeys tracked with a total of 3072 patient events recorded. The results repository was accessed 7792 times. The participants reported that the application sped up decision making, reduced redundancy of work and improved team communication. The mode of the estimated time the application saved participants was 5-9 min/h of work. Of the 14 respondents, nine discarded their analogue methods of tracking tasks by the end of the study period. The introduction of a mobile task management system improved the working efficiency of junior clinical staff. The application allowed capture of data not previously available to hospital systems. In the future, such data will contribute to the accurate mapping of patient journeys through the health system. © 2015 Royal Australasian College of Surgeons.

  5. Dynamical generalized Hurst exponent as a tool to monitor unstable periods in financial time series

    Science.gov (United States)

    Morales, Raffaello; Di Matteo, T.; Gramatica, Ruggero; Aste, Tomaso

    2012-06-01

    We investigate the use of the Hurst exponent, dynamically computed over a weighted moving time-window, to evaluate the level of stability/instability of financial firms. Financial firms bailed out as a consequence of the 2007-2008 credit crisis show a neat increase with time of the generalized Hurst exponent in the period preceding the unfolding of the crisis. Conversely, firms belonging to other market sectors, which suffered the least throughout the crisis, show the opposite behavior. We find that the multifractality of the bailed-out firms increases at the crisis, suggesting that the multifractal properties of the time series are changing. These findings suggest the possibility of using the scaling behavior as a tool to track the level of stability of a firm. In this paper, we introduce a method to compute the generalized Hurst exponent which assigns larger weights to more recent events with respect to older ones. In this way large fluctuations in the remote past are less likely to influence the recent past. We also investigate the scaling associated with the tails of the log-return distributions and compare this scaling with the scaling associated with the Hurst exponent, observing that the processes underlying the price dynamics of these firms are truly multi-scaling.
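
    The unweighted estimator underlying the generalized Hurst exponent can be sketched in Python with NumPy (the paper's exponentially weighted variant would replace the plain means with weighted ones):

        import numpy as np

        def generalized_hurst(x, q=2, taus=range(1, 20)):
            """Estimate H(q) from the scaling of q-order moments of increments:
               E[|x(t+tau) - x(t)|^q] ~ tau^(q*H(q))."""
            log_tau, log_mom = [], []
            for tau in taus:
                inc = np.abs(x[tau:] - x[:-tau])
                log_tau.append(np.log(tau))
                log_mom.append(np.log(np.mean(inc ** q)))
            slope = np.polyfit(log_tau, log_mom, 1)[0]
            return slope / q

        walk = np.cumsum(np.random.normal(size=5000))
        print(generalized_hurst(walk, q=2))  # ~0.5 for an uncorrelated walk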

  6. Present status of the 4-m ILMT data reduction pipeline: application to space debris detection and characterization

    Science.gov (United States)

    Pradhan, Bikram; Delchambre, Ludovic; Hickson, Paul; Akhunov, Talat; Bartczak, Przemyslaw; Kumar, Brajesh; Surdej, Jean

    2018-04-01

    The 4-m International Liquid Mirror Telescope (ILMT), located at the ARIES Observatory (Devasthal, India), has been designed to scan, at a latitude of +29° 22' 26", a band of sky about half a degree wide in the Time Delayed Integration (TDI) mode. A special data-reduction and analysis pipeline has therefore been dedicated to it, to process online the large amount of optical data being produced. This requirement has led to the development of the 4-m ILMT data reduction pipeline, a new software package built with Python in order to simplify a large number of tasks aimed at the reduction of the acquired TDI images. This software provides astronomers with specially designed data reduction functions and astrometry and photometry calibration tools. In this paper we discuss the various reduction and calibration steps followed to reduce TDI images obtained in May 2015 with the Devasthal 1.3-m telescope. We report here the detection and characterization of nine space debris objects present in the TDI frames.

  7. Large-scale inverse model analyses employing fast randomized data reduction

    Science.gov (United States)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10⁷ or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
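
    The "sketching" idea can be illustrated in Python on a toy linear inverse problem (ordinary least squares stands in for the geostatistical machinery; all sizes are invented): a random matrix S compresses many observations into a few hundred rows before inversion.

        import numpy as np

        rng = np.random.default_rng(0)
        n_obs, n_par, n_sketch = 20_000, 50, 500

        m_true = rng.normal(size=n_par)
        G = rng.normal(size=(n_obs, n_par))               # forward operator
        d = G @ m_true + 0.01 * rng.normal(size=n_obs)    # noisy observations

        # Gaussian sketching matrix: reduces 20,000 observations to 500 rows
        S = rng.normal(size=(n_sketch, n_obs)) / np.sqrt(n_sketch)
        m_hat = np.linalg.lstsq(S @ G, S @ d, rcond=None)[0]

        print(np.linalg.norm(m_hat - m_true) / np.linalg.norm(m_true))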

  8. Textbook-Bundled Metacognitive Tools: A Study of LearnSmart's Efficacy in General Chemistry

    Science.gov (United States)

    Thadani, Vandana; Bouvier-Brown, Nicole C.

    2016-01-01

    College textbook publishers increasingly bundle sophisticated technology-based study tools with their texts. These tools appear promising, but empirical work on their efficacy is needed. We examined whether LearnSmart, a study tool bundled with McGraw-Hill's textbook "Chemistry" (Chang & Goldsby, 2013), improved learning in an…

  9. Micro-Arcsec mission: implications of the monitoring, diagnostic and calibration of the instrument response in the data reduction chain.

    Science.gov (United States)

    Busonero, D.; Gai, M.

    The goals of 21st-century high-angular-precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contributions to the overall error budget be less than the desired micro-arcsec-level precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We refer to the frameworks of the SIM-Lite and Gaia missions, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We focus on Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of astrometric instrument modelling for the Gaia mission, evidencing their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.

  10. Current Events via Electronic Media: An Instructional Tool in a General Education Geology Course

    Science.gov (United States)

    Flood, T. P.

    2008-12-01

    St. Norbert College (SNC) is a liberal arts college in the Green Bay metropolitan area with an enrollment of approximately 2100 students. All students are required to take one science course with a laboratory component as part of the general education program. Approximately 40% of all SNC students take introductory geology. Class size for this course is approximately 35 students, and each faculty member teaches one section per semester in a smart classroom. A synthesis of current events via electronic media is an excellent pedagogical tool for the introductory geology course. An ongoing informal survey of my introductory geology class indicates that 75-85% of all students in the class, mostly freshmen and sophomores, do not follow the news on a regular basis in any format, i.e., print, internet, or television. Consequently, most are unaware of current scientific topics, events, trends, and relevancy. To address this issue, and to develop a positive habit of the mind, a technique called In-the-News-Making-News (INMN) is employed. Each class period begins with a scientifically related (mostly geology) online news article displayed on an overhead screen. The articles are drawn from a variety of sources that include international sites such as the BBC and CBC; national sites such as PBS, the New York Times, and CNN; and local sites such as the Milwaukee Journal Sentinel and the Green Bay Press Gazette. After perusing the article, additional information is often acquired via Google to help supplement and clarify the original article. An interactive discussion follows. Topics that are typically covered include global climate change, basic scientific and technological discoveries, paleontology/evolution, natural disasters, mineral/energy/water resources, funding for science, space exploration, and others. Ancillary areas that are often touched on in the conversation include ethics, politics, economics, philosophy, education, geography, and culture. INMN addresses...

  11. Application of advanced data reduction methods to gas turbine dynamic analysis

    International Nuclear Information System (INIS)

    Juhl, P.B.

    1978-01-01

    This paper discusses the application of advanced data reduction methods to the evaluation of dynamic data from gas turbines and turbine components. The use of the Fast Fourier Transform and of real-time spectrum analyzers is discussed. The use of power spectral density and probability density functions for analyzing random data is discussed. Examples of the application of these modern techniques to gas turbine testing are presented. The use of the computer to automate the data reduction procedures is discussed. (orig.)
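
    A minimal Python example of the FFT-based power-spectral-density estimation discussed above, using SciPy (synthetic vibration data, not turbine measurements):

        import numpy as np
        from scipy import signal

        fs = 10_000.0                       # sampling rate, Hz
        t = np.arange(0.0, 2.0, 1.0 / fs)
        # Two rotor-like tones buried in broadband random noise
        x = (np.sin(2 * np.pi * 120 * t) + 0.5 * np.sin(2 * np.pi * 840 * t)
             + np.random.normal(0.0, 1.0, t.size))

        f, pxx = signal.welch(x, fs=fs, nperseg=2048)  # averaged PSD estimate
        dominant = f[np.argsort(pxx)[-2:]]             # strongest spectral lines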

  12. Importance of spatial and spectral data reduction in the detection of internal defects in food products.

    Science.gov (United States)

    Zhang, Xuechen; Nansen, Christian; Aryamanesh, Nader; Yan, Guijun; Boussaid, Farid

    2015-04-01

    Despite the importance of data reduction as part of the processing of reflection-based classifications, this study represents one of the first in which the effects of both spatial and spectral data reductions on classification accuracies are quantified. Furthermore, the effects of approaches to data reduction were quantified for two separate classification methods, linear discriminant analysis (LDA) and support vector machine (SVM). As the model dataset, reflection data were acquired using a hyperspectral camera in 230 spectral channels from 401 to 879 nm (spectral resolution of 2.1 nm) from field pea (Pisum sativum) samples with and without internal pea weevil (Bruchus pisorum) infestation. We deployed five levels of spatial data reduction (binning) and eight levels of spectral data reduction (40 datasets). Forward stepwise LDA was used to select and include only spectral channels contributing the most to the separation of pixels from non-infested and infested field peas. Classification accuracies obtained with LDA and SVM were based on the classification of independent validation datasets. Overall, SVMs had significantly higher classification accuracies than LDAs. This study underscores the importance of data reduction in the detection of food products with internal defects, and it highlights that spatial and spectral data reductions can (1) improve classification accuracies, (2) vastly decrease computer constraints, and (3) reduce analytical concerns associated with classifications of large and high-dimensional datasets.
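
    The spectral-binning step and the LDA/SVM comparison can be sketched in Python with scikit-learn (the class signal and bin width below are invented for illustration):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)

        # Toy stand-in: 2000 pixels x 230 spectral channels; "infested"
        # pixels differ slightly in one band of channels
        X = rng.normal(size=(2000, 230))
        y = rng.integers(0, 2, size=2000)
        X[y == 1, 100:120] += 0.4

        # Spectral data reduction: average every 5 adjacent channels
        X_binned = X.reshape(X.shape[0], -1, 5).mean(axis=2)  # 230 -> 46

        X_tr, X_te, y_tr, y_te = train_test_split(X_binned, y, random_state=0)
        for clf in (LinearDiscriminantAnalysis(), SVC()):
            print(type(clf).__name__, clf.fit(X_tr, y_tr).score(X_te, y_te))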

  13. The 7-item generalized anxiety disorder scale as a tool for measuring generalized anxiety in multiple sclerosis.

    Science.gov (United States)

    Terrill, Alexandra L; Hartoonian, Narineh; Beier, Meghan; Salem, Rana; Alschuler, Kevin

    2015-01-01

    Generalized anxiety disorder (GAD) is common in multiple sclerosis (MS) but understudied. Reliable and valid measures are needed to advance clinical care and expand research in this area. The objectives of this study were to examine the psychometric properties of the 7-item Generalized Anxiety Disorder Scale (GAD-7) in individuals with MS and to analyze correlates of GAD. Participants (N = 513) completed the anxiety module of the Patient Health Questionnaire (GAD-7). To evaluate psychometric properties of the GAD-7, the sample was randomly split to conduct exploratory and confirmatory factor analyses. Based on the exploratory factor analysis, a one-factor structure was specified for the confirmatory factor analysis, which showed excellent global fit to the data (χ²(12) = 15.17, P = .23, comparative fit index = 0.99, root mean square error of approximation = 0.03, standardized root mean square residual = 0.03). The Cronbach alpha (0.75) indicated acceptable internal consistency for the scale. Furthermore, the GAD-7 was highly correlated with the Hospital Anxiety and Depression Scale-Anxiety (r = 0.70). Age and duration of MS were both negatively associated with GAD. Higher GAD-7 scores were observed in women and individuals with secondary progressive MS. Individuals with higher GAD-7 scores also endorsed more depressive symptoms. These findings support the reliability and internal validity of the GAD-7 for use in MS. Correlational analyses revealed important relationships with demographics, disease course, and depressive symptoms, which suggest the need for further anxiety research.
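
    The internal-consistency statistic reported above has a simple closed form; a small Python sketch with made-up item scores:

        import numpy as np

        def cronbach_alpha(items):
            """alpha = k/(k-1) * (1 - sum of item variances / variance of
               totals), for an (n_respondents, k_items) score matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars / total_var)

        # Toy GAD-7-style data: 6 respondents x 7 items scored 0-3
        scores = [[0, 1, 0, 1, 0, 1, 0],
                  [1, 1, 2, 1, 1, 2, 1],
                  [2, 2, 2, 3, 2, 2, 2],
                  [3, 3, 2, 3, 3, 3, 3],
                  [1, 2, 1, 1, 2, 1, 1],
                  [0, 0, 1, 0, 0, 0, 1]]
        print(round(cronbach_alpha(scores), 2))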

  14. Examination of skin lesions for cancer : Which clinical decision aids and tools are available in general practice?

    NARCIS (Netherlands)

    Koelink, Cecile J. L.; Jonkman, Marcel F.; Van der Meer, Klaas; Van der Heide, Wouter K.

    2014-01-01

    Background: While skin cancer incidence is rising throughout Europe, general practitioners (GPs) feel unsure about their ability to diagnose skin malignancies. Objectives: To evaluate whether the GP has sufficient validated clinical decision aids and tools for the examination of potentially malignant skin lesions.

  15. Development of the EMAP tool facilitating existential communication between general practitioners and cancer patients

    DEFF Research Database (Denmark)

    Assing Hvidt, Elisabeth; Hansen, Dorte Gilså; Ammentorp, Jette

    2017-01-01

    BACKGROUND: General practice recognizes the existential dimension as an integral part of multidimensional patient care alongside the physical, psychological and social dimensions. However, general practitioners (GPs) report substantial barriers related to communication with patients about existential...

  16. A New Comprehensive Short-form Health Literacy Survey Tool for Patients in General

    Directory of Open Access Journals (Sweden)

    Tuyen Van Duong, RN, MSN, PhD

    2017-03-01

    Conclusion: The comprehensive HL-SF12 is a valid and easy-to-use tool for assessing patients' health literacy in hospitals, helping healthcare providers to enhance patients' health literacy and the quality of care.

  17. A Heuristic Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) Authoring Tools

    Science.gov (United States)

    2016-03-01

    The Generalized Intelligent Framework for Tutoring (GIFT) incorporates the following: models of domain knowledge (e.g., content, feedback, and remediation); pedagogical methods based on best instructional practices... and their relationship to each other (i.e., an ontology). Among the heuristics evaluated is User Control and Freedom: users often choose system functions by mistake and will need a clearly marked... One issue identified is a name mismatch between the GIFT Authoring Tool and the browser UI: users may not understand the relationship between the GIFT Authoring Tool and...

  18. Measurement properties of tools used to assess suicidality in autistic and general population adults: A systematic review.

    Science.gov (United States)

    Cassidy, S A; Bradley, L; Bowen, E; Wigham, S; Rodgers, J

    2018-05-05

    Adults diagnosed with autism are at significantly increased risk of suicidal thoughts, suicidal behaviours and dying by suicide. However, it is unclear whether any validated tools are currently available to effectively assess suicidality in autistic adults in research and clinical practice. This is crucial for understanding and preventing premature death by suicide in this vulnerable group. This two-stage systematic review therefore aimed to identify tools used to assess suicidality in autistic and general population adults, evaluate these tools for their appropriateness and measurement properties, and make recommendations for appropriate selection of suicidality assessment tools in research and clinical practice. Three databases were searched (PsycInfo, Medline and Web of Knowledge). Four frequently used suicidality assessment tools were identified, and subsequently rated for quality of the evidence in support of their measurement properties using the COSMIN checklist. Despite studies having explored suicidality in autistic adults, none had utilised a validated tool. Overall, there was lack of evidence in support of suicidality risk assessments successfully predicting future suicide attempts. We recommend adaptations to current suicidality assessment tools and priorities for future research, in order to better conceptualise suicidality and its measurement in autism. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Chronic Care Team Profile: a brief tool to measure the structure and function of chronic care teams in general practice.

    Science.gov (United States)

    Proudfoot, Judith G; Bubner, Tanya; Amoroso, Cheryl; Swan, Edward; Holton, Christine; Winstanley, Julie; Beilby, Justin; Harris, Mark F

    2009-08-01

    At a time when workforce shortages in general practices are leading to greater role substitution and skill-mix diversification, and the demand on general practices for chronic disease care is increasing, the structure and function of the general practice team is taking on heightened importance. To assist general practices and the organizations supporting them to assess the effectiveness of their chronic care teamworking, we developed an interview tool, the Chronic Care Team Profile (CCTP), to measure the structure and function of teams in general practice. This paper describes its properties and potential use. An initial pool of items was derived from guidelines of best-practice for chronic disease care and performance standards for general practices. The items covered staffing, skill-mix, job descriptions and roles, training, protocols and procedures within the practice. The 41-item pool was factor analysed, retained items were measured for internal consistency and the reduced instrument's face, content and construct validity were evaluated. A three-factor solution corresponding to non-general practitioner staff roles in chronic care, administrative functions and management structures provided the best fit to the data and explained 45% of the variance in the CCTP. Further analyses suggested that the CCTP is reliable, valid and has some utility. The CCTP measures aspects of the structure and function of general practices which are independent of team processes. It is associated with the job satisfaction of general practice staff and the quality of care provided to patients with chronic illnesses. As such, the CCTP offers a simple and useful tool for general practices to assess their teamworking in chronic disease care.

  20. Usefulness of a virtual community of practice and web 2.0 tools for general practice training: experiences and expectations of general practitioner registrars and supervisors.

    Science.gov (United States)

    Barnett, Stephen; Jones, Sandra C; Bennett, Sue; Iverson, Don; Bonney, Andrew

    2013-01-01

    General practice training is a community of practice in which novices and experts share knowledge. However, there are barriers to knowledge sharing for general practitioner (GP) registrars, including geographic and workplace isolation. Virtual communities of practice (VCoP) can be effective in overcoming these barriers using social media tools. The present study examined the perceived usefulness, features and barriers to implementing a VCoP for GP training. Following a survey study of GP registrars and supervisors on VCoP feasibility, a qualitative telephone interview study was undertaken within a regional training provider. Participants with the highest Internet usage in the survey study were selected. Two researchers worked independently conducting thematic analysis using manual coding of transcriptions, later discussing themes until agreement was reached. Seven GP registrars and three GP supervisors participated in the study (average age 38.2 years). Themes emerged regarding professional isolation, the potential of social media tools to provide peer support and improve knowledge sharing, and barriers to usage, including time, access and skills. Frequent Internet-using GP registrars and supervisors perceive a VCoP for GP training as a useful tool to overcome professional isolation through improved knowledge sharing. Given that professional isolation can lead to decreased rural work and reduced hours, a successful VCoP may have a positive outcome on the rural medical workforce.

  1. A low-cost multichannel analyzer with data reduction assembly for continuous air monitoring system

    International Nuclear Information System (INIS)

    Zoghi, B.; Lee, Y.; Nelson, D.C.

    1992-01-01

    This paper reports on a microcontroller-based multichannel analyzer (MCA) with a data reduction assembly (DRA) for a plutonium continuous air monitor (CAM) system. The MCA is capable of detecting the airborne alpha emitters in the presence of radon daughter products. The pulse output from the preamplifier has been stretched to allow the peak detector sufficient time to capture the pulse height. The pulse amplitude conversion, the data acquisition, and the output functions are carried out fully by software. The DRA consists of a data reduction unit (DRU) and its operator interface panel. The data reduction assembly has the ability to be networked to a single PC with up to 332 different CAMs remotely connected to it.

  2. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    Science.gov (United States)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  3. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery is a high-risk area for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to the general surgery department of a tertiary hospital who underwent surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the «Global Trigger Tool» methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the curve (ROC) was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.

  4. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  5. The OENORM S 5200 'Radioactivity in building materials' as a tool for radiation protection of the general population

    International Nuclear Information System (INIS)

    Kunsch, B.

    1989-04-01

    This report comprises two papers, one which is announced in the title, i.e. B. Kunsch, F. Steger, E. Tschirf: The OENORM S 5200 'Radioactivity in building materials' as a tool for radiation protection of the general population; and in addition a paper by F. Steger, H. Stadtmann, P. Kindl, L. Breitenhuber: Radon in dwellings: investigations and measurements. The two papers are treated separately. (qui)

  6. Evaluation of data reduction methods for dynamic PET series based on Monte Carlo techniques and the NCAT phantom

    International Nuclear Information System (INIS)

    Thireou, Trias; Rubio Guivernau, Jose Luis; Atlamazoglou, Vassilis; Ledesma, Maria Jesus; Pavlopoulos, Sotiris; Santos, Andres; Kontaxakis, George

    2006-01-01

    A realistic dynamic positron-emission tomography (PET) thoracic study was generated, using the 4D NURBS-based (non-uniform rational B-splines) cardiac-torso (NCAT) phantom and a sophisticated model of the PET imaging process, simulating two solitary pulmonary nodules. Three data reduction and blind source separation methods were applied to the simulated data: principal component analysis, independent component analysis and similarity mapping. All methods reduced the initial amount of image data to a smaller, comprehensive and easily managed set of parametric images, where structures were separated based on their different kinetic characteristics and the lesions were readily identified. The results indicate that the above-mentioned methods can provide an accurate tool for the support of both visual inspection and subsequent detailed kinetic analysis of the dynamic series via compartmental or non-compartmental models
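
    A minimal sketch of one of the evaluated reduction methods, principal component analysis, applied to a synthetic dynamic series; the dimensions and Poisson counts are illustrative stand-ins for the NCAT simulation, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
T, nx, ny = 24, 32, 32                    # time frames and image size, assumed
frames = rng.poisson(lam=10.0, size=(T, nx, ny)).astype(float)

X = frames.reshape(T, nx * ny)            # rows = time frames, columns = voxels
X -= X.mean(axis=0)                       # remove the mean frame

# SVD yields temporal eigenvectors (U) and spatial eigenimages (Vt).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)

k = 3                                     # keep only the leading components
parametric_images = Vt[:k].reshape(k, nx, ny)
kinetics = U[:, :k] * s[:k]               # time-activity curve of each component

print("variance explained by first 3 components:", explained[:3].round(3))
print("parametric image stack:", parametric_images.shape,
      "component kinetics:", kinetics.shape)
```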

  7. Composite Material Testing Data Reduction to Adjust for the Systematic 6-DOF Testing Machine Aberrations

    Science.gov (United States)

    Athanasios Iliopoulos; John G. Michopoulos; John G. C. Hermanson

    2012-01-01

    This paper describes a data reduction methodology for eliminating the systematic aberrations introduced by the unwanted behavior of a multiaxial testing machine into the massive amounts of experimental data collected from the testing of composite material coupons. The machine in question is a custom-made 6-DoF system called NRL66.3, developed at the Naval...

  8. Data reduction and analysis programs for neutron reflection studies of monolayer adsorption at interfaces

    International Nuclear Information System (INIS)

    Penfold, J.

    1992-07-01

    Data reduction and analysis programs for neutron reflectivity data from monolayer adsorption at interfaces are described. The application of model fitting to the reflectivity data, and the determination of partial structure factors within the kinematic approximation are discussed. Recent data for the adsorption of surfactants at the air-solution interface are used to illustrate the programs described. (author)
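
    The kinematic approximation mentioned above lends itself to a short numerical sketch. Assuming the convention R(Q) = (16π²/Q²)|ρ̂(Q)|², with ρ̂(Q) the Fourier transform of the scattering-length-density profile, and an illustrative uniform monolayer (the convention and all parameters are our assumptions, not the paper's):

```python
import numpy as np

rho0 = 3.5e-6     # monolayer scattering length density (Å^-2), assumed
d = 20.0          # monolayer thickness (Å), assumed

z = np.linspace(-10.0, 40.0, 2001)                 # depth grid (Å)
rho = np.where((z >= 0.0) & (z <= d), rho0, 0.0)   # uniform slab profile

Q = np.linspace(0.02, 0.25, 200)                   # momentum transfer (Å^-1)
# one-dimensional Fourier transform of the SLD profile (trapezoid rule)
rho_hat = np.trapz(rho * np.exp(1j * np.outer(Q, z)), z, axis=1)
R = 16.0 * np.pi ** 2 / Q ** 2 * np.abs(rho_hat) ** 2

# analytic slab result in the same approximation, as a cross-check
R_slab = 64.0 * np.pi ** 2 * rho0 ** 2 * np.sin(Q * d / 2.0) ** 2 / Q ** 4
print("max relative deviation from analytic slab:",
      float(np.max(np.abs(R - R_slab) / R_slab)))
```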

  9. Atomic absorption spectrometer readout and data reduction using the LSI-11 microcomputer

    International Nuclear Information System (INIS)

    Allen, M.J.; Wikkerink, R.W.

    1978-01-01

    Some common instruments found in the chemistry laboratory have analog chart recorder output as their primary data readout media. Data reduction from this medium is slow and relatively inaccurate. This paper describes how to interface a single LSI-11 microcomputer to PERKIN-ELMER models 603 and 303 Atomic Absorption Spectrophotometers

  10. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    Science.gov (United States)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  11. Application of Generalized Mie Theory to EELS Calculations as a Tool for Optimization of Plasmonic Structures

    DEFF Research Database (Denmark)

    Thomas, Stefan; Matyssek, Christian; Hergert, Wolfram

    2015-01-01

    Technical applications of plasmonic nanostructures require a careful structural optimization with respect to the desired functionality. The success of such optimizations strongly depends on the applied method. We extend the generalized multiparticle Mie (GMM) computational electromagnetic method by the application of genetic algorithms combined with a simplex algorithm. The scheme is applied to the design of plasmonic filters.
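
    The optimization strategy named above, a genetic algorithm combined with a simplex algorithm, can be sketched generically. The two-dimensional multimodal test function below merely stands in for the expensive GMM/EELS forward model, and the sketch uses scipy's Nelder-Mead simplex for the final polish.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

def objective(x):                              # multimodal stand-in model
    return np.sum(x ** 2) + 3.0 * np.sum(1.0 - np.cos(2.0 * np.pi * x))

pop = rng.uniform(-5.0, 5.0, size=(40, 2))     # initial random population
for _ in range(60):                            # GA generations
    fitness = np.array([objective(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]    # truncation selection
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10, size=2)]
        w = rng.uniform(size=2)
        child = w * a + (1 - w) * b            # blend crossover
        child += rng.normal(0.0, 0.1, size=2)  # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmin([objective(ind) for ind in pop])]
result = minimize(objective, best, method="Nelder-Mead")  # simplex polish
print("GA best:", best.round(3), "-> polished:", result.x.round(6))
```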

  12. Evaluation of the generalized gamma as a tool for treatment planning optimization

    Directory of Open Access Journals (Sweden)

    Emmanouil I Petrou

    2014-12-01

    Full Text Available Purpose: The aim of this work is to study the theoretical behavior and merits of the Generalized Gamma (generalized dose response gradient), as well as to investigate the usefulness of this concept in practical radiobiological treatment planning. Methods: In this study, the treatment planning system RayStation 1.9 (RaySearch Laboratories AB, Stockholm, Sweden) was used. Furthermore, radiobiological models that provide the tumor control probability (TCP), normal tissue complication probability (NTCP), complication-free tumor control probability (P+) and the Generalized Gamma were employed. The Generalized Gammas of TCP and NTCP, respectively, were calculated for given heterogeneous dose distributions to different organs in order to verify the TCP and NTCP computations of the treatment planning system. In this process, a treatment plan was created, where the target and the organs at risk were included in the same ROI, in order to check the validity of the system regarding the objective function P+ and the Generalized Gamma. Subsequently, six additional treatment plans were created with the target organ and the organs at risk placed in the same or different ROIs. In these plans, the mean dose was increased in order to investigate the effect of dose changes on tissue response and on the Generalized Gamma before and after the change in dose. By theoretically calculating these quantities, the agreement of different theoretical expressions with the values that the treatment planning system provides could be evaluated. Finally, the relative error between the real and approximate response values using the Poisson and the Probit models, for the case of a target organ consisting of two compartments in a parallel architecture and with the same number of clonogens, could be investigated and quantified. Results: The computations of RayStation regarding the values of the Generalized Gamma and the objective function (P+) were verified by using an
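
    A minimal sketch of the Poisson TCP model and the generalized dose-response gradient γ = dP/d ln D discussed above; the radiosensitivity, clonogen number and voxel doses are illustrative assumptions, not the parameters used in the RayStation verification.

```python
import numpy as np

alpha = 0.3          # radiosensitivity (1/Gy), assumed
N0 = 1e7             # clonogens per voxel, assumed
doses = np.array([58.0, 60.0, 62.0, 61.0, 59.0])   # heterogeneous dose (Gy)

def tcp(dose_per_voxel, scale=1.0):
    """Poisson TCP of a target whose voxel doses are scaled uniformly."""
    surviving = N0 * np.exp(-alpha * scale * dose_per_voxel)
    return float(np.prod(np.exp(-surviving)))

# generalized gamma: response change per relative (fractional) dose change
eps = 0.01
gamma = (tcp(doses, 1.0 + eps) - tcp(doses, 1.0 - eps)) / (2.0 * eps)
print(f"TCP = {tcp(doses):.3f}, generalized gamma = {gamma:.2f}")
```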

  13. Workplace Learning among General Practitioners and Specialists: The Use of Videoconferencing as a Tool

    Science.gov (United States)

    Nilsen, Line Lundvoll

    2011-01-01

    Purpose: Videoconferencing between general practitioners and hospitals has been developed to provide higher quality health care services in Norway by promoting interaction between levels of care. This article aims to explore the use of videoconferencing for information exchange and consultation throughout the patient trajectory and to investigate…

  14. Hospital discharge summary scorecard: a quality improvement tool used in a tertiary hospital general medicine service.

    Science.gov (United States)

    Singh, G; Harvey, R; Dyne, A; Said, A; Scott, I

    2015-12-01

    We assessed the impact of completion and feedback of discharge summary scorecards on the quality of discharge summaries written by interns in a general medicine service of a tertiary hospital. The scorecards significantly improved summary quality in the first three rotations of the intern year and could be readily adopted by other units as a quality improvement intervention for optimizing clinical handover to primary care providers. © 2015 Royal Australasian College of Physicians.

  15. Data reduction, radial velocities and stellar parameters from spectra in the very low signal-to-noise domain

    Science.gov (United States)

    Malavolta, Luca

    2013-10-01

    Large astronomical facilities usually provide data reduction pipelines designed to deliver ready-to-use scientific data, and too often astronomers rely on them to avoid the most difficult part of an astronomer's job. Standard data reduction pipelines, however, are usually designed and tested to perform well on data with average Signal to Noise Ratio (SNR), and the issues related to the reduction of data in the very low SNR domain are not properly taken into account. As a result, the information in low SNR data is not optimally exploited. During the last decade our group has collected thousands of spectra using the GIRAFFE spectrograph at the Very Large Telescope (Chile) of the European Southern Observatory (ESO) to determine the geometrical distance and dynamical state of several Galactic Globular Clusters, but ultimately the analysis has been hampered by systematics in data reduction, calibration and radial velocity measurements. Moreover, these data have never been exploited to obtain other information, such as the temperature and metallicity of the stars, because they were considered too noisy for this kind of analysis. In this thesis we focus our attention on the data reduction and analysis of spectra with very low SNR. The dataset analyzed in this thesis comprises 7250 spectra of 2771 stars of the Globular Cluster M 4 (NGC 6121) in the wavelength region 5145-5360 Å obtained with GIRAFFE. Stars from the upper Red Giant Branch down to the Main Sequence have been observed under very different conditions, including nights close to full moon, with many spectra in the dataset reaching SNR ~ 10. We first review the basic steps of data reduction and spectral extraction, adapting techniques well tested in other fields (like photometry) but still under-developed in spectroscopy. We improve the wavelength dispersion solution and the correction of the radial velocity shift between day-time calibrations and science observations by following a completely
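
    As a toy illustration of one step treated in the thesis, measuring a radial velocity by cross-correlating an observed low-SNR spectrum against a noise-free template, the sketch below uses a single synthetic Gaussian absorption line at SNR ~ 10; all numbers are stand-ins for real GIRAFFE data.

```python
import numpy as np

c = 299792.458                               # speed of light (km/s)
wave = np.linspace(5145.0, 5360.0, 4000)     # Å, as in the M 4 dataset
line = 5250.0                                # rest wavelength of a toy line

def spectrum(v_shift, noise=0.0, rng=None):
    """Continuum-normalized spectrum with one Doppler-shifted line."""
    center = line * (1.0 + v_shift / c)
    flux = 1.0 - 0.6 * np.exp(-0.5 * ((wave - center) / 0.3) ** 2)
    if noise:
        flux = flux + rng.normal(0.0, noise, wave.size)
    return flux

rng = np.random.default_rng(1)
observed = spectrum(v_shift=23.0, noise=0.1, rng=rng)   # SNR ~ 10 spectrum

velocities = np.arange(-100.0, 100.5, 0.5)              # trial grid (km/s)
ccf = [np.sum((1 - observed) * (1 - spectrum(v))) for v in velocities]
print("measured RV ~", velocities[int(np.argmax(ccf))], "km/s")
```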

  16. "iBIM"--internet-based interactive modules: an easy and interesting learning tool for general surgery residents.

    Science.gov (United States)

    Azer, Nader; Shi, Xinzhe; de Gara, Chris; Karmali, Shahzeer; Birch, Daniel W

    2014-04-01

    The increased use of information technology supports a resident-centred educational approach that promotes autonomy, flexibility and time management, and helps residents to assess their competence, promoting self-awareness. We established a web-based e-learning tool to introduce general surgery residents to bariatric surgery and evaluated it to determine the most appropriate implementation strategy for Internet-based interactive modules (iBIM) in surgical teaching. Usernames and passwords were assigned to general surgery residents at the University of Alberta. They were directed to the Obesity101 website and prompted to complete a multiple-choice precourse test. Afterwards, they were able to access the interactive modules. Residents could review the course material as often as they wanted before completing a multiple-choice postcourse test and exit survey. We used paired t tests to assess the difference between pre- and postcourse scores. Out of 34 residents who agreed to participate in the project, 12 completed it (35.3%). For these 12 residents, the precourse mean score was 50 ± 17.3 and the postcourse mean score was 67 ± 14 (p = 0.020). Most residents who participated in this study recommended using the iBIMs as a study tool for bariatric surgery. Course evaluation scores suggest this novel approach was successful in transferring knowledge to surgical trainees. Further development of this tool and assessment of implementation strategies will determine how iBIM in bariatric surgery may be integrated into the curriculum.

  17. A generally applicable lightweight method for calculating a value structure for tools and services in bioinformatics infrastructure projects.

    Science.gov (United States)

    Mayer, Gerhard; Quast, Christian; Felden, Janine; Lange, Matthias; Prinz, Manuel; Pühler, Alfred; Lawerenz, Chris; Scholz, Uwe; Glöckner, Frank Oliver; Müller, Wolfgang; Marcus, Katrin; Eisenacher, Martin

    2017-10-30

    Sustainable noncommercial bioinformatics infrastructures are a prerequisite for using and taking advantage of the potential of big data analysis for research and the economy. Consequently, funders, universities and institutes, as well as users, ask for a transparent value model for the tools and services offered. In this article, a generally applicable lightweight method is described by which bioinformatics infrastructure projects can estimate the value of the tools and services they offer without determining exactly the total cost of ownership. Five representative scenarios for value estimation, from a rough estimate to a detailed breakdown of costs, are presented. To account for the diversity of bioinformatics applications and services, the notion of service-specific 'service provision units' is introduced, together with the factors influencing them and the main underlying assumptions for these 'value influencing factors'. Special attention is given to how to handle personnel costs and indirect costs such as electricity. Four examples are presented for the calculation of the value of tools and services provided by the German Network for Bioinformatics Infrastructure (de.NBI): one for tool usage, one for (Web-based) database analyses, one for consulting services and one for bioinformatics training events. Finally, from the discussed values, the costs of direct funding and the costs of paying for services through funded projects are calculated and compared. © The Author 2017. Published by Oxford University Press.
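
    A minimal sketch of the 'service provision unit' bookkeeping described above: the yearly value of each service is estimated as units delivered times a unit value built from a few influencing factors. Every service name, rate and number below is hypothetical, not a de.NBI figure.

```python
personnel_rate = 65.0      # EUR per person-hour, assumed
overhead_factor = 1.25     # indirect costs (electricity, space), assumed

services = {
    # service: (units per year, person-hours per unit, compute EUR per unit)
    "tool usage":        (12000, 0.01, 0.05),
    "database analysis": (3000, 0.05, 0.40),
    "consulting":        (150, 4.0, 0.00),
    "training event":    (20, 60.0, 25.00),
}

for name, (units, hours, compute) in services.items():
    unit_value = (hours * personnel_rate + compute) * overhead_factor
    print(f"{name:18s} {units:6d} units x {unit_value:8.2f} EUR"
          f" = {units * unit_value:11.2f} EUR/yr")
```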

  18. Online Platform as a Tool to Support Postgraduate Training in General Practice – A Case Report

    Science.gov (United States)

    Dini, Lorena; Galanski, Claire; Döpfmer, Susanne; Gehrke-Beck, Sabine; Bayer, Gudrun; Boeckle, Martin; Micheel, Isabel; Novak, Jasminko; Heintze, Christoph

    2017-01-01

    Objective: Physicians in postgraduate training (PPT) in General Practice (GP) typically have very little interaction with their peers, as there is usually only one resident physician working in their respective department or GP office at a given time. Therefore, the online platform KOLEGEA, presented here, aims to support postgraduate training in general practice (PT in GP) in Germany through virtual interaction. Methodology: In 2012, the interdisciplinary research project KOLEGEA set up an online platform that any physician in PT in GP can use free of charge after registering with their unitary continuous education number (Einheitliche Fortbildungsnummer, EFN). It offers problem-based learning and allows users to discuss self-published anonymized patient cases with the community; cases can be classified and discussed with experienced mentors (specialists in general practice, GPs) in small virtual groups. Results: An anonymous online survey carried out as part of the 2014 project evaluation showed a good acceptance of the platform, even though shortage of time was mentioned as a limiting factor for its use. Data analysis showed that KOLEGEA was used by PPT in GP in all federal states. Patterns of passive use were predominant (90%). This report also describes the further development of the platform (in 2015 and 2016), which integrates an activity monitor as part of a gamification concept. Conclusions: Due to the low response rate of the 2014 online survey and the preliminary nature of the usage-pattern evaluations, we could identify only initial trends regarding the role of KOLEGEA in supporting PPT. The platform was perceived as a helpful supplement to better structure PT in GP. PMID:29226227

  19. Wormholes in spacetime and their use for interstellar travel: A tool for teaching general relativity

    International Nuclear Information System (INIS)

    Morris, M.S.; Thorne, K.S.

    1988-01-01

    Rapid interstellar travel by means of spacetime wormholes is described in a way that is useful for teaching elementary general relativity. The description touches base with Carl Sagan's novel Contact, which, unlike most science fiction novels, treats such travel in a manner that accords with the best 1986 knowledge of the laws of physics. Many objections are given against the use of black holes or Schwarzschild wormholes for rapid interstellar travel. A new class of solutions of the Einstein field equations is presented, which describe wormholes that, in principle, could be traversed by human beings. It is essential in these solutions that the wormhole possess a throat at which there is no horizon; and this property, together with the Einstein field equations, places an extreme constraint on the material that generates the wormhole's spacetime curvature: In the wormhole's throat that material must possess a radial tension τ₀ with the enormous magnitude τ₀ ∼ (pressure at the center of the most massive of neutron stars) × (20 km)²/(circumference of throat)². Moreover, this tension must exceed the material's density of mass-energy, ρ₀c². No known material has this τ₀ > ρ₀c² property, and such material would violate all the ''energy conditions'' that underlie some deeply cherished theorems in general relativity. However, it is not possible today to rule out firmly the existence of such material; and quantum field theory gives tantalizing hints that such material might, in fact, be possible.
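
    The quoted throat-tension scaling invites a back-of-the-envelope evaluation. The sketch below assumes a central pressure of ~5×10³⁴ Pa for the most massive neutron stars; that figure and the chosen throat radii are illustrative, not from the paper.

```python
import math

p_c = 5.0e34          # assumed neutron-star central pressure (Pa)

def throat_tension(radius_m):
    """Radial tension required at a wormhole throat of the given radius."""
    circumference = 2.0 * math.pi * radius_m
    return p_c * (20.0e3 / circumference) ** 2

for r_km in (1.0, 100.0, 1.496e8):          # 1 km, 100 km, 1 AU
    tau = throat_tension(r_km * 1e3)
    print(f"throat radius {r_km:g} km -> required tension ~ {tau:.2e} Pa")
```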

  20. General Authorisations as a Tool to Promote Water Allocation Reform in South Africa

    Directory of Open Access Journals (Sweden)

    A. Anderson, G. Quibell, J. Cullis and N. Ncapayi

    2007-09-01

    Full Text Available South Africa faces significant inequities in access to and use of water for productive purposes. The National Water Act seeks to address these inequities and introduced a public rights system where water is owned by the people of South Africa and held in custody by the state. This public trust doctrine forms the basis for the State to give effect to its constitutional obligation for redress. Compulsory licensing is a mechanism to proactively reallocate water on a catchment basis to achieve redress, while at the same time promoting economic efficiency and ecological sustainability. During compulsory licensing, all users are required to reapply for their water use entitlement, and a process is followed to allow for a fairer allocation of water between competing users and sectors. Some concerns have been raised that equity may not be achieved through compulsory licensing as historically disadvantaged individuals may not have the capacity to partake in the process. Similarly, the administrative burden of processing large numbers of licences from small scale users may cripple licensing authorities. Moreover, the compulsory licensing process, while encouraging Historically Disadvantaged Individuals (HDIs) to apply, may have little impact on poverty if the poorest are not able to participate in the process. General authorisations are proposed as a way of addressing these concerns by setting water aside for specific categories of users. This paper introduces the concept of general authorisations in support of compulsory licensing and outlines some of the implementation challenges.

  1. A General Tool for Engineering the NAD/NADP Cofactor Preference of Oxidoreductases.

    Science.gov (United States)

    Cahn, Jackson K B; Werlang, Caroline A; Baumschlager, Armin; Brinkmann-Chen, Sabine; Mayo, Stephen L; Arnold, Frances H

    2017-02-17

    The ability to control enzymatic nicotinamide cofactor utilization is critical for engineering efficient metabolic pathways. However, the complex interactions that determine cofactor-binding preference render this engineering particularly challenging. Physics-based models have been insufficiently accurate and blind directed evolution methods too inefficient to be widely adopted. Building on a comprehensive survey of previous studies and our own prior engineering successes, we present a structure-guided, semirational strategy for reversing enzymatic nicotinamide cofactor specificity. This heuristic-based approach leverages the diversity and sensitivity of catalytically productive cofactor binding geometries to limit the problem to an experimentally tractable scale. We demonstrate the efficacy of this strategy by inverting the cofactor specificity of four structurally diverse NADP-dependent enzymes: glyoxylate reductase, cinnamyl alcohol dehydrogenase, xylose reductase, and iron-containing alcohol dehydrogenase. The analytical components of this approach have been fully automated and are available in the form of an easy-to-use web tool: Cofactor Specificity Reversal-Structural Analysis and Library Design (CSR-SALAD).

  2. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-01-01

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set according to the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819

  3. Development and performance analysis of a lossless data reduction algorithm for voip

    International Nuclear Information System (INIS)

    Misbahuddin, S.; Boulejfen, N.

    2014-01-01

    VoIP (Voice Over IP) is becoming an alternative way of carrying voice communications over the Internet. To better utilize voice call bandwidth, some standard compression algorithms are applied in VoIP systems. However, these algorithms degrade voice quality at high compression ratios. This paper presents a lossless data reduction technique to improve the VoIP data transfer rate over the IP network. The proposed algorithm exploits the data redundancies in digitized VFs (Voice Frames) generated by VoIP systems. The performance of the proposed data reduction algorithm is presented in terms of compression ratio. The proposed algorithm helps retain voice quality along with improving VoIP data transfer rates. (author)
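
    A minimal sketch of the general idea, lossless reduction of digitized voice frames by exploiting inter-sample redundancy, here via delta coding plus run-length encoding; this is our illustration, not the paper's actual algorithm.

```python
import itertools

def delta_rle_encode(frame):
    """Delta-code a frame, then run-length encode the residuals."""
    deltas = [frame[0]] + [b - a for a, b in zip(frame, frame[1:])]
    return [(value, sum(1 for _ in group))
            for value, group in itertools.groupby(deltas)]

def delta_rle_decode(pairs):
    """Invert the encoding exactly (lossless)."""
    deltas = [v for v, count in pairs for _ in range(count)]
    frame, acc = [], 0
    for d in deltas:
        acc += d
        frame.append(acc)
    return frame

# A slowly varying synthetic voice frame: flat stretches compress well.
frame = [100] * 8 + [101] * 8 + [102] * 8
encoded = delta_rle_encode(frame)
assert delta_rle_decode(encoded) == frame     # lossless round trip
print("compression ratio ~", len(frame) / (2 * len(encoded)))
```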

  4. Implementation of on-line data reduction algorithms in the CMS Endcap Preshower Data Concentrator Cards

    CERN Document Server

    Barney, D; Kokkas, P; Manthos, N; Sidiropoulos, G; Reynaud, S; Vichoudis, P

    2007-01-01

    The CMS Endcap Preshower (ES) sub-detector comprises 4288 silicon sensors, each containing 32 strips. The data are transferred from the detector to the counting room via 1208 optical fibres running at 800Mbps. Each fibre carries data from two, three or four sensors. For the readout of the Preshower, a VME-based system, the Endcap Preshower Data Concentrator Card (ES-DCC), is currently under development. The main objective of each readout board is to acquire on-detector data from up to 36 optical links, perform on-line data reduction via zero suppression and pass the concentrated data to the CMS event builder. This document presents the conceptual design of the Reduction Algorithms as well as their implementation in the ES-DCC FPGAs. These algorithms, as implemented in the ES-DCC, result in a data-reduction factor of 20.
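
    A minimal sketch of the zero-suppression step described above: only (strip address, amplitude) pairs above a pedestal-plus-threshold cut are kept, and the empty strips are discarded. The toy sensor data, pedestal and threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_strips = 32                                       # one ES silicon sensor
pedestal, noise_sigma, threshold = 40.0, 2.0, 3.0   # ADC counts / sigmas, assumed

raw = pedestal + rng.normal(0.0, noise_sigma, n_strips)
raw[[7, 8, 21]] += [55.0, 30.0, 80.0]               # three genuine hits

signal = raw - pedestal                             # pedestal subtraction
kept = [(strip, round(float(val), 1)) for strip, val in enumerate(signal)
        if val > threshold * noise_sigma]           # zero suppression

print("kept hits:", kept)
print("reduction factor:", n_strips / len(kept))
```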

  5. The Spiral Discovery Network as an Automated General-Purpose Optimization Tool

    Directory of Open Access Journals (Sweden)

    Adam B. Csapo

    2018-01-01

    Full Text Available The Spiral Discovery Method (SDM was originally proposed as a cognitive artifact for dealing with black-box models that are dependent on multiple inputs with nonlinear and/or multiplicative interaction effects. Besides directly helping to identify functional patterns in such systems, SDM also simplifies their control through its characteristic spiral structure. In this paper, a neural network-based formulation of SDM is proposed together with a set of automatic update rules that makes it suitable for both semiautomated and automated forms of optimization. The behavior of the generalized SDM model, referred to as the Spiral Discovery Network (SDN, and its applicability to nondifferentiable nonconvex optimization problems are elucidated through simulation. Based on the simulation, the case is made that its applicability would be worth investigating in all areas where the default approach of gradient-based backpropagation is used today.

  6. Stuttering generalization self-measure: Preliminary development of a self-measuring tool.

    Science.gov (United States)

    Alameer, Mohammad; Meteyard, Lotte; Ward, David

    2017-09-01

    Generalization of treatment is considered a difficult task for clinicians and people who stutter (PWS), and can constitute a barrier to long-term treatment success. To our knowledge, there are no standardized tests that collect measurements of the behavioral and cognitive aspects alongside the client's self-perception in real-life speaking situations. This paper describes the preliminary development of a Stuttering Generalization Self-Measure (SGSM). The purpose of the SGSM is to assess 1) stuttering severity and 2) speech-anxiety level during real-life situations as perceived by PWS. Additionally, this measurement aims to 3) investigate correlations between stuttering severity and speech-anxiety level within the same real-life situation. The SGSM initially included nine speaking situations designed to cover a variety of frequently encountered speaking scenarios. However, two of these were less commonly encountered by participants and were subsequently not included in the final analyses. Items were created according to five listener categories (family and close friends, acquaintances, strangers, persons of authority, and giving a short speech to a small audience). Forty-three participants (22 PWS and 21 controls) aged 18 to 53 years were asked to complete the assessment in real-life situations. Analyses indicated that test-retest reliability was high for both groups. Discriminant validity was also achieved, as the SGSM scores differed significantly between the control and PWS groups for stuttering and speech-anxiety. Convergent validity was confirmed by significant correlations between the SGSM and other speech-related anxiety measures. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Fast ADC interface with data reduction facilities for multi-parameter experiments in nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Liebl, W; Franz, N; Ziegler, G [Technische Univ. Muenchen, Garching (Germany, F.R.). Fakultaet Physik; Hegewisch, S; Kunz, D; Maier, D; Lutter, R; Schoeffel, K; Stanzel, B [Muenchen Univ. (Germany, F.R.). Sektion Physik; Drescher, B [Hahn-Meitner-Institut fuer Kernforschung Berlin G.m.b.H. (Germany, F.R.)

    1982-03-01

    A modular ADC interface system for multi-parameter experiments with single NIM ADCs is described. 16 fast ADCs are handled by CAMAC modules and data buses in order to build up a sophisticated hardware system which is able to take coincidence data and singles spectra in parallel. The coincidence logic is handled by one of the interface modules; the interface allows online data reduction. The further expansion of the system will be discussed.

  8. A fast ADC interface with data reduction facilities for multi-parameter experiments in nuclear physics

    International Nuclear Information System (INIS)

    Liebl, W.; Franz, N.; Ziegler, G.

    1982-01-01

    A modular ADC interface system for multi-parameter experiments with single NIM ADCs is described. 16 fast ADCs are handled by CAMAC modules and data buses in order to build up a sophisticated hardware system which is able to take coincidence data and singles spectra in parallel. The coincidence logic is handled by one of the interface modules; the interface allows online data reduction. The further expansion of the system will be discussed. (orig.)

  9. FIRBACK Far Infrared Survey with ISO: Data Reduction, Analysis and First Results

    OpenAIRE

    Dole, Herve; Lagache, Guilaine; Puget, Jean-Loup; Gispert, Richard; Aussel, H.; Bouchet, F. R.; Ciliegi, C.; Clements, D. L.; Cesarsky, C.; Desert, F-X; Elbaz, D.; Franceschini, A.; Guiderdoni, B.; Harwit, M.; Laureijs, R.

    1999-01-01

    FIRBACK is one of the deepest cosmological surveys performed in the far infrared, using ISOPHOT. We describe this survey, its data reduction and analysis. We present the maps of fields at 175 microns. We point out some first results: source identifications with radio and mid infrared, and source counts at 175 microns. These two results suggest that half of the FIRBACK sources are probably at redshifts greater than 1. We also present briefly the large follow-up program.

  10. Multistation iodine-125 continuous air monitor with minicomputer alarm and data reduction

    International Nuclear Information System (INIS)

    Garfield, D.K.

    1978-01-01

    The components, operation, and calibration of a Multistation Continuous Air Monitor for the analysis of Iodine-125 are described, together with the functions of the Minicomputer in providing alarm functions and data reduction to the units specified by regulation for permanent records. The sensitivity and accuracy are also described, as well as the justification for purchase and a comparison of costs with other types of air monitoring systems.

  11. Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems

    Science.gov (United States)

    Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.

    2011-01-01

    The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.

  12. Sample survey methods as a quality assurance tool in a general practice immunisation audit.

    Science.gov (United States)

    Cullen, R

    1994-04-27

    In a multidoctor family practice there are often just too many sets of patient records to make it practical to repeat an audit by census of even an age band of the practice on a regular basis. This paper attempts to demonstrate how sample survey methodology can be incorporated into the quality assurance cycle. A simple random sample (with replacement) of 120 of the 580 children with permanent records who were aged between 6 weeks and 2 years in an Auckland general practice was taken, with the sample size selected to give a predetermined precision. The survey was then repeated after 4 weeks. Both surveys could be completed within the course of a normal working day. An unexpectedly low proportion of under-2-year-olds recorded as not overdue for any immunisations was found (22.5%), with only a modest improvement after a standard telephone/letter catch-up campaign. Seventy-two percent of the sample held a group one community services card. The advantages of properly conducted sample surveys in producing useful estimates of known precision without disrupting office routines excessively were demonstrated. Through some attention to methodology, the trauma of a practice census can be avoided.
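
    The sample-size arithmetic behind "selected to give a predetermined precision" can be sketched with the usual normal approximation for a proportion. The 9% margin below is an illustrative assumption that happens to reproduce a sample of roughly 120; since the study sampled with replacement, the finite-population correction (shown for comparison) is not strictly needed.

```python
import math

N = 580            # eligible children in the practice
p = 0.5            # worst-case proportion (maximizes the variance)
d = 0.09           # desired half-width of the 95% confidence interval, assumed
z = 1.96           # normal quantile for 95% confidence

n0 = z ** 2 * p * (1 - p) / d ** 2          # infinite-population sample size
n = n0 / (1 + (n0 - 1) / N)                 # finite-population correction
print(f"without FPC: n = {math.ceil(n0)}, with FPC (N = {N}): n = {math.ceil(n)}")
```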

  13. A tool to measure whether business management capacity in general practice impacts on the quality of chronic illness care.

    Science.gov (United States)

    Holton, Christine H; Proudfoot, Judith G; Jayasinghe, Upali W; Grimm, Jane; Bubner, Tanya K; Winstanley, Julie; Harris, Mark F; Beilby, Justin J

    2010-11-01

    Our aim was to develop a tool to identify specific features of the business and financial management of practices that facilitate better quality care for chronic illness in primary care. Domains of management were identified, resulting in the development of a structured interview tool that was administered in 97 primary care practices in Australia. Interview items were screened and subjected to factor analysis, subscales identified and the overall model fit determined. The instrument's validity was assessed against another measure of quality of care. Analysis provided a four-factor solution containing 21 items, which explained 42.5% of the variance in the total scores. The factors related to administrative processes, human resources, marketing analysis and business development. All scores increased significantly with practice size. The business development subscale and total score were higher for rural practices. There was a significant correlation between the business development subscale and quality of care. The indicators of business and financial management in the final tool appear to be useful predictors of the quality of care. The instrument may help inform policy regarding the structure of general practice and implementation of a systems approach to chronic illness care. It can provide information to practices about areas for further development.

  14. Phylogenetic studies of transmission dynamics in generalized HIV epidemics: An essential tool where the burden is greatest?

    Science.gov (United States)

    Dennis, Ann M.; Herbeck, Joshua T.; Brown, Andrew Leigh; Kellam, Paul; de Oliveira, Tulio; Pillay, Deenan; Fraser, Christophe; Cohen, Myron S.

    2014-01-01

    Efficient and effective HIV prevention measures for generalized epidemics in sub-Saharan Africa have not yet been validated at the population-level. Design and impact evaluation of such measures requires fine-scale understanding of local HIV transmission dynamics. The novel tools of HIV phylogenetics and molecular epidemiology may elucidate these transmission dynamics. Such methods have been incorporated into studies of concentrated HIV epidemics to identify proximate and determinant traits associated with ongoing transmission. However, applying similar phylogenetic analyses to generalized epidemics, including the design and evaluation of prevention trials, presents additional challenges. Here we review the scope of these methods and present examples of their use in concentrated epidemics in the context of prevention. Next, we describe the current uses for phylogenetics in generalized epidemics, and discuss their promise for elucidating transmission patterns and informing prevention trials. Finally, we review logistic and technical challenges inherent to large-scale molecular epidemiological studies of generalized epidemics, and suggest potential solutions. PMID:24977473

  15. Participants' evaluation of a group-based organisational assessment tool in Danish general practice: the Maturity Matrix.

    Science.gov (United States)

    Buch, Martin Sandberg; Edwards, Adrian; Eriksson, Tina

    2009-01-01

    The Maturity Matrix is a group-based formative self-evaluation tool aimed at assessing the degree of organisational development in general practice and providing a starting point for local quality improvement. Earlier studies of the Maturity Matrix have shown that participants find the method a useful way of assessing their practice's organisational development. However, little is known about participants' views on the resulting efforts to implement intended changes. Aim: To explore users' perspectives on the Maturity Matrix method, the facilitation process, and drivers and barriers for implementation of intended changes. Methods: Observation of two facilitated practice meetings, 17 semi-structured interviews with participating general practitioners (GPs) or their staff, and mapping of reasons for continuing or quitting the project. Setting: General practices in Denmark. Main outcomes: Successful change was associated with: a clearly identified anchor person within the practice, a shared and regular meeting structure, and an external facilitator who provides support and counselling during the implementation process. Failure to implement change was associated with: a high patient-related workload, staff or GP turnover (which seemed to affect small practices more), no clearly identified anchor person or anchor persons who did not do anything, no continuous support from an external facilitator, and no formal commitment to working with agreed changes. Conclusions: Future attempts to improve the impact of the Maturity Matrix, and similar tools for quality improvement, could include: (a) attention to matters of variation caused by practice size, (b) systematic counselling on barriers to implementation and support to structure the change processes, (c) a commitment from participants that goes beyond participation in two-yearly assessments, and (d) an anchor person for each identified goal who takes on the responsibility for improvement in practice.

  16. Peer mentoring of telescope operations and data reduction at Western Kentucky University

    Science.gov (United States)

    Williams, Joshua; Carini, M. T.

    2014-01-01

    Peer mentoring plays an important role in the astronomy program at Western Kentucky University. I will describe how undergraduates teach and mentor other undergraduates the basics of operating our 0.6m telescope and data reduction (IRAF) techniques. This peer to peer mentoring creates a community of undergraduate astronomy scholars at WKU. These scholars bond and help each other with research, coursework, social, and personal issues. This community atmosphere helps to draw in and retain other students interested in astronomy and other STEM careers.

  17. È VIVO: Virtual eruptions at Vesuvius; A multimedia tool to illustrate numerical modeling to a general public

    Science.gov (United States)

    Todesco, Micol; Neri, Augusto; Demaria, Cristina; Marmo, Costantino; Macedonio, Giovanni

    2006-07-01

    Dissemination of scientific results to the general public has become increasingly important in our society. When science deals with natural hazards, public outreach is even more important: on the one hand, it contributes to hazard perception and is a necessary step toward preparedness and risk mitigation; on the other hand, it helps establish a positive link of mutual confidence between the scientific community and the population living at risk. The existence of such a link plays a relevant role in hazard communication, which in turn is essential to mitigate the risk. In this work, we present a tool that we have developed to illustrate our scientific results on pyroclastic flow propagation at Vesuvius. This tool, a CD-ROM that we developed by joining scientific data with appropriate knowledge in communication sciences, is meant to be a first prototype that will be used to test the validity of this approach to public outreach. The multimedia guide contains figures, images of real volcanoes and computer animations obtained through numerical modeling of pyroclastic density currents. Explanatory text, kept as short and simple as possible, illustrates both the process and the methodology applied to study this very dangerous natural phenomenon. In this first version, the CD-ROM will be distributed among selected categories of end-users together with a short questionnaire that we have drawn up to test its readability. Future releases will include feedback from the users, further advancement of scientific results as well as a higher degree of interactivity.

  18. Nulling Data Reduction and On-Sky Performance of the Large Binocular Telescope Interferometer

    Science.gov (United States)

    Defrère, D.; Hinz, P. M.; Mennesson, B.; Hoffmann, W. F.; Millan-Gabet, R.; Skemer, A. J.; Bailey, V.; Danchi, W. C.; Downey, E. C.; Durney, O.; et al.

    2016-01-01

    The Large Binocular Telescope Interferometer (LBTI) is a versatile instrument designed for high angular resolution and high-contrast infrared imaging (1.5-13 micrometers). In this paper, we focus on the mid-infrared (8-13 micrometers) nulling mode and present its theory of operation, data reduction, and on-sky performance as of the end of the commissioning phase in 2015 March. With an interferometric baseline of 14.4 m, the LBTI nuller is specifically tuned to resolve the habitable zone of nearby main-sequence stars, where warm exozodiacal dust emission peaks. Measuring the exozodi luminosity function of nearby main-sequence stars is a key milestone to prepare for future exo-Earth direct imaging instruments. Thanks to recent progress in wavefront control and phase stabilization, as well as in data reduction techniques, the LBTI demonstrated in 2015 February a calibrated null accuracy of 0.05% over a 3 hr long observing sequence on the bright nearby A3V star Beta Leo. This is equivalent to an exozodiacal disk density of 15-30 zodi for a Sun-like star located at 10 pc, depending on the adopted disk model. This result sets a new record for high-contrast mid-infrared interferometric imaging and opens a new window on the study of planetary systems.

  19. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Full Text Available Joint time-frequency representations offer a rich representation of event related potentials (ERPs that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
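
    A minimal sketch of matching pursuit with a Gabor dictionary, the greedy decomposition applied above to the time-frequency principal components; the toy signal and the dictionary grids are illustrative, not the paper's ERP data.

```python
import numpy as np

n = 256
t = np.arange(n)

def gabor(center, freq, width):
    """Unit-norm Gabor atom: Gaussian envelope times a cosine carrier."""
    g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t / n)
    return g / np.linalg.norm(g)

# dictionary: a coarse grid of centers, frequencies and widths
atoms = [gabor(c, f, w) for c in range(0, n, 16)
         for f in (4, 8, 16, 32) for w in (8, 16, 32)]
D = np.array(atoms)                        # shape (n_atoms, n)

# toy "component" built from two dictionary atoms
signal = 1.5 * gabor(96, 8, 16) + 0.8 * gabor(160, 16, 8)

residual, parameters = signal.copy(), []
for _ in range(3):                         # greedy MP iterations
    coeffs = D @ residual                  # correlate with every atom
    best = int(np.argmax(np.abs(coeffs)))
    parameters.append((best, float(coeffs[best])))
    residual = residual - coeffs[best] * D[best]

print("residual energy fraction:",
      float(np.sum(residual ** 2) / np.sum(signal ** 2)))
```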

  20. Design and Implementation of Data Reduction Pipelines for the Keck Observatory Archive

    Science.gov (United States)

    Gelino, C. R.; Berriman, G. B.; Kong, M.; Laity, A. C.; Swain, M. A.; Campbell, R.; Goodrich, R. W.; Holt, J.; Lyke, J.; Mader, J. A.; Tran, H. D.; Barlow, T.

    2015-09-01

    The Keck Observatory Archive (KOA), a collaboration between the NASA Exoplanet Science Institute and the W. M. Keck Observatory, serves science and calibration data for all active and inactive instruments from the twin Keck Telescopes located near the summit of Mauna Kea, Hawaii. In addition to the raw data, we produce and provide quick-look reduced data for four instruments (HIRES, LWS, NIRC2, and OSIRIS) so that KOA users can more easily assess the scientific content and the quality of the data, which can often be difficult with raw data. The reduced products derive from both publicly available data reduction packages (when available) and KOA-created reduction scripts. The automation of publicly available data reduction packages has the benefit of providing a good quality product without the additional time and expense of creating a new reduction package, and is easily applied to bulk processing needs. The downside is that the pipeline is not always able to create an ideal product, particularly for spectra, because the processing options for one type of target (e.g., point sources) may not be appropriate for other types of targets (e.g., extended galaxies and nebulae). In this poster we present the design and implementation for the current pipelines used at KOA and discuss our strategies for handling data for which the nature of the targets and the observers' scientific goals and data taking procedures are unknown. We also discuss our plans for implementing automated pipelines for the remaining six instruments.

  1. NULLING DATA REDUCTION AND ON-SKY PERFORMANCE OF THE LARGE BINOCULAR TELESCOPE INTERFEROMETER

    Energy Technology Data Exchange (ETDEWEB)

    Defrère, D.; Hinz, P. M.; Hoffmann, W. F.; Skemer, A. J.; Bailey, V.; Downey, E. C.; Durney, O.; Grenz, P.; McMahon, T. J.; Montoya, M.; Spalding, E.; Vaz, A.; Arbo, P.; Brusa, G. [Steward Observatory, Department of Astronomy, University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States); Mennesson, B. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Millan-Gabet, R. [NASA Exoplanet Science Institute, California Institute of Technology, 770 South Wilson Avenue, Pasadena, CA 91125 (United States); Danchi, W. C. [NASA Goddard Space Flight Center, Exoplanets and Stellar Astrophysics Laboratory, Code 667, Greenbelt, MD 20771 (United States); Hill, J. M. [Large Binocular Telescope Observatory, University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States); Absil, O. [Institut d’Astrophysique et de Géophysique, Université de Liège, 19c Allée du Six Août, B-4000 Sart Tilman (Belgium); Bailey, H., E-mail: ddefrere@email.arizona.edu [Lunar and Planetary Laboratory, University of Arizona, 1541 E, University Boulevard, Tucson, AZ 85721 (United States); and others

    2016-06-20

    The Large Binocular Telescope Interferometer (LBTI) is a versatile instrument designed for high angular resolution and high-contrast infrared imaging (1.5–13 μm). In this paper, we focus on the mid-infrared (8–13 μm) nulling mode and present its theory of operation, data reduction, and on-sky performance as of the end of the commissioning phase in 2015 March. With an interferometric baseline of 14.4 m, the LBTI nuller is specifically tuned to resolve the habitable zone of nearby main-sequence stars, where warm exozodiacal dust emission peaks. Measuring the exozodi luminosity function of nearby main-sequence stars is a key milestone to prepare for future exo-Earth direct imaging instruments. Thanks to recent progress in wavefront control and phase stabilization, as well as in data reduction techniques, the LBTI demonstrated in 2015 February a calibrated null accuracy of 0.05% over a 3 hr long observing sequence on the bright nearby A3V star β Leo. This is equivalent to an exozodiacal disk density of 15–30 zodi for a Sun-like star located at 10 pc, depending on the adopted disk model. This result sets a new record for high-contrast mid-infrared interferometric imaging and opens a new window on the study of planetary systems.

  2. The DATCON system of the Belle II experiment. Tracking and data reduction

    Energy Technology Data Exchange (ETDEWEB)

    Wessel, Christian; Dingfelder, Jochen; Marinas, Carlos; Deschamps, Bruno [Universitaet Bonn (Germany). Physikalisches Institut

    2016-07-01

    The SuperKEKB e⁺e⁻ accelerator at KEK in Japan will have a luminosity a factor of 40 higher than that of its predecessor KEKB. The Belle II detector at SuperKEKB will contain a two-layer pixel detector at radii of 1.421 and 2.179 cm from the interaction point, based on the DEPFET (DEpleted P-channel Field Effect Transistor) technology. It is surrounded by four layers of strip detectors. Due to the high collision rate, the data rate of the pixel detector needs to be drastically reduced by an online data reduction system. The DATCON (Data Acquisition Tracking and Concentrator Online Node) system performs track reconstruction in the SVD (Strip Vertex Detector) and extrapolates to the PXD (PiXel Detector) to calculate Regions of Interest (ROIs) and keep only the hits inside them. The track reconstruction algorithm is based on a Hough transform, which reduces track finding to finding intersection points in the Hough parameter space. In this talk the employed algorithm for fast online track reconstruction on an FPGA, the ROI finding and the performance of the data reduction are presented.
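
    A minimal sketch of the Hough-transform idea: every hit votes for all track parameters consistent with it, and an accumulator peak marks the common track. The binned two-dimensional accumulator and straight two-dimensional tracks below are simplifications of the FPGA implementation.

```python
import numpy as np

# four hits from a toy track y = 1.0 * x + 0.55, with small measurement noise
hits = np.array([(1.0, 1.56), (2.0, 2.54), (3.0, 3.56), (4.0, 4.54)])

slopes = np.linspace(-2.0, 2.0, 81)                 # slope grid
intercept_bins = np.linspace(-3.0, 3.0, 61)         # intercept bin edges
acc = np.zeros((slopes.size, intercept_bins.size), dtype=int)

for x, y in hits:
    b = y - slopes * x                   # the hit's line in parameter space
    idx = np.digitize(b, intercept_bins) - 1
    ok = (idx >= 0) & (idx < intercept_bins.size)
    acc[np.nonzero(ok)[0], idx[ok]] += 1 # one vote per slope bin

i, j = np.unravel_index(np.argmax(acc), acc.shape)
print(f"track candidate: slope ~ {slopes[i]:.2f}, "
      f"intercept bin starting at {intercept_bins[j]:.1f}, votes = {acc[i, j]}")
```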

  3. Online data reduction with FPGA-based track reconstruction for the Belle II DEPFET pixel detector

    Energy Technology Data Exchange (ETDEWEB)

    Deschamps, Bruno; Wessel, Christian; Marinas, Carlos; Dingfelder, Jochen [Physikalisches Institut, Universitaet Bonn (Germany)

    2016-07-01

    The innermost two layers of the Belle II vertex detector at the KEK facility in Tsukuba, Japan, will be covered by high-granularity DEPFET pixel sensors (PXD). The large number of pixels leads to a maximum data rate of 256 Gbps, which has to be significantly reduced by the Data Acquisition System (DATCON). For the data reduction, the hit information of the surrounding Silicon strip Vertex Detector (SVD) is utilized to define so-called Regions of Interest (ROI). Only the hit information of pixels located inside these ROIs is saved. The ROIs for the PXD are computed by reconstructing track segments from SVD data and extrapolating to the PXD. The goal is to achieve a data reduction of at least a factor of 10 with this ROI selection. All the necessary processing stages, the receiving, decoding and multiplexing of SVD data on 48 optical fibers, the track reconstruction and the definition of the ROIs, will be performed by the presented system. The planned hardware design is based on a distributed set of Advanced Mezzanine Cards (AMC), each equipped with a Field Programmable Gate Array (FPGA) and 4 optical transceivers. In this talk, the status and plans for the DATCON prototype and the FPGA-based tracking algorithm are introduced, as well as the plans for their test in the upcoming test beam at DESY.
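
    A minimal sketch of the ROI selection described above: pixel hits are kept only if they fall inside windows around the intercepts of extrapolated SVD track segments. The sensor geometry, window sizes and toy occupancy are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_pixels_u, n_pixels_v = 250, 768            # one toy PXD sensor grid, assumed

# random background occupancy plus two hits from genuine tracks
hits = rng.integers(0, [n_pixels_u, n_pixels_v], size=(400, 2))
hits = np.vstack([hits, [[120, 300], [60, 500]]])

# intercepts of extrapolated SVD track segments on this sensor (pixels)
intercepts = np.array([[121, 298], [59, 503]])
half_u, half_v = 8, 16                       # ROI half-widths, assumed

inside = np.zeros(len(hits), dtype=bool)
for cu, cv in intercepts:                    # keep hits inside any ROI window
    inside |= ((np.abs(hits[:, 0] - cu) <= half_u)
               & (np.abs(hits[:, 1] - cv) <= half_v))

print(f"kept {inside.sum()} of {len(hits)} hits "
      f"-> reduction factor ~ {len(hits) / inside.sum():.1f}")
```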

  4. Is there Place for Perfectionism in the NIR Spectral Data Reduction?

    Science.gov (United States)

    Chilingarian, Igor

    2017-09-01

    "Despite the crucial importance of the near-infrared spectral domain for understanding the star formation and galaxy evolution, NIR observations and data reduction represent a significant challenge. The known complexity of NIR detectors is aggravated by the airglow emission in the upper atmosphere and the water absorption in the troposphere so that up until now, the astronomical community is divided on the issue whether ground based NIR spectroscopy has a future or should it move completely to space (JWST, Euclid, WFIRST). I will share my experience of pipeline development for low- and intermediate-resolution spectrographs operated at Magellan and MMT. The MMIRS data reduction pipeline became the first example of the sky subtraction quality approaching the limit set by the Poisson photon noise and demonstrated the feasibility of low-resolution (R=1200-3000) NIR spectroscopy from the ground even for very faint (J=24.5) continuum sources. On the other hand, the FIRE Bright Source Pipeline developed specifically for high signal-to-noise intermediate resolution stellar spectra proves that systematics in the flux calibration and telluric absorption correction can be pushed down to the (sub-)percent level. My conclusion is that even though substantial effort and time investment is needed to design and develop NIR spectroscopic pipelines for ground based instruments, it will pay off, if done properly, and open new windows of opportunity in the ELT era."

  5. The ESPAT tool: a general-purpose DSS shell for solving stochastic optimization problems in complex river-aquifer systems

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury

    2015-04-01

    Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles to their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, which are usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named the Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded in GAMS and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results; therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization procedures in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization procedures in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments. Its size makes it possible to use either the SDP or
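
    A minimal sketch of the standard SDP recursion that a module like ESPAT_SDP implements: backward value iteration over discretized storage with scenario-weighted inflows. The one-reservoir system, benefit function and all numbers are illustrative assumptions, not the Mijares configuration.

```python
import numpy as np

storages = np.arange(0.0, 101.0, 10.0)       # discretized storage grid (hm3)
inflows = np.array([10.0, 20.0, 40.0])       # inflow scenarios (hm3/stage)
probs = np.array([0.3, 0.5, 0.2])            # scenario probabilities
releases = np.arange(0.0, 61.0, 10.0)        # release decisions (hm3/stage)
n_stages, demand = 12, 30.0

def benefit(release):
    return -abs(release - demand)            # deficits and excesses penalized

V = np.zeros(storages.size)                  # value of water at the horizon
for stage in range(n_stages):                # backward recursion
    V_new = np.full(storages.size, -np.inf)
    for i, s in enumerate(storages):
        for r in releases:
            if np.any(s + inflows - r < 0):  # release must be available
                continue
            s_next = np.minimum(s + inflows - r, storages[-1])
            j = np.rint(s_next / 10.0).astype(int)   # next-state indices
            q = np.sum(probs * (benefit(r) + V[j]))  # expected value
            V_new[i] = max(V_new[i], q)
    V = V_new

print("expected benefit over 12 stages from a half-full reservoir:",
      round(float(V[5]), 2))
```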

  6. An introduction to data reduction: space-group determination, scaling and intensity statistics.

    Science.gov (United States)

    Evans, Philip R

    2011-04-01

    This paper presents an overview of how to run the CCP4 programs for data reduction (SCALA, POINTLESS and CTRUNCATE) through the CCP4 graphical interface ccp4i and points out some issues that need to be considered, together with a few examples. It covers determination of the point-group symmetry of the diffraction data (the Laue group), which is required for the subsequent scaling step, examination of systematic absences, which in many cases will allow inference of the space group, putting multiple data sets on a common indexing system when there are alternatives, the scaling step itself, which produces a large set of data-quality indicators, estimation of |F| from intensity and finally examination of intensity statistics to detect crystal pathologies such as twinning. An appendix outlines the scoring schemes used by the program POINTLESS to assign probabilities to possible Laue and space groups.
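
    One of the intensity statistics used to detect twinning can be sketched directly: the second moment ⟨I²⟩/⟨I⟩² of acentric intensities is close to 2.0 for untwinned data and falls toward 1.5 for a perfect twin. The simulated Wilson-distributed intensities below are illustrative; the actual tests in CTRUNCATE are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000

# acentric intensities follow an exponential (Wilson) distribution
I_untwinned = rng.exponential(1.0, n)
# a perfect twin averages two independent reflections
I_twinned = 0.5 * (rng.exponential(1.0, n) + rng.exponential(1.0, n))

for label, I in (("untwinned", I_untwinned), ("perfect twin", I_twinned)):
    print(f"{label:13s} <I^2>/<I>^2 = {np.mean(I ** 2) / np.mean(I) ** 2:.2f}")
```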

  7. Computer programs for data reduction and interpretation in plutonium and uranium analysis by gamma ray spectrometry

    International Nuclear Information System (INIS)

    Singh, R.K.; Moorthy, A.D.; Babbar, R.K.; Udagatti, S.V.

    1989-01-01

    Non-destructive gamma ray spectrometric methods have been developed for the analysis of isotopic abundances and concentrations of plutonium and uranium in the respective product solutions of a reprocessing plant. The method involves analysis of the gamma rays emitted from the sample and uses a multichannel analyser system. Data reduction and interpretation for these techniques are tedious and time consuming. In order to make it possible to use them in routine analysis, computer programs have been developed in the HP-BASIC language for use on an HP-9845B desktop computer. A set of programs for plutonium estimation by high-resolution gamma ray spectrometry and for on-line measurement of uranium by gamma ray spectrometry is described in this report. (author) 4 refs., 3 tabs., 6 figs

  8. Automation of an ion chromatograph for precipitation analysis with computerized data reduction

    Science.gov (United States)

    Hedley, Arthur G.; Fishman, Marvin J.

    1982-01-01

    Interconnection of an ion chromatograph, an autosampler, and a computing integrator to form an analytical system for simultaneous determination of fluoride, chloride, orthophosphate, bromide, nitrate, and sulfate in precipitation samples is described. Computer programs provided with the integrator are modified to implement ion-chromatographic data reduction and data storage. The liquid-flow scheme for the ion chromatograph is changed by addition of a second suppressor column for greater analytical capacity. An additional valve enables selection of either suppressor column for analysis, as the other column is regenerated and stabilized with concentrated eluent. Minimum limits of detection and quantitation for each anion are calculated; these limits are a function of suppressor exhaustion. Precision for replicate analyses of six precipitation samples for fluoride, chloride, orthophosphate, nitrate, and sulfate ranged from 0.003 to 0.027 milligrams per liter. To determine accuracy of results, the same samples were spiked with known concentrations of the above-mentioned anions. Average recovery was 108 percent.

  9. Measurement of aerosol size distribution by impaction and sedimentation An experimental study and data reduction

    International Nuclear Information System (INIS)

    Diouri, Mohamed.

    1981-09-01

    This study concerns essentially solid aerosols produced by combustion, and more particularly the aerosol liberated by a sodium fire, taken into account in safety studies related to sodium-cooled nuclear reactors. The accurate determination of the aerosol size distribution depends on the selection device used. An experimental study of the parameters affecting the solid aerosol collection efficiency (blow-off and bounce, electrical charge of particles, wall loss) was made with the Andersen Mark II cascade impactor. A sedimentation chamber was built and calibrated for the range between 4 and 10 μm. The second part describes a comparative study of different data reduction methods for the impactor and a new method for setting up the aerosol size distribution with data obtained by the sedimentation chamber [fr

  10. Implementation of On-Line Data Reduction Algorithms in the CMS Endcap Preshower Data Concentrator Card

    CERN Document Server

    Barney, David; Kokkas, Panagiotis; Manthos, Nikolaos; Reynaud, Serge; Sidiropoulos, Georgios; Vichoudis, Paschalis

    2006-01-01

    The CMS Endcap Preshower (ES) sub-detector comprises 4288 silicon sensors, each containing 32 strips. The data are transferred from the detector to the counting room via 1208 optical fibres running at 800 Mbps. Each fibre carries data from 2, 3 or 4 sensors. For the readout of the Preshower, a VME-based system, the Endcap Preshower Data Concentrator Card (ES-DCC), is currently under development. The main objective of each readout board is to acquire on-detector data from up to 36 optical links, perform on-line data reduction (zero suppression) and pass the concentrated data to the CMS event builder. This document presents the conceptual design of the reduction algorithms as well as their implementation in the ES-DCC FPGAs. The algorithms implemented in the ES-DCC resulted in a reduction factor of ~20.
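
    The zero-suppression step described above is implemented in firmware; as a language-neutral illustration of the idea only (the threshold and data layout below are assumptions, not CMS parameters), the reduction can be sketched as:

        def zero_suppress(strips, threshold=3):
            """Keep only (strip index, ADC value) pairs above threshold.

            'strips' holds the ADC counts of one sensor's strips; returning
            a sparse list of hits is the essence of the ~20x reduction
            factor quoted above. The threshold here is purely illustrative.
            """
            return [(i, v) for i, v in enumerate(strips) if v > threshold]

        # A mostly quiet sensor reduces to two hits:
        print(zero_suppress([0, 1, 0, 12, 0, 0, 7, 1]))  # [(3, 12), (6, 7)]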

  11. Microcomputer development at Bonn and its application in FADC data reduction

    International Nuclear Information System (INIS)

    Mertens, V.; Schmitt, H. von der.

    1983-04-01

    With the 16/32-bit 68K microprocessors, high CPU performance (comparable to a VAX 11/780) and a large address space (several Mbytes) become available for data processing at the crate level. Some software effort is necessary to take full advantage of such devices. For the CAMAC environment, an ACC (auxiliary crate controller) based on the 68K has been built at Bonn. Fortran-77, with specific support for real-time applications in the crate environment, is made available as portable cross software. A comprehensive compiler-writing language has been developed and employed for this purpose, offering the flexibility to adapt the compiler to specific hardware conditions. A first application of this hardware/software system in FADC data reduction is described. (orig.)

  12. HS.Register - An Audit-Trail Tool to Respond to the General Data Protection Regulation (GDPR).

    Science.gov (United States)

    Gonçalves-Ferreira, Duarte; Leite, Mariana; Santos-Pereira, Cátia; Correia, Manuel E; Antunes, Luis; Cruz-Correia, Ricardo

    2018-01-01

    Introduction: The new General Data Protection Regulation (GDPR) compels health care institutions and their software providers to properly document all personal data processing and provide clear evidence that their systems are in line with the GDPR. All applications involved in personal data processing should therefore produce meaningful event logs that can later be used for the effective auditing of complex processes. Aim: This paper aims to describe and evaluate HS.Register, a system created to collect and securely manage at scale audit logs and data produced by a large number of systems. Methods: HS.Register creates a single audit log by collecting and aggregating all kinds of meaningful event logs and data (e.g. ActiveDirectory, syslog, log4j, web server logs, REST, SOAP and HL7 messages). It also includes specially built dashboards for easy auditing and monitoring of complex processes, crossing different systems in an integrated way, as well as providing tools for helping on the auditing and on the diagnostics of difficult problems, using a simple web application. HS.Register is currently installed at five large Portuguese hospitals and is composed of the following open-source components: HAProxy, RabbitMQ, Elasticsearch, Logstash and Kibana. Results: HS.Register currently collects and analyses an average of 93 million events per week and it is being used to document and audit HL7 communications. Discussion: Auditing tools like HS.Register are likely to become mandatory in the near future to allow for traceability and detailed auditing for GDPR compliance.

  13. Parallel Landscape Driven Data Reduction & Spatial Interpolation Algorithm for Big LiDAR Data

    Directory of Open Access Journals (Sweden)

    Rahil Sharma

    2016-06-01

    Full Text Available Airborne Light Detection and Ranging (LiDAR) topographic data provide highly accurate digital terrain information, which is used widely in applications like creating flood insurance rate maps, forest and tree studies, coastal change mapping, soil and landscape classification, 3D urban modeling, river bank management, agricultural crop studies, etc. In this paper, we focus mainly on the use of LiDAR data in terrain modeling/Digital Elevation Model (DEM) generation. Technological advancements in building LiDAR sensors have enabled highly accurate and highly dense LiDAR point clouds, which have made possible high-resolution modeling of terrain surfaces. However, high-density data result in massive data volumes, which pose computing issues. Computational time required for dissemination, processing and storage of these data is directly proportional to the volume of the data. We describe a novel technique based on the slope map of the terrain, which addresses the challenging problem in the area of spatial data analysis of reducing this dense LiDAR data without sacrificing its accuracy. To the best of our knowledge, this is the first ever landscape-driven data reduction algorithm. We also perform an empirical study, which shows that there is no significant loss in accuracy for the DEM generated from a 52% reduced LiDAR dataset produced by our algorithm, compared to the DEM generated from the original, complete LiDAR dataset. For the accuracy of our statistical analysis, we perform Root Mean Square Error (RMSE) comparison over all of the grid points of the original DEM and the DEM generated from reduced data, instead of comparing a few random control points. Besides, our multi-core data reduction algorithm is highly scalable. We also describe a modified parallel Inverse Distance Weighted (IDW) spatial interpolation method and show that the DEMs it generates are time-efficient and have better accuracy than the ones generated by the traditional IDW method.
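
    For reference, the traditional (serial) IDW interpolation that the paper's parallel method modifies can be sketched as follows; the power parameter and the use of all sample points (no k-nearest cutoff) are simplifying assumptions:

        import numpy as np

        def idw(points, values, query, power=2.0, eps=1e-12):
            """Classic Inverse Distance Weighted interpolation at one point.

            points: (n, 2) sample coordinates; values: (n,) elevations;
            query: (2,) target grid cell. The power of 2 is a common default,
            not necessarily the paper's choice.
            """
            d = np.linalg.norm(points - query, axis=1)
            if np.any(d < eps):               # query coincides with a sample
                return float(values[np.argmin(d)])
            w = 1.0 / d ** power
            return float(np.sum(w * values) / np.sum(w))

        pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
        elev = np.array([10.0, 12.0, 11.0])
        print(idw(pts, elev, np.array([0.25, 0.25])))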

  14. Can we import quality tools? a feasibility study of European practice assessment in a country with less organised general practice

    Directory of Open Access Journals (Sweden)

    Pestiaux Dominique

    2009-10-01

    Full Text Available Abstract Background: Quality is on the agenda of European general practice (GP). European researchers have, in collaboration, developed tools to assess the quality of GPs. In this feasibility study, we tested the European Practice Assessment (EPA) in a one-off project in Belgium, where general practice has a low level of GP organisation. Methods: A framework for feasibility analysis included a description of participant recruitment, a brief telephone survey among non-responders, and organisational and logistic problems. Using field notes and focus groups, we studied the participants' opinions. Results: In this study, only 36 of 1000 invited practices agreed to participate. Co-ordination, administrative work, practice visits and organisational problems required several days per practice. The researchers further encountered technical problems, for instance when entering the data and uploading to the web-based server. In subsequent qualitative analysis using two focus groups, most participant GPs expressed a positive feeling after the EPA procedure. In the short period of follow-up, only a few GPs reported improvements after the visit. The participant GPs suggested that follow-up and coaching would probably facilitate the implementation of changes. Conclusion: This feasibility study shows that prior interest in EPA is low in the GP community. We encountered a number of logistic and organisational problems. The procedure proved attractive to participants, but it could be augmented by coaching participants in more than a one-off project to identify and achieve targets for quality improvement. In the absence of government commitment, a network of universities and one scientific organisation will offer EPA as a service to training practices.

  15. Don't Be a Stranger-Designing a Digital Intercultural Sensitivity Training Tool that is Culture General

    NARCIS (Netherlands)

    Degens, Nick; Hofstede, Gert Jan; Beulens, Adrie; Krumhuber, E.; Kappas, Arvid

    2016-01-01

    Digital intercultural training tools play an important role in helping people to mediate cultural misunderstandings. In recent years, these tools were made to teach about specific cultures, but there has been little attention for the design of a tool to teach about differences across a wide range of cultures.

  16. Don't Be a Stranger--Designing a Digital Intercultural Sensitivity Training Tool That Is Culture General

    Science.gov (United States)

    Degens, Nick; Hofstede, Gert Jan; Beulens, Adrie; Krumhuber, Eva; Kappas, Arvid

    2016-01-01

    Digital intercultural training tools play an important role in helping people to mediate cultural misunderstandings. In recent years, these tools were made to teach about specific cultures, but there has been little attention for the design of a tool to teach about differences across a wide range of cultures. In this work, we take the first steps…

  17. THE DEEP2 GALAXY REDSHIFT SURVEY: DESIGN, OBSERVATIONS, DATA REDUCTION, AND REDSHIFTS

    International Nuclear Information System (INIS)

    Newman, Jeffrey A.; Cooper, Michael C.; Davis, Marc; Faber, S. M.; Guhathakurta, Puragra; Koo, David C.; Phillips, Andrew C.; Conroy, Charlie; Harker, Justin J.; Lai, Kamson; Coil, Alison L.; Dutton, Aaron A.; Finkbeiner, Douglas P.; Gerke, Brian F.; Rosario, David J.; Weiner, Benjamin J.; Willmer, C. N. A.; Yan Renbin; Kassin, Susan A.; Konidaris, N. P.

    2013-01-01

    We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ∼ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M B = –20 at z ∼ 1 via ∼90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg 2 divided into four separate fields observed to a limiting apparent magnitude of R AB = 24.1. Photometric cuts allow objects with z ≳ 0.7 to be targeted ∼2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ∼ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm –1 grating used for the survey delivers high spectral resolution (R ∼ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction. Redshift errors and catastrophic failure rates are assessed through more than 2000 objects with duplicate observations.

  18. The translators’ workstation for 2015: the example of the CAT tools of the European Commission’s Directorate General for Translation

    Directory of Open Access Journals (Sweden)

    Anna Walicka

    2016-03-01

    Full Text Available The aim of this article is to provide an answer to the question about the current state of advancement of computer-assisted translation tools. We assume that several decades of research in the field carried out by the EU institutions in the context of the European integration process have provided the most advanced computer-assisted translation tools available in the biggest translation service in the world, i.e., the Directorate General for Translation of the European Commission. The present work therefore focuses on the following three main types of CAT tools employed by the EU translators: translation memory tools, terminology management tools and machine translation tools. The same types of tools, offered by the EU providers, i.e. SDL and SYSTRAN, are also used by translators working outside the EU structures. We can therefore presume that the EU translation services set work standards which are then accepted by all professional translators. For that reason, in order to define the most probable directions of future development of these tools, this article also reports the current research conducted by the EU in the CAT tools field.

  19. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    International Nuclear Information System (INIS)

    Nidever, David L.; Holtzman, Jon A.; Prieto, Carlos Allende; Mészáros, Szabolcs; Beland, Stephane; Bender, Chad; Desphande, Rohit; Bizyaev, Dmitry; Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C.; Fleming, Scott W.; Muna, Demitri; Nguyen, Duy; Schiavon, Ricardo P.; Shetrone, Matthew

    2015-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s −1 ) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement

  20. A vertical-energy-thresholding procedure for data reduction with multiple complex curves.

    Science.gov (United States)

    Jung, Uk; Jeong, Myong K; Lu, Jye-Chyi

    2006-10-01

    Due to the development of sensing and computer technology, measurements of many process variables are available in current manufacturing processes. It is very challenging, however, to process a large amount of information in a limited time in order to make decisions about the health of the processes and products. This paper develops a "preprocessing" procedure for multiple sets of complicated functional data in order to reduce the data size for supporting timely decision analyses. The data type studied has been used for fault detection, root-cause analysis, and quality improvement in such engineering applications as automobile and semiconductor manufacturing and nanomachining processes. The proposed vertical-energy-thresholding (VET) procedure balances the reconstruction error against data-reduction efficiency so that it is effective in capturing key patterns in the multiple data signals. The selected wavelet coefficients are treated as the "reduced-size" data in subsequent analyses for decision making. This enhances the ability of the existing statistical and machine-learning procedures to handle high-dimensional functional data. A few real-life examples demonstrate the effectiveness of our proposed procedure compared to several ad hoc techniques extended from single-curve-based data modeling and denoising procedures.
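
    A much-simplified energy-thresholding step in the spirit of the VET procedure can be sketched with PyWavelets; the wavelet choice and the fixed 99% energy target are assumptions for illustration, not the paper's actual selection criterion:

        import numpy as np
        import pywt

        def energy_threshold(signal, wavelet="db4", keep_energy=0.99):
            """Keep the largest wavelet coefficients carrying 'keep_energy'
            of the total energy and zero the rest; a simplified stand-in
            for the VET selection step (the 99% target is illustrative)."""
            coeffs = pywt.wavedec(signal, wavelet)
            flat, slices = pywt.coeffs_to_array(coeffs)
            order = np.argsort(np.abs(flat))[::-1]            # largest first
            energy = np.cumsum(flat[order] ** 2) / np.sum(flat ** 2)
            n_keep = int(np.searchsorted(energy, keep_energy)) + 1
            mask = np.zeros(flat.shape, dtype=bool)
            mask[order[:n_keep]] = True
            reduced = np.where(mask, flat, 0.0)               # "reduced-size" data
            coeffs_r = pywt.array_to_coeffs(reduced, slices, output_format="wavedec")
            return pywt.waverec(coeffs_r, wavelet)

        t = np.linspace(0.0, 1.0, 256)
        noisy = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(256)
        denoised = energy_threshold(noisy)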

  1. DEEP U BAND AND R IMAGING OF GOODS-SOUTH: OBSERVATIONS, DATA REDUCTION AND FIRST RESULTS

    International Nuclear Information System (INIS)

    Nonino, M.; Cristiani, S.; Vanzella, E.; Dickinson, M.; Reddy, N.; Rosati, P.; Grazian, A.; Giavalisco, M.; Kuntschner, H.; Fosbury, R. A. E.; Daddi, E.; Cesarsky, C.

    2009-01-01

    We present deep imaging in the U band covering an area of 630 arcmin 2 centered on the southern field of the Great Observatories Origins Deep Survey (GOODS). The data were obtained with the VIMOS instrument at the European Southern Observatory (ESO) Very Large Telescope. The final images reach a magnitude limit U lim ∼ 29.8 (AB, 1σ, in a 1'' radius aperture), and have good image quality, with full width at half-maximum ∼0.''8. They are significantly deeper than previous U-band images available for the GOODS fields, and better match the sensitivity of other multiwavelength GOODS photometry. The deeper U-band data yield significantly improved photometric redshifts, especially in key redshift ranges such as 2 < z < 4. We also present the co-addition of archival R-band images of the field, which reach R lim ∼ 29 (AB, 1σ, 1'' radius aperture), with image quality ∼0.''75. We discuss the strategies for the observations and data reduction, and present the first results from the analysis of the co-added images.

  2. The user's manual of 'Manyo Library' data reduction software framework at MLF, J-PARC

    International Nuclear Information System (INIS)

    Inamura, Yasuhiro; Nakatani, Takeshi; Ito, Takayoshi; Suzuki, Jiro

    2016-06-01

    Manyo Library is a software framework for developing analysis software for neutron scattering data produced at MLF, J-PARC. This software framework is required to work on many instruments in MLF and to include base functions applicable to various scientific purposes at the beam lines. The framework mainly consists of data containers, which can store 1-, 2- and 3-dimensional axis data for neutron scattering. The data containers provide functions for the four arithmetic operations with error propagation between containers, for storing the meta-data about measurements, and for reading and writing text files. Analysis codes are constructed using the various analysis operators defined in Manyo Library, which execute functions on given data containers and output the results. On the other hand, the main interface for instrument scientists and users must be easy and interactive for treating data containers and functions and for developing new analysis codes. Therefore we chose Python as the user interface. Since Manyo Library is written in C++, we have introduced into the framework the technology to call C++ functions from the Python environment. As a result, we have already developed a lot of software for data reduction, analysis and visualization, which is utilized widely at the beam lines of MLF. This document is a manual for beginners getting started with this framework. (author)
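
    As a purely Python-side illustration of the container idea (the class and methods below are loosely modeled on the description and are not Manyo Library's real API), a 1D data container with error propagation for addition might look like:

        import numpy as np

        class ElementContainer:
            """Illustrative 1D container with Gaussian error propagation,
            loosely modeled on the description above; the name and methods
            are assumptions, not Manyo Library's actual interface."""

            def __init__(self, x, y, e):
                self.x = np.asarray(x, dtype=float)  # axis values (e.g., TOF)
                self.y = np.asarray(y, dtype=float)  # intensities
                self.e = np.asarray(e, dtype=float)  # 1-sigma errors

            def __add__(self, other):
                # Independent errors add in quadrature.
                return ElementContainer(self.x, self.y + other.y,
                                        np.sqrt(self.e**2 + other.e**2))

        run1 = ElementContainer([1, 2, 3], [10.0, 20.0, 30.0], [1.0, 2.0, 3.0])
        run2 = ElementContainer([1, 2, 3], [5.0, 5.0, 5.0], [1.0, 1.0, 1.0])
        total = run1 + run2
        print(total.y, total.e)   # [15. 25. 35.] with quadrature-summed errors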

  3. THE PRISM MULTI-OBJECT SURVEY (PRIMUS). II. DATA REDUCTION AND REDSHIFT FITTING

    Energy Technology Data Exchange (ETDEWEB)

    Cool, Richard J. [MMT Observatory, Tucson, AZ 85721 (United States); Moustakas, John [Department of Physics, Siena College, 515 Loudon Rd., Loudonville, NY 12211 (United States); Blanton, Michael R.; Hogg, David W. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Burles, Scott M. [D.E. Shaw and Co. L.P, 20400 Stevens Creek Blvd., Suite 850, Cupertino, CA 95014 (United States); Coil, Alison L.; Aird, James; Mendez, Alexander J. [Department of Physics, Center for Astrophysics and Space Sciences, University of California, 9500 Gilman Dr., La Jolla, San Diego, CA 92093 (United States); Eisenstein, Daniel J. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St, MS 20, Cambridge, MA 02138 (United States); Wong, Kenneth C. [Steward Observatory, The University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721 (United States); Zhu, Guangtun [Center for Astrophysical Sciences, Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Bernstein, Rebecca A. [Department of Astronomy and Astrophysics, UCA/Lick Observatory, University of California, 1156 High Street, Santa Cruz, CA 95064 (United States); Bolton, Adam S. [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States)

    2013-04-20

    The PRIsm MUlti-object Survey (PRIMUS) is a spectroscopic galaxy redshift survey to z ∼ 1 completed with a low-dispersion prism and slitmasks allowing for simultaneous observations of ∼2500 objects over 0.18 deg 2 . The final PRIMUS catalog includes ∼130,000 robust redshifts over 9.1 deg 2 . In this paper, we summarize the PRIMUS observational strategy and present the data reduction details used to measure redshifts, redshift precision, and survey completeness. The survey motivation, observational techniques, fields, target selection, slitmask design, and observations are presented in Coil et al. Comparisons to existing higher-resolution spectroscopic measurements show a typical precision of σ z /(1 + z) = 0.005. PRIMUS, both in area and number of redshifts, is the largest faint galaxy redshift survey completed to date and is allowing for precise measurements of the relationship between active galactic nuclei and their hosts, the effects of environment on galaxy evolution, and the build up of galactic systems over the latter half of cosmic history.

  4. Test and data reduction algorithm for the evaluation of lead-acid battery packs

    Energy Technology Data Exchange (ETDEWEB)

    Nowak, D.

    1986-01-15

    Experience from the DOE Electric Vehicle Demonstration Project indicated severe battery problems associated with driving electric cars in temperature extremes. The vehicle batteries suffered from a high module failure rate, reduced capacity, and low efficiency. To assess the nature and the extent of the battery problems encountered at various operating temperatures, a test program was established at the University of Alabama in Huntsville (UAH). A test facility was built that is based on Propel cycling equipment, the Hewlett Packard 3497A Data Acquisition System, and the HP85F and HP87 computers. The objective was to establish a cost effective facility that could generate the engineering data base needed for the development of thermal management systems, destratification systems, central watering systems and proper charge algorithms. It was hoped that the development and implementation of these systems by EV manufacturers and fleet operators of EVs would eliminate the most pressing problems that occurred in the DOE EV Demonstration Project. The data reduction algorithm is described.

  5. Parameter-free Network Sparsification and Data Reduction by Minimal Algorithmic Information Loss

    KAUST Repository

    Zenil, Hector

    2018-02-16

    The study of large and complex datasets, or big data, organized as networks has emerged as one of the central challenges in most areas of science and technology. Cellular and molecular networks in biology are one of the prime examples. Hence, a number of techniques for data dimensionality reduction, especially in the context of networks, have been developed. Yet, current techniques require a predefined metric upon which to minimize the data size. Here we introduce a family of parameter-free algorithms based on (algorithmic) information theory that are designed to minimize the loss of any (enumerable computable) property contributing to the object's algorithmic content and thus important to preserve in a process of data dimension reduction when forcing the algorithm to delete first the least important features. Being independent of any particular criterion, they are universal in a fundamental mathematical sense. Using suboptimal approximations of efficient (polynomial) estimations we demonstrate how to preserve network properties, outperforming other (leading) algorithms for network dimension reduction. Our method preserves all graph-theoretic indices measured, ranging from degree distribution, clustering coefficient, edge betweenness, and degree and eigenvector centralities. We conclude and demonstrate numerically that our parameter-free, Minimal Information Loss Sparsification (MILS) method is robust, has the potential to maximize the preservation of all recursively enumerable features in data and networks, and achieves results equal to or significantly better than other data reduction and network sparsification methods.
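
    A crude sketch of the greedy deletion idea behind MILS, using zlib compressed length as a stand-in for the paper's algorithmic-complexity estimators (the real method uses algorithmic-probability techniques, not statistical compression):

        import zlib

        def compressed_len(edges, n):
            """Proxy for algorithmic content: zlib length of the adjacency matrix."""
            adj = bytearray(n * n)
            for u, v in edges:
                adj[u * n + v] = adj[v * n + u] = 1
            return len(zlib.compress(bytes(adj), 9))

        def mils_like_sparsify(edges, n, n_remove):
            """Greedily delete the edge whose removal least changes the
            compressed length; a lossy stand-in for MILS, which estimates
            algorithmic complexity rather than using zlib."""
            edges = set(edges)
            for _ in range(n_remove):
                base = compressed_len(edges, n)
                victim = min(edges,
                             key=lambda e: abs(compressed_len(edges - {e}, n) - base))
                edges.remove(victim)
            return edges

        g = {(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)}
        print(mils_like_sparsify(g, 4, 2))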

  6. Analysis and interpretation of dynamic FDG PET oncological studies using data reduction techniques

    Directory of Open Access Journals (Sweden)

    Santos Andres

    2007-10-01

    Full Text Available Abstract Background: Dynamic positron emission tomography studies produce a large amount of image data, from which clinically useful parametric information can be extracted using tracer kinetic methods. Data reduction methods can facilitate the initial interpretation and visual analysis of these large image sequences and at the same time can preserve important information and allow for basic feature characterization. Methods: We have applied principal component analysis to provide high-contrast parametric image sets of lower dimensions than the original data set, separating structures based on their kinetic characteristics. Our method has the potential to constitute an alternative quantification method, independent of any kinetic model, and is particularly useful when the retrieval of the arterial input function is complicated. In independent component analysis images, structures that have different kinetic characteristics are assigned opposite values and are readily discriminated. Furthermore, novel similarity mapping techniques are proposed, which can summarize in a single image the temporal properties of the entire image sequence according to a reference region. Results: Using our new cubed sum coefficient similarity measure, we have shown that structures with similar time activity curves can be identified, thus facilitating the detection of lesions that are not easily discriminated using the conventional method employing standardized uptake values.
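
    A minimal sketch of the PCA step on a dynamic image sequence, treating each voxel's time-activity curve as a feature vector (the frame count, image size and synthetic data are invented for the example):

        import numpy as np
        from sklearn.decomposition import PCA

        # Toy dynamic PET sequence: 20 time frames of a 32x32 slice.
        frames, h, w = 20, 32, 32
        seq = np.random.poisson(5.0, size=(frames, h, w)).astype(float)

        # Treat each voxel's time-activity curve as a feature vector and keep
        # a few high-variance components as high-contrast parametric images.
        X = seq.reshape(frames, h * w).T          # shape: (voxels, frames)
        pca = PCA(n_components=3)
        scores = pca.fit_transform(X)             # shape: (voxels, 3)
        parametric_images = scores.T.reshape(3, h, w)
        print(pca.explained_variance_ratio_)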

  7. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); Prieto, Carlos Allende; Mészáros, Szabolcs [Instituto de Astrofísica de Canarias, Via Láctea s/n, E-38205 La Laguna, Tenerife (Spain); Beland, Stephane [Laboratory for Atmospheric and Space Sciences, University of Colorado at Boulder, Boulder, CO (United States); Bender, Chad; Desphande, Rohit [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, sunspot, NM 88349-0059 (United States); Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Fleming, Scott W. [Computer Sciences Corporation, 3700 San Martin Dr, Baltimore, MD 21218 (United States); Muna, Demitri [Department of Astronomy and the Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Nguyen, Duy [Department of Astronomy and Astrophysics, University of Toronto, Toronto, Ontario, M5S 3H4 (Canada); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Shetrone, Matthew, E-mail: dnidever@umich.edu [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States)

    2015-12-15

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s −1 ) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  8. THE DATA REDUCTION PIPELINE FOR THE SDSS-IV MaNGA IFU GALAXY SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Law, David R. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Cherinka, Brian [Center for Astrophysical Sciences, Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States); Yan, Renbin [Department of Physics and Astronomy, University of Kentucky, 505 Rose Street, Lexington, KY 40506-0055 (United States); Andrews, Brett H. [Department of Physics and Astronomy and PITT PACC, University of Pittsburgh, 3941 O’Hara Street, Pittsburgh, PA 15260 (United States); Bershady, Matthew A. [Department of Astronomy, University of Wisconsin-Madison, 475 N. Charter Street, Madison, WI 53706 (United States); Bizyaev, Dmitry [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Blanc, Guillermo A. [Departamento de Astronomía, Universidad de Chile, Camino del Observatorio 1515, Las Condes, Santiago (Chile); Blanton, Michael R. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Bolton, Adam S.; Brownstein, Joel R. [Department of Physics and Astronomy, University of Utah, 115 S 1400 E, Salt Lake City, UT 84112 (United States); Bundy, Kevin [Kavli Institute for the Physics and Mathematics of the universe, Todai Institutes for Advanced Study, the University of Tokyo, Kashiwa, 277-8583 (Kavli IPMU, WPI) (Japan); Chen, Yanmei [School of Astronomy and Space Science, Nanjing University, Nanjing 210093 (China); Drory, Niv [McDonald Observatory, Department of Astronomy, University of Texas at Austin, 1 University Station, Austin, TX 78712-0259 (United States); D’Souza, Richard; Jones, Amy; Kauffmann, Guinevere [Max Planck Institute for Astrophysics, Karl-Schwarzschild-Str. 1, D-85748 Garching (Germany); Fu, Hai, E-mail: dlaw@stsci.edu [Department of Physics and Astronomy, University of Iowa, Iowa City, IA 52242 (United States); and others

    2016-10-01

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) is an optical fiber-bundle integral-field unit (IFU) spectroscopic survey that is one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV). With a spectral coverage of 3622–10354 Å and an average footprint of ∼500 arcsec 2 per IFU the scientific data products derived from MaNGA will permit exploration of the internal structure of a statistically large sample of 10,000 low-redshift galaxies in unprecedented detail. Comprising 174 individually pluggable science and calibration IFUs with a near-constant data stream, MaNGA is expected to obtain ∼100 million raw-frame spectra and ∼10 million reduced galaxy spectra over the six-year lifetime of the survey. In this contribution, we describe the MaNGA Data Reduction Pipeline algorithms and centralized metadata framework that produce sky-subtracted spectrophotometrically calibrated spectra and rectified three-dimensional data cubes that combine individual dithered observations. For the 1390 galaxy data cubes released in Summer 2016 as part of SDSS-IV Data Release 13, we demonstrate that the MaNGA data have nearly Poisson-limited sky subtraction shortward of ∼8500 Å and reach a typical 10 σ limiting continuum surface brightness μ  = 23.5 AB arcsec −2 in a five-arcsecond-diameter aperture in the g -band. The wavelength calibration of the MaNGA data is accurate to 5 km s −1 rms, with a median spatial resolution of 2.54 arcsec FWHM (1.8 kpc at the median redshift of 0.037) and a median spectral resolution of σ  = 72 km s −1 .

  9. The Data Reduction Pipeline for the SDSS-IV MaNGA IFU Galaxy Survey

    Science.gov (United States)

    Law, David R.; Cherinka, Brian; Yan, Renbin; Andrews, Brett H.; Bershady, Matthew A.; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael R.; Bolton, Adam S.; Brownstein, Joel R.; Bundy, Kevin; Chen, Yanmei; Drory, Niv; D'Souza, Richard; Fu, Hai; Jones, Amy; Kauffmann, Guinevere; MacDonald, Nicholas; Masters, Karen L.; Newman, Jeffrey A.; Parejko, John K.; Sánchez-Gallego, José R.; Sánchez, Sebastian F.; Schlegel, David J.; Thomas, Daniel; Wake, David A.; Weijmans, Anne-Marie; Westfall, Kyle B.; Zhang, Kai

    2016-10-01

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) is an optical fiber-bundle integral-field unit (IFU) spectroscopic survey that is one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV). With a spectral coverage of 3622-10354 Å and an average footprint of ˜500 arcsec2 per IFU the scientific data products derived from MaNGA will permit exploration of the internal structure of a statistically large sample of 10,000 low-redshift galaxies in unprecedented detail. Comprising 174 individually pluggable science and calibration IFUs with a near-constant data stream, MaNGA is expected to obtain ˜100 million raw-frame spectra and ˜10 million reduced galaxy spectra over the six-year lifetime of the survey. In this contribution, we describe the MaNGA Data Reduction Pipeline algorithms and centralized metadata framework that produce sky-subtracted spectrophotometrically calibrated spectra and rectified three-dimensional data cubes that combine individual dithered observations. For the 1390 galaxy data cubes released in Summer 2016 as part of SDSS-IV Data Release 13, we demonstrate that the MaNGA data have nearly Poisson-limited sky subtraction shortward of ˜8500 Å and reach a typical 10σ limiting continuum surface brightness μ = 23.5 AB arcsec-2 in a five-arcsecond-diameter aperture in the g-band. The wavelength calibration of the MaNGA data is accurate to 5 km s-1 rms, with a median spatial resolution of 2.54 arcsec FWHM (1.8 kpc at the median redshift of 0.037) and a median spectral resolution of σ = 72 km s-1.

  10. THE DATA REDUCTION PIPELINE FOR THE SDSS-IV MaNGA IFU GALAXY SURVEY

    International Nuclear Information System (INIS)

    Law, David R.; Cherinka, Brian; Yan, Renbin; Andrews, Brett H.; Bershady, Matthew A.; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael R.; Bolton, Adam S.; Brownstein, Joel R.; Bundy, Kevin; Chen, Yanmei; Drory, Niv; D’Souza, Richard; Jones, Amy; Kauffmann, Guinevere; Fu, Hai

    2016-01-01

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) is an optical fiber-bundle integral-field unit (IFU) spectroscopic survey that is one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV). With a spectral coverage of 3622–10354 Å and an average footprint of ∼500 arcsec 2 per IFU the scientific data products derived from MaNGA will permit exploration of the internal structure of a statistically large sample of 10,000 low-redshift galaxies in unprecedented detail. Comprising 174 individually pluggable science and calibration IFUs with a near-constant data stream, MaNGA is expected to obtain ∼100 million raw-frame spectra and ∼10 million reduced galaxy spectra over the six-year lifetime of the survey. In this contribution, we describe the MaNGA Data Reduction Pipeline algorithms and centralized metadata framework that produce sky-subtracted spectrophotometrically calibrated spectra and rectified three-dimensional data cubes that combine individual dithered observations. For the 1390 galaxy data cubes released in Summer 2016 as part of SDSS-IV Data Release 13, we demonstrate that the MaNGA data have nearly Poisson-limited sky subtraction shortward of ∼8500 Å and reach a typical 10 σ limiting continuum surface brightness μ  = 23.5 AB arcsec −2 in a five-arcsecond-diameter aperture in the g -band. The wavelength calibration of the MaNGA data is accurate to 5 km s −1 rms, with a median spatial resolution of 2.54 arcsec FWHM (1.8 kpc at the median redshift of 0.037) and a median spectral resolution of σ  = 72 km s −1 .

  11. Data reduction and analysis of the multiband optical images of the blazar Mrk180

    Directory of Open Access Journals (Sweden)

    M Sabzi Sarvestani

    2012-09-01

    Full Text Available Nearly simultaneous multiband monitoring of blazars is very limited, and most studies reported in the literature are conflicting, too. Although optical variability on intra-night timescales is now a well-established phenomenon for blazars, its relationship to long-term variability remains unclear. Possible clues could come from monitoring the optical spectrum for correlation with brightness. The presence or absence of a bluer color in the blazar color index, when its luminosity is increased on intra-night and inter-night timescales, can provide interesting clues to the origin of blazar variability from hourly to much longer timescales. Luminosity of blazars varies at all wavelengths over a variety of timescales. Various models have been proposed to explain blazar variability. However, the mechanism responsible for variability is not conclusively understood. One factor which can discriminate between the various variability models is that of color (spectral index) variations of blazars. This factor may help to better understand the mechanism of blazar variability. Therefore, it was initially proposed by the second author of this paper to the OHP observatory to carry out quasi-simultaneous multiband monitoring of one of the brightest blazars, Mrk 180. Fortunately, it was accepted by the scientific team of the observatory and 1.20 m telescope time was allocated to the project from 23 to 28 April 2009. Because of the weather conditions, we could only monitor this blazar for three nights. Raw data processing and data reduction were performed using the standard system of the European Southern Observatory, ESO-MIDAS. We considered two reference stars and measured the magnitudes of the reference stars and the blazar Mrk 180, and then plotted the light curves and the color index diagrams. The light curves showed the optical variations of the blazar. The maximum amplitude of its variations was 0.185 mag for the V filter. Investigating the blazar color index shows its

  12. User's manual for the UNDERDOG [Underground Nuclear Depository Evaluation, Reduction, and Detailed Output Generator] data reduction software

    International Nuclear Information System (INIS)

    Ball, J.R.; Shepard, L.K.

    1987-12-01

    UNDERDOG is a computer program that aids experimentalists in the process of data reduction. This software allows a user to reduce, extract, and generate displays of data collected at the WIPP site. UNDERDOG contains three major functional components: a Data Reduction package, a Data Analysis interface, and a Publication-Quality Output generator. It also maintains audit trails of all actions performed for quality assurance purposes and provides mechanisms which control an individual's access to the data. UNDERDOG was designed to run on a Digital Equipment Corporation VAX computer using the VMS operating system. 8 refs., 24 figs., 2 tabs

  13. Using Data Reduction Methods To Predict Quality Of Life In Breast ...

    African Journals Online (AJOL)

    Background: Quality of life study has an important role in health care, especially in chronic diseases, in clinical judgment and in medical resources supplying. Statistical tools like linear regression are widely used to assess the predictors of quality of life. But usually there exist a lot of factors that cause difficulty in fitting the models ...

  14. Using the electronic health record to build a culture of practice safety: evaluating the implementation of trigger tools in one general practice.

    Science.gov (United States)

    Margham, Tom; Symes, Natalie; Hull, Sally A

    2018-04-01

    Identifying patients at risk of harm in general practice is challenging for busy clinicians. In UK primary care, trigger tools and case note reviews are mainly used to identify rates of harm in sample populations. This study explores how adaptations to existing trigger tool methodology can identify patient safety events and engage clinicians in ongoing reflective work around safety. Mixed-method quantitative and narrative evaluation using thematic analysis in a single East London training practice. The project team developed and tested five trigger searches, supported by Excel worksheets to guide the case review process. Project evaluation included summary statistics of completed worksheets and a qualitative review focused on ease of use, barriers to implementation, and perception of value to clinicians. Trigger searches identified 204 patients for GP review. Overall, 117 (57%) of cases were reviewed and 62 (53%) of these cases had patient safety events identified. These were usually incidents of omission, including failure to monitor or review. Key themes from interviews with practice members included the fact that GPs' work is generally reactive and GPs welcomed an approach that identified patients who were 'under the radar' of safety. All GPs expressed concern that the tool might identify too many patients at risk of harm, placing further demands on their time. Electronic trigger tools can identify patients for review in domains of clinical risk for primary care. The high yield of safety events engaged clinicians and provided validation of the need for routine safety checks. © British Journal of General Practice 2018.

  15. Is OperaVOX a clinically useful tool for the assessment of voice in a general ENT clinic?

    Directory of Open Access Journals (Sweden)

    Richard Teck Kee Siau

    2017-04-01

    Full Text Available Abstract Background: Objective acoustic analysis is a key component of multidimensional voice assessment. OperaVOX is an iOS app which has been shown to be comparable to the Multi-Dimensional Voice Program for most principal measures of vocal function. As a relatively cheap, portable and easily accessible form of acoustic analysis, OperaVOX may be more clinically useful than laboratory-based software in many situations. This study aims to determine whether correlation exists between acoustic measurements obtained using OperaVOX and perceptual evaluation of voice. Methods: Forty-four voices from the multidisciplinary voice clinic were examined. Each voice was assessed blindly by a single experienced voice therapist using the GRBAS scale, and analysed using OperaVOX. The Spearman rank correlation coefficient was calculated between each element of the GRBAS scale and the acoustic measurements obtained by OperaVOX. Results: Significant correlations were identified between GRBAS scores and OperaVOX parameters. Grade correlated significantly with jitter (ρ = 0.495, p < 0.05), shimmer (ρ = 0.385, p < 0.05), noise-to-harmonic ratio (NHR; ρ = 0.526, p < 0.05) and maximum phonation time (MPT; ρ = −0.415, p < 0.05). Roughness did not correlate with any of the measured variables. Breathiness correlated significantly with jitter (ρ = 0.342, p < 0.05), NHR (ρ = 0.344, p < 0.05) and MPT (ρ = −0.336, p < 0.05). Asthenia correlated with NHR (ρ = 0.413, p < 0.05) and MPT (ρ = −0.399, p < 0.05). Strain correlated with jitter (ρ = 0.560, p < 0.05), NHR (ρ = 0.600, p < 0.05) and MPT (ρ = −0.356, p < 0.05). Conclusions: OperaVOX provides objective acoustic analysis which has shown statistically significant correlation to perceptual evaluation using the GRBAS scale. The accessibility of the software package makes it possible for a wide range of health practitioners, e.g. general ENT

  16. Improving the effectiveness of ecological site descriptions: General state-and-transition models and the Ecosystem Dynamics Interpretive Tool (EDIT)

    Science.gov (United States)

    Bestelmeyer, Brandon T.; Williamson, Jeb C.; Talbot, Curtis J.; Cates, Greg W.; Duniway, Michael C.; Brown, Joel R.

    2016-01-01

    State-and-transition models (STMs) are useful tools for management, but they can be difficult to use and have limited content. STMs created for groups of related ecological sites could simplify and improve their utility. The amount of information linked to models can be increased using tables that communicate management interpretations and important within-group variability. We created a new web-based information system (the Ecosystem Dynamics Interpretive Tool) to house STMs, associated tabular information, and other ecological site data and descriptors. Fewer, more informative, better organized, and easily accessible STMs should increase the accessibility of science information.

  17. Semi-structured interview is a reliable and feasible tool for selection of doctors for general practice specialist training

    DEFF Research Database (Denmark)

    Isaksen, Jesper; Hertel, Niels Thomas; Kjær, Niels Kristian

    2013-01-01

    In order to optimise the selection process for admission to specialist training in family medicine, we developed a new design for structured applications and selection interviews. The design contains semi-structured interviews, which combine individualised elements from the applications with standardised behaviour-based questions. This paper describes the design of the tool, and offers reflections concerning its acceptability, reliability and feasibility.

  18. Semi-structured interview is a reliable and feasible tool for selection of doctors for general practice specialist training.

    Science.gov (United States)

    Isaksen, Jesper Hesselbjerg; Hertel, Niels Thomas; Kjær, Niels Kristian

    2013-09-01

    In order to optimise the selection process for admission to specialist training in family medicine, we developed a new design for structured applications and selection interviews. The design contains semi-structured interviews, which combine individualised elements from the applications with standardised behaviour-based questions. This paper describes the design of the tool, and offers reflections concerning its acceptability, reliability and feasibility. We used a combined quantitative and qualitative evaluation method. Ratings obtained by the applicants in two selection rounds were analysed for reliability and generalisability using the GENOVA programme. Applicants and assessors were randomly selected for individual semi-structured in-depth interviews. The qualitative data were analysed in accordance with the grounded theory method. Quantitative analysis yielded a high Cronbach's alpha of 0.97 for the first round and 0.90 for the second round, and a G coefficient of 0.74 for the first round and 0.40 for the second round. Qualitative analysis demonstrated high acceptability and fairness, and it improved the assessors' judgment. Applicants reported concerns about loss of personality and some anxiety. The applicants' ability to reflect on their competences was important. The developed selection tool demonstrated an acceptable level of reliability, but only moderate generalisability. The users found that the tool provided a high degree of acceptability; it is a feasible and useful tool for selection of doctors for specialist training if combined with work-based assessment. Studies on the benefits and drawbacks of this tool compared with other selection models are relevant.

  19. [Psychoprophylaxis in elective paediatric general surgery: do audiovisual tools improve the perioperative anxiety in children and their families?]

    Science.gov (United States)

    Álvarez García, N; Gómez Palacio, V; Siles Hinojosa, A; Gracia Romero, J

    2017-10-25

    Surgery is considered a stressful experience for children and their families who undergo elective procedures. Different tools have been developed to improve perioperative anxiety. Our objective is to demonstrate whether audiovisual psychoprophylaxis reduces anxiety linked to paediatric surgery. A randomized prospective case-control study was carried out in children aged 4-15 who underwent surgery in a Paediatric Surgery Department. We excluded patients with surgical backgrounds, severe illness or non-elective procedures. Simple randomization was performed and cases watched a video before being admitted, under medical supervision. Trait and state anxiety levels were measured using the STAI-Y2 and STAI-C tests, and VAS in children under 6 years old, at admission and at discharge. 100 patients (50 cases/50 controls) were included; mean age at diagnosis was 7.98 and 7.32 years, respectively. Orchiopexy was the most frequent surgery performed in both groups. State anxiety levels of parents were lower in the case group (36.06 vs 39.93, p = 0.09 in fathers; 38.78 vs 40.34, p = 0.43 in mothers). At discharge, anxiety levels in children aged > 6 were significantly lower among cases (26.84 vs 32.96, p < 0.05). The use of audiovisual psychoprophylaxis tools shows a clinically relevant improvement in perioperative anxiety, both in children and their parents. Our results are similar to those reported by other authors, supporting these tools as a beneficial strategy for the family.

  20. The Potential of Web 2.0 Tools to Promote Reading Engagement in a General Education Course

    Science.gov (United States)

    Park, Seung Won

    2013-01-01

    General education classes involve extensive course readings. College instructors have a limited time to cover every detail of the materials students are supposed to learn in class; thus, they expect students to learn through course readings. However, many college students demonstrate a low level of engagement in course reading tasks. Existing…

  1. Hypersonic research engine project. Phase 2: Aerothermodynamic Integration Model (AIM) data reduction computer program, data item no. 54.16

    Science.gov (United States)

    Gaede, A. E.; Platte, W. (Editor)

    1975-01-01

    The data reduction program used to analyze the performance of the Aerothermodynamic Integration Model is described. Routines to acquire, calibrate, and interpolate the test data, to calculate the axial components of the pressure area integrals and the skin function coefficients, and to report the raw data in engineering units are included along with routines to calculate flow conditions in the wind tunnel, inlet, combustor, and nozzle, and the overall engine performance. Various subroutines were modified and used to obtain species concentrations and transport properties in chemical equilibrium at each of the internal and external engine stations. It is recommended that future test plans include the configuration, calibration, and channel assignment data on a magnetic tape generated at the test site immediately before or after a test, and that the data reduction program be designed to operate in a batch environment.

  2. Towards a responsive and interactive graphical user interface for neutron data reduction and visualization

    International Nuclear Information System (INIS)

    Chatterjee, Alok; Worlton, T.; Hammonds, J.; Loong, C.K.; Mikkelson, D.; Mikkelson, R.; Chen, D.

    2001-01-01

    An Integrated Spectral Analysis Workbench, ISAW, has been developed at IPNS with the goal of providing a flexible and powerful tool to visualize and analyze neutron scattering time-of-flight data. The software, written in Java, is platform independent, object oriented and modular, making it easier to maintain and add features. The graphical user interface (GUI) for ISAW allows intuitive and interactive loading and manipulation of multiple spectra from different 'runs'. ISAW provides multiple displays of the spectra in a 'Runfile', and most of the functions can be performed through the GUI menu bar as well as through command scripts. All displays are simultaneously updated when the data are changed, using the Observable-Observer design pattern: all displays are observers of the DataSet (observable) and respond to changes or selections in it simultaneously. A 'tree' display of the spectra in run files is provided for a detailed view of detector elements and easy selection of spectra. The operations menu is instrument sensitive, so that it displays the appropriate set of operators. Automatic menu generation is made possible by the ability of the DataSet objects to furnish a list of operations contained in the particular DataSet selected at the time the menu bar is accessed. The transformed and corrected data can be saved to disk in different file formats for further analyses (e.g., GSAS for structure refinement). (author)
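
    The Observable-Observer update scheme described above can be illustrated in a few lines (ISAW itself is written in Java; the class names below are only loosely modeled on the description):

        class DataSet:
            """Observable: notifies all registered displays when data change."""
            def __init__(self):
                self._observers = []
                self.spectra = {}

            def add_observer(self, obs):
                self._observers.append(obs)

            def update_spectrum(self, run_id, values):
                self.spectra[run_id] = values
                for obs in self._observers:   # every display refreshes at once
                    obs.notify(self, run_id)

        class TreeDisplay:
            def notify(self, dataset, run_id):
                print(f"tree view: run {run_id} has {len(dataset.spectra[run_id])} points")

        class PlotDisplay:
            def notify(self, dataset, run_id):
                print(f"plot view: redrawing run {run_id}")

        ds = DataSet()
        ds.add_observer(TreeDisplay())
        ds.add_observer(PlotDisplay())
        ds.update_spectrum("run_1234", [1.0, 2.0, 3.5])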

  3. Validity of silhouette showcards as a measure of body size and obesity in a population in the African region: A practical research tool for general-purpose surveys.

    Science.gov (United States)

    Yepes, Maryam; Viswanathan, Barathi; Bovet, Pascal; Maurer, Jürgen

    2015-01-01

    The purpose of this study is to validate the Pulvers silhouette showcard as a measure of weight status in a population in the African region. This tool is particularly beneficial when scarce resources do not allow for direct anthropometric measurements due to limited survey time or lack of measurement technology in face-to-face general-purpose surveys or in mailed, online, or mobile device-based surveys. A cross-sectional study was conducted in the Republic of Seychelles with a sample of 1240 adults. We compared self-reported body sizes measured by Pulvers' silhouette showcards to four measurements of body size and adiposity: body mass index (BMI), body fat percent, waist circumference, and waist to height ratio. The accuracy of silhouettes as an obesity indicator was examined using sex-specific receiver operator curve (ROC) analysis, and the reliability of this tool to detect socioeconomic gradients in obesity was compared to BMI-based measurements. Our study supports silhouette body size showcards as a valid and reliable survey tool to measure self-reported body size and adiposity in an African population. The mean correlation coefficients of self-reported silhouettes with measured BMI were 0.80 in men and 0.81 in women (P < 0.001). These findings support the use of silhouette showcards in general-purpose surveys of obesity in social sciences, where limited resources do not allow for direct anthropometric measurements.

  4. Predictors for assessing electronic messaging between nurses and general practitioners as a useful tool for communication in home health care services: a cross-sectional study.

    Science.gov (United States)

    Lyngstad, Merete; Hofoss, Dag; Grimsmo, Anders; Hellesø, Ragnhild

    2015-02-17

    Nurses providing home health care services are dependent on access to patient information and communicating with general practitioners (GPs) to deliver safe and effective health care to patients. Information and communication technology (ICT) systems are viewed as powerful tools for this purpose. In Norway, a standardized electronic messaging (e-messaging) system is currently being established in health care. The aim of this study was to explore home health care nurses' assessments of the utility of the e-messaging system for communicating with GPs and identify elements that influence the assessment of e-messaging as a useful communication tool. The data were collected using a self-developed questionnaire based on variables identified by focus group interviews with home health care nurses (n=425) who used e-messaging and existing research. Data were analyzed using logistic regression analyses. Over two-thirds (425/632, 67.2%) of the home health care nurses returned the questionnaire. A high proportion (388/399, 97.2%) of the home health care nurses who returned the questionnaire found the e-messaging system to be a useful tool for communication with GPs. The odds of reporting that e-messaging was a useful tool were over five times higher (OR 5.1, CI 2.489-10.631, P<.001) if the nurses found e-messaging easy to use for communicating with GPs. By identifying these elements, it is easier to determine which interventions are the most important for the development and implementation of ICT systems in home health care services.

  5. On-line monitoring and data reduction of seismic events at Gauribidanur array

    International Nuclear Information System (INIS)

    Bharthur, R.N.; Rao, B.S.; Roy, F.

    1977-01-01

    Reduction of the threshold may improve the detection capability of the system, but it will lead to more spurious triggers. In order to overcome this problem, the nature of the spurious triggers is studied in detail. It is found that, in general, the cross-correlation coefficient between the two beams, viz. S^A and S^B, due to spurious triggers has a maximum value of 0.4, whereas the corresponding value for seismic events showed a minimum of 0.6. Therefore, with the incorporation of a programme which suppresses all triggers having a cross-correlation coefficient of 0.4 or less, it will be possible to further bring down the threshold level. (author)
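
    The discrimination rule lends itself to a direct implementation: compute the peak normalized cross-correlation coefficient of the two beam traces and suppress the trigger when it falls at or below 0.4. Below is a minimal Python sketch; the helper names and array inputs are assumptions, not the observatory's actual software.

```python
import numpy as np

def max_cross_correlation(beam_a, beam_b):
    """Peak normalized cross-correlation coefficient of two beam traces."""
    a = np.asarray(beam_a, dtype=float)
    b = np.asarray(beam_b, dtype=float)
    a = (a - a.mean()) / (a.std() * a.size)
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

def keep_trigger(beam_a, beam_b, threshold=0.4):
    """Suppress triggers whose beams correlate at or below ~0.4; genuine
    seismic events showed coefficients of at least ~0.6 in the study."""
    return max_cross_correlation(beam_a, beam_b) > threshold
```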

  6. Using mass media and the Internet as tools to diagnose hepatitis C infections in the general population.

    Science.gov (United States)

    Zuure, Freke R; Davidovich, Udi; Coutinho, Roel A; Kok, Gerjo; Hoebe, Christian J P A; van den Hoek, Anneke; Jansen, Peter L M; van Leeuwen-Gilbert, Paula; Verheuvel, Nicole C; Weegink, Christine J; Prins, Maria

    2011-03-01

    Many individuals with hepatitis C virus (HCV) infection are undiagnosed. This study describes the development and the use and outcomes of a mass media campaign, combined with an Internet risk assessment and an Internet-mediated blood-testing procedure for HCV to identify individuals infected with HCV in the general population. From April 2007 to December 2008, individuals in HCV risk groups were referred to an online, previously validated risk-assessment questionnaire at www.heptest.nl. Individuals at risk could download a referral letter for a free, anonymous HCV blood test in a nonclinical setting. Test results could be obtained online, 1 week later, using a personal log-in code. Anti-HCV-positive participants were requested to visit the Public Health Service for confirmation and RNA testing. Chronically HCV-infected individuals were referred for treatment. Data were analyzed in 2009-2010. The website attracted 40,902 visitors. Of the 9653 who completed the questionnaire, 2553 were at risk for HCV (26.4%). Main reported risk factors were a blood transfusion prior to 1992 and noninjecting drug use. Of the 1480 eligible for the blood test, 420 opted for testing (28%). HCV antibodies were detected in 3.6% (n=15, 95% CI=2.1%, 5.7%); of the 12 with a chronic HCV infection, six began treatment. Internet-mediated risk-based testing for HCV has proved to be a feasible and effective strategy to identify undiagnosed HCV infection in the general population. All HCV-infected individuals belonged to hard-to-reach populations. Test uptake was 28%, which is high for an online project that includes blood testing. Because Internet-mediated testing is low-cost, this strategy holds promise for future screening.

  7. Generalized additive models used to predict species abundance in the Gulf of Mexico: an ecosystem modeling tool.

    Directory of Open Access Journals (Sweden)

    Michael Drexler

    Full Text Available Spatially explicit ecosystem models of all types require an initial allocation of biomass, often in areas where fisheries-independent abundance estimates do not exist. A generalized additive modelling (GAM) approach is used to describe the abundance of 40 species groups (i.e., functional groups) across the Gulf of Mexico (GoM) using a large fisheries-independent data set (SEAMAP) and climate-scale oceanographic conditions. Predictor variables included in the model are chlorophyll a, sediment type, dissolved oxygen, temperature, and depth. Despite the presence of a large number of zeros in the data, a single GAM using a negative binomial distribution was suitable for making predictions of abundance for multiple functional groups. We present an example case study using pink shrimp (Farfantepenaeus duorarum) and compare the results to known distributions. The model successfully predicts the known areas of high abundance in the GoM, including areas where no data were input into the model fitting. Overall, the model reliably captures areas of high and low abundance for the large majority of functional groups observed in SEAMAP. This method allows for the objective setting of spatial distributions for numerous functional groups across a modeling domain, even where abundance data may not exist.
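
    A negative-binomial GAM of the kind described can be sketched with statsmodels in Python. The synthetic data frame, spline settings and dispersion parameter below are illustrative stand-ins for the SEAMAP predictors, not the authors' actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-in for SEAMAP-style tows (hypothetical values).
df = pd.DataFrame({
    "chlorophyll_a": rng.uniform(0.05, 3.0, n),
    "depth":         rng.uniform(5, 500, n),
    "temperature":   rng.uniform(14, 30, n),
})
# Zero-heavy counts: abundance peaks at shallow, warm, productive sites.
mu = np.exp(0.8 * np.log1p(df["chlorophyll_a"]) - 0.004 * df["depth"]
            + 0.08 * (df["temperature"] - 22))
df["abundance"] = rng.negative_binomial(1, 1.0 / (1.0 + mu))

# Penalized B-spline smooths for each continuous predictor.
smoother = BSplines(df[["chlorophyll_a", "depth", "temperature"]],
                    df=[6, 6, 6], degree=[3, 3, 3])

# Negative binomial family absorbs the overdispersion from the many zeros.
model = GLMGam.from_formula("abundance ~ 1", data=df, smoother=smoother,
                            alpha=[1.0, 1.0, 1.0],
                            family=sm.families.NegativeBinomial(alpha=1.0))
result = model.fit()
print(result.summary())
```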

  8. Observations of the Hubble Deep Field with the Infrared Space Observatory .1. Data reduction, maps and sky coverage

    DEFF Research Database (Denmark)

    Serjeant, S.B.G.; Eaton, N.; Oliver, S.J.

    1997-01-01

    We present deep imaging at 6.7 and 15 μm from the CAM instrument on the Infrared Space Observatory (ISO), centred on the Hubble Deep Field (HDF). These are the deepest integrations published to date at these wavelengths in any region of sky. We discuss the observational strategy and the data reduction. The observed source density appears to approach the CAM confusion limit at 15 μm, and fluctuations in the 6.7-μm sky background may be identifiable with similar spatial fluctuations in the HDF galaxy counts. ISO appears to be detecting comparable field galaxy populations to the HDF, and our...

  9. The Oncor Geodatabase for the Columbia Estuary Ecosystem Restoration Program: Handbook of Data Reduction Procedures, Workbooks, and Exchange Templates

    Energy Technology Data Exchange (ETDEWEB)

    Sather, Nichole K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Borde, Amy B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Diefenderfer, Heida L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Serkowski, John A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coleman, Andre M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Gary E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-12-01

    This Handbook of Data Reduction Procedures, Workbooks, and Exchange Templates is designed to support the Oncor geodatabase for the Columbia Estuary Ecosystem Restoration Program (CEERP). The following data categories are covered: water-surface elevation and temperature, sediment accretion rate, photo points, herbaceous wetland vegetation cover, tree plots and site summaries, fish catch and density, fish size, fish diet, fish prey, and Chinook salmon genetic stock identification. The handbook is intended for use by scientists collecting monitoring and research data for the CEERP. The ultimate goal of Oncor is to provide quality, easily accessible, geospatial data for synthesis and evaluation of the collective performance of CEERP ecosystem restoration actions at a program scale.

  10. Cultural adaptation into Spanish of the generalized anxiety disorder-7 (GAD-7 scale as a screening tool

    Directory of Open Access Journals (Sweden)

    Pérez-Páramo María

    2010-01-01

    Full Text Available Abstract Background Generalized anxiety disorder (GAD) is a prevalent mental health condition which is underestimated worldwide. This study carried out the cultural adaptation into Spanish of the 7-item self-administered GAD-7 scale, which is used to identify probable patients with GAD. Methods The adaptation was performed by an expert panel using a conceptual equivalence process, including forward and backward translations in duplicate. Content validity was assessed by interrater agreement. Criterion validity was explored using ROC curve analysis, and sensitivity, specificity, positive predictive value and negative predictive value for different cut-off values were determined. Concurrent validity was also explored using the HAM-A, HADS, and WHO-DAS-II scales. Results The study sample consisted of 212 subjects (106 patients with GAD) with a mean age of 50.38 years (SD = 16.76). Average completion time was 2'30''. No items of the scale were left blank. Floor and ceiling effects were negligible. No patients with GAD had to be assisted to fill in the questionnaire. The scale was shown to be one-dimensional through factor analysis (explained variance = 72%). A cut-off point of 10 showed adequate values of sensitivity (86.8%) and specificity (93.4%), with the AUC being statistically significant [AUC = 0.957-0.985; p < 0.001]. Limitations Elderly people, particularly the very old, may need some help to complete the scale. Conclusion After the cultural adaptation process, a Spanish version of the GAD-7 scale was obtained. The validity of its content and the relevance and adequacy of items in the Spanish cultural context were confirmed.

  11. DIDACTIC PRINCIPLES AND PSYCHOLOGICAL CHARACTERISTICS IN DEFINITION OF QUALITY OF SOFTWARE TOOLS FOR EDUCATIONAL PURPOSE IN THE GENERAL EDUCATIONAL ENVIRONMENT OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Maryna V. Pirko

    2011-02-01

    Full Text Available The fundamental feature of the economy of post-industrial society is knowledge, which represents the basic source of competitive advantage. The article considers and describes the set of didactic and psychological indicators used in research on achieving high quality in education and educational services. Attention is paid to the pedagogical requirements of the period, which provide the normative basis for estimating the quality of software tools for educational purposes in the general educational environment of Ukraine. The scheme of the internal model for maintaining the quality of software tools for educational purposes is considered, and the aspects integrated by this internal quality model are listed. The article describes directions of research under the conditions of the forming global international educational environment and the uniform information space of the education system, taking into account the growing availability of educational services. The main principles in the organization of pedagogical software tools are specified.

  12. GPs' knowledge, use, and confidence in national physical activity and health guidelines and tools: a questionnaire-based survey of general practice in England.

    Science.gov (United States)

    Chatterjee, Robin; Chapman, Tim; Brannan, Mike Gt; Varney, Justin

    2017-10-01

    Physical activity (PA) brief advice in health care is effective at getting individuals active. It has been suggested that one in four people would be more active if advised by a GP or nurse, but as many as 72% of GPs do not discuss the benefits of physical activity with patients. To assess the knowledge, use, and confidence in national PA and Chief Medical Officer (CMO) health guidelines and tools among GPs in England. Online questionnaire-based survey of self-selecting GPs in England that took place over a 10-day period in March 2016. The questionnaire consisted of six multiple-choice questions and was available on the Doctors.net.uk (DNUK) homepage. Quotas were used to ensure good regional representation. The final analysis included 1013 responses. Only 20% of responders were broadly or very familiar with the national PA guidelines. In all, 70% of GPs were aware of the General Practice Physical Activity Questionnaire (GPPAQ), but 26% were not familiar with any PA assessment tools, and 55% reported that they had not undertaken any training with respect to encouraging PA. The majority of GPs in England (80%) are unfamiliar with the national PA guidelines. Awareness of the recommended tool for assessment, GPPAQ, is higher than use by GPs. This may be because it is used by other clinical staff, for example, as part of the NHS Health Check programme. Although brief advice in isolation by GPs on PA will only be a part of the behaviour change journey, it is an important prompt, especially if repeated as part of routine practice. This study highlights the need for significant improvement in knowledge, skills, and confidence to maximise the potential for PA advice in GP consultations.

  13. TCF7L2 variant genotypes and type 2 diabetes risk in Brazil: significant association, but not a significant tool for risk stratification in the general population

    Directory of Open Access Journals (Sweden)

    Mill JG

    2008-12-01

    Full Text Available Abstract Background Genetic polymorphisms of the TCF7L2 gene are strongly associated with large increments in type 2 diabetes risk in different populations worldwide. In this study, we aimed to confirm the effect of the TCF7L2 polymorphism rs7903146 on diabetes risk in a Brazilian population and to assess the use of this genetic marker in improving diabetes risk prediction in the general population. Methods We genotyped the single nucleotide polymorphism (SNP) rs7903146 of the TCF7L2 gene in 560 patients with known coronary disease enrolled in the MASS II (Medicine, Angioplasty, or Surgery Study) Trial and in 1,449 residents of Vitoria, in Southeast Brazil. The associations of this gene variant with diabetes risk and metabolic characteristics in these two different populations were analyzed. To assess the potential benefit of using this marker for diabetes risk prediction in the general population, we analyzed the impact of this genetic variant on a validated diabetes risk prediction tool based on clinical characteristics developed for the Brazilian general population. Results SNP rs7903146 of the TCF7L2 gene was significantly associated with type 2 diabetes in the MASS-II population (OR = 1.57 per T allele, p = 0.0032), confirming previous reports from the literature in the Brazilian population. Addition of this polymorphism to an established clinical risk prediction score did not increase model accuracy (both areas under the ROC curve equal to 0.776). Conclusion The TCF7L2 rs7903146 T allele is associated with a 1.57-fold increased risk for type 2 diabetes in a Brazilian cohort of patients with known coronary heart disease. However, the inclusion of this polymorphism in a risk prediction tool developed for the general population resulted in no improvement of performance. This is the first study, to our knowledge, to confirm this recent association in a South American population, and it adds to the great consistency of this finding in studies around the world.
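
    The AUC comparison at the heart of the conclusion is easy to reproduce in outline. Below is a hedged Python sketch with synthetic data; all numbers are illustrative, and scikit-learn stands in for whatever software the authors used.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
# Synthetic stand-ins for a clinical risk score and rs7903146 T-allele count (0/1/2).
clinical_score = rng.normal(0.0, 1.0, n)
t_alleles = rng.binomial(2, 0.3, n)
# Disease risk driven mostly by the clinical score, modestly by genotype
# (log-odds ~ log(1.57) per T allele, matching the reported OR).
logit = -1.5 + 1.2 * clinical_score + np.log(1.57) * t_alleles
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_clin = clinical_score.reshape(-1, 1)
X_both = np.column_stack([clinical_score, t_alleles])
Xc_tr, Xc_te, Xb_tr, Xb_te, y_tr, y_te = train_test_split(
    X_clin, X_both, y, test_size=0.3, random_state=0)

auc_clin = roc_auc_score(y_te, LogisticRegression().fit(Xc_tr, y_tr).predict_proba(Xc_te)[:, 1])
auc_both = roc_auc_score(y_te, LogisticRegression().fit(Xb_tr, y_tr).predict_proba(Xb_te)[:, 1])
print(f"AUC clinical only: {auc_clin:.3f}   AUC clinical + SNP: {auc_both:.3f}")
```

    With a weak marker and a strong clinical score, the two AUCs typically come out nearly identical, which is the pattern the study reports.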

  14. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Johnson, M.

    2011-01-01

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources

  15. A two-domain real-time algorithm for optimal data reduction: A case study on accelerator magnet measurements

    CERN Document Server

    Arpaia, P; Inglese, V

    2010-01-01

    A real-time algorithm of data reduction, based on the combination of two lossy techniques specifically optimized for high-rate magnetic measurements in two domains (e.g. time and space), is proposed. The first technique exploits an adaptive sampling rule based on the power estimation of the flux increments in order to optimize the information to be gathered for magnetic field analysis in real time. The tracking condition is defined by the target noise level in the Nyquist band required by the post-processing procedure of magnetic analysis. The second technique uses a data reduction algorithm in order to improve the compression ratio while preserving the consistency of the measured signal. The allowed loss is set equal to the random noise level in the signal in order to force the loss and the noise to cancel rather than to add, by improving the signal-to-noise ratio. Numerical analysis and experimental results of on-field performance characterization and validation for two case studies of magnetic measurement systems are reported.

  16. A two-domain real-time algorithm for optimal data reduction: a case study on accelerator magnet measurements

    International Nuclear Information System (INIS)

    Arpaia, Pasquale; Buzio, Marco; Inglese, Vitaliano

    2010-01-01

    A real-time algorithm of data reduction, based on the combination of two lossy techniques specifically optimized for high-rate magnetic measurements in two domains (e.g. time and space), is proposed. The first technique exploits an adaptive sampling rule based on the power estimation of the flux increments in order to optimize the information to be gathered for magnetic field analysis in real time. The tracking condition is defined by the target noise level in the Nyquist band required by the post-processing procedure of magnetic analysis. The second technique uses a data reduction algorithm in order to improve the compression ratio while preserving the consistency of the measured signal. The allowed loss is set equal to the random noise level in the signal in order to force the loss and the noise to cancel rather than to add, by improving the signal-to-noise ratio. Numerical analysis and experimental results of on-field performance characterization and validation for two case studies of magnetic measurement systems for testing magnets of the Large Hadron Collider at the European Organization for Nuclear Research (CERN) are reported
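
    The second technique, setting the allowed compression loss equal to the random noise level, can be illustrated with a simple send-on-delta style sketch in Python. This is a simplified reading of the idea, not the authors' exact algorithm.

```python
import numpy as np

def noise_matched_reduce(t, x, noise_rms):
    """Keep a sample only when it departs from the last kept value by more
    than the random-noise level, so the compression loss hides inside the
    noise instead of adding to it."""
    keep = [0]
    for i in range(1, len(x)):
        if abs(x[i] - x[keep[-1]]) > noise_rms:
            keep.append(i)
    keep = np.asarray(keep)
    return t[keep], x[keep]

# Toy flux-like signal: slow ramp plus measurement noise.
t = np.linspace(0.0, 1.0, 10_000)
x = 0.5 * t**2 + np.random.default_rng(1).normal(0.0, 1e-4, t.size)
t_red, x_red = noise_matched_reduce(t, x, noise_rms=1e-4)
print(f"compression ratio: {t.size / t_red.size:.1f}:1")
```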

  17. Comparative efficacy of the generalized anxiety disorder 7-item scale and the Edinburgh Postnatal Depression Scale as screening tools for generalized anxiety disorder in pregnancy and the postpartum period.

    Science.gov (United States)

    Simpson, William; Glazer, Melanie; Michalski, Natalie; Steiner, Meir; Frey, Benicio N

    2014-08-01

    About 24.1% of pregnant women suffer from at least 1 anxiety disorder, 8.5% of whom suffer specifically from generalized anxiety disorder (GAD). GAD is often associated with major depressive disorder (MDD). During the perinatal period, the presence of physical and somatic symptoms often makes differentiation between depression and anxiety more challenging. To date, no screening tools have been developed to detect GAD in the perinatal population. We investigated the psychometric properties of the GAD 7-item Scale (GAD-7) as a screening tool for GAD in pregnant and postpartum women. Two hundred and forty perinatal women (n = 155 pregnant and n = 85 postpartum) referred for psychiatric consultation were enrolled. On the day of initial assessment, all women completed the GAD-7 and the Edinburgh Postnatal Depression Scale (EPDS). Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition-based diagnoses were made by experienced psychiatrists. Scores from the GAD-7 and EPDS were compared with the clinical diagnoses to evaluate the psychometric properties of the GAD-7 and EPDS when used as a screening tool for GAD. The GAD-7 yielded a sensitivity of 61.3% and specificity of 72.7% at an optimal cut-off score of 13. Compared with the EPDS and the EPDS-3A subscale, the GAD-7 displayed greater accuracy and specificity over a greater range of cut-off scores and more accurately identified GAD in patients with comorbid MDD. Our findings suggest that the GAD-7 represents a clinically useful scale for the detection of GAD in perinatal women.

  18. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    Science.gov (United States)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  19. Radiological and nuclear terrorism: within the framework of a re-thought planning system, specific tools are complementary to the general purpose instrument

    International Nuclear Information System (INIS)

    Bekaert, E.

    2006-01-01

    Since 2001, France has been engaged in the revision of its emergency planning system in order to improve its response to a major crisis in the field of civil defence and protection. On one side, Vigipirate, which deals with vigilance and prevention, and the response plans of the Pirate family, particularly Piratom, which is intended to respond to radiological or nuclear terrorist events, are specific and up-to-date tools to deal with the issues of hyper-terrorism. On the other side, the new ORSEC system, the latest evolution of one of the main pillars of global public protection against risks and threats of all kinds, provides the modern methods of emergency planning which allow the response to the risks and threats of our modern world to be considered in the most pragmatic way. This coherent planning system establishes the general framework for the action of the responders, including medical resources. (author)

  20. General Mission Analysis Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Overview: GMAT is a feature-rich system containing high-fidelity space system models, optimization and targeting, built-in scripting and programming infrastructure, and...

  1. Predictors for Assessing Electronic Messaging Between Nurses and General Practitioners as a Useful Tool for Communication in Home Health Care Services: A Cross-Sectional Study

    Science.gov (United States)

    Hofoss, Dag; Grimsmo, Anders; Hellesø, Ragnhild

    2015-01-01

    Background Nurses providing home health care services are dependent on access to patient information and communicating with general practitioners (GPs) to deliver safe and effective health care to patients. Information and communication technology (ICT) systems are viewed as powerful tools for this purpose. In Norway, a standardized electronic messaging (e-messaging) system is currently being established in health care. Objective The aim of this study was to explore home health care nurses’ assessments of the utility of the e-messaging system for communicating with GPs and identify elements that influence the assessment of e-messaging as a useful communication tool. Methods The data were collected using a self-developed questionnaire based on variables identified by focus group interviews with home health care nurses (n=425) who used e-messaging and existing research. Data were analyzed using logistic regression analyses. Results Over two-thirds (425/632, 67.2%) of the home health care nurses returned the questionnaire. A high proportion (388/399, 97.2%) of the home health care nurses who returned the questionnaire found the e-messaging system to be a useful tool for communication with GPs. The odds of reporting that e-messaging was a useful tool were over five times higher (OR 5.1, CI 2.489-10.631, P<.001) if the nurses experienced e-messaging as easy to use. The odds of finding e-messaging easy to use were nearly seven times higher (OR 6.9, CI 1.713-27.899, P=.007) if the nurses did not consider the system functionality poor. If the nurses had received training in the use of e-messaging, the odds were over six times higher (OR 6.6, CI 2.515-17.437, P<.001) that they would find e-messaging easy to use. The odds that a home health care nurse would experience e-messaging as easy to use increased as the full-time equivalent percentage of the nurses increased (OR 1.032, CI 1.001-1.064, P=.045). Conclusions This study has shown that technical (ease of use and system functionality), organizational (training), and individual (full-time equivalent percentage) elements influence how home health care nurses assess e-messaging as a communication tool.

  2. Useful tool for general practitioners, home health care nurses and social workers in assessing determinants of the health status and treatment of patients visited in their homes

    Directory of Open Access Journals (Sweden)

    Andrzej Brodziak

    2012-09-01

    Full Text Available The necessity to distinguish between the traditional model of data acquisition, in which information is reported by the patient in the doctor's office, and the more valuable and desired model of becoming acquainted with the core of the problem at the patient's domicile, is emphasized. In the desired model it is possible to come across various determinants of health during home visits. Family members can be approached, and it is possible to evaluate the relationships between the patient and his loved ones. One can visually assess the patient's living conditions and predictable environmental hazards. For several years, the desired model has been put into practice by general practitioners and home health care nurses. Recently this model has also been promoted by "health care therapists" who are members of "teams of home health care". The authors, being convinced of the merits of the "home and environmental model" of practical medicine, have developed a method of recording and illustrating the data collected during visits to the patient's home. The elaborated tool helps to communicate and exchange information among general practitioners, home health care nurses, social workers of primary health care centers and specialists. The method improves the formulation of plans for further therapeutic steps and remedial interventions in the psycho-social relations and living conditions of patients.

  3. DrSPINE - New approach to data reduction and analysis for neutron spin echo experiments from pulsed and reactor sources

    International Nuclear Information System (INIS)

    Zolnierczuk, P.A.; Ohl, M.; Holderer, O.; Monkenbusch, M.

    2015-01-01

    The neutron spin echo (NSE) method at a pulsed neutron source presents new challenges for data reduction and analysis compared to instruments installed at reactor sources. The main advantage of pulsed-source NSE is the ability to resolve the neutron wavelength and collect neutrons over a wider bandwidth. This allows us to determine the symmetry phase more precisely and to measure data for several Q-values at the same time. Based on the experience gained at the SNS NSE - the first, and to date the only, NSE instrument installed at a pulsed spallation source - we propose a novel and unified approach to NSE data processing called DrSPINE. The goals of the DrSPINE project are: -) exploit the better symmetry phase determination afforded by the broader bandwidth at a pulsed source; -) take advantage of the larger Q coverage of TOF instruments; -) use objective statistical criteria to get the echo fits right; -) provide robust reduction with report generation; -) incorporate absolute instrument calibration; and -) allow for background subtraction. The software must be able to read data from various instruments, perform data integrity, consistency and compatibility checks, and combine data from compatible sets, partial scans, etc. We chose to provide a console-based interface with the ability to process macros (scripts) for batch evaluation. Last but not least, a good software package has to provide adequate documentation. DrSPINE software is currently under development.

  4. Data reduction pipeline for the CHARIS integral-field spectrograph I: detector readout calibration and data cube extraction

    Science.gov (United States)

    Brandt, Timothy D.; Rizzo, Maxime; Groff, Tyler; Chilcote, Jeffrey; Greco, Johnny P.; Kasdin, N. Jeremy; Limbach, Mary Anne; Galvin, Michael; Loomis, Craig; Knapp, Gillian; McElwain, Michael W.; Jovanovic, Nemanja; Currie, Thayne; Mede, Kyle; Tamura, Motohide; Takato, Naruhisa; Hayashi, Masahiko

    2017-10-01

    We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or χ² fitting. We measure and apply both a detector flatfield and a lenslet flatfield and reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a χ²-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the χ² extraction allows us to model and remove correlated read noise, dramatically improving CHARIS's performance. The χ² extraction produces a data cube that has been deconvolved with the line-spread function and never performs any interpolations of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. CHARIS's software is parallelized, written in Python and Cython, and freely available on github with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
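
    The χ²-based extraction amounts to a weighted linear least-squares fit of each lenslet's postage stamp against the measured monochromatic PSF templates. Below is a minimal numpy sketch of that idea; the array shapes and names are assumptions, and the real pipeline adds regularization and considerable bookkeeping.

```python
import numpy as np

def chi2_extract(stamp, ivar, psflets):
    """Weighted linear least squares: model the detector cutout around one
    lenslet as sum_k c_k * PSF_k and solve for the per-channel fluxes c_k.

    stamp   : (ny, nx) detector cutout (from the ramp image)
    ivar    : (ny, nx) inverse variance per pixel
    psflets : (nlam, ny, nx) measured monochromatic lenslet PSFs
    Returns fluxes and their 1-sigma uncertainties per wavelength channel.
    """
    nlam = psflets.shape[0]
    A = psflets.reshape(nlam, -1).T          # (npix, nlam) design matrix
    w = ivar.ravel()
    b = stamp.ravel()
    AtWA = A.T @ (w[:, None] * A)            # normal equations, weighted
    AtWb = A.T @ (w * b)
    cov = np.linalg.inv(AtWA)                # covariance of the fluxes
    flux = cov @ AtWb
    return flux, np.sqrt(np.diag(cov))
```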

  5. Youth Actuarial Risk Assessment Tool (Y-ARAT): The development of an actuarial risk assessment instrument for predicting general offense recidivism on the basis of police records.

    Science.gov (United States)

    van der Put, Claudia E

    2014-06-01

    Estimating the risk for recidivism is important for many areas of the criminal justice system. In the present study, the Youth Actuarial Risk Assessment Tool (Y-ARAT) was developed for juvenile offenders based solely on police records, with the aim to estimate the risk of general recidivism among large groups of juvenile offenders by police officers without clinical expertise. On the basis of the Y-ARAT, juvenile offenders are classified into five risk groups based on (combinations of) 10 variables including different types of incidents in which the juvenile was a suspect, total number of incidents in which the juvenile was a suspect, total number of other incidents, total number of incidents in which co-occupants at the youth's address were suspects, gender, and age at first incident. The Y-ARAT was developed on a sample of 2,501 juvenile offenders and validated on another sample of 2,499 juvenile offenders, showing moderate predictive accuracy (area under the receiver-operating-characteristic curve = .73), with little variation between the construction and validation sample. The predictive accuracy of the Y-ARAT was considered sufficient to justify its use as a screening instrument for the police.

  6. A Generalized Correlation Plot Package for the CEBAF Control System

    International Nuclear Information System (INIS)

    D. Wu; W. Akers; S. Schaffner; H. Shoaee; W. A. Watson; D. Wetherholt

    1996-01-01

    The Correlation Package is a general facility for data acquisition and analysis, serving as an online environment for performing a wide variety of machine physics experiments and engineering diagnostics. Typical correlation experiments consist of an initial set of actions followed by stepping one or two accelerator parameters while measuring up to several hundred control system parameters. The package utilizes the CDEV [1] device API to access accelerator systems. A variety of analysis and graphics tools are included through integration with the Matlab math modeling package. A post-acquisition script capability is available to automate the data reduction process. A callable interface allows this facility to serve as the data acquisition and analysis engine for high-level applications. A planned interface to archived accelerator data will allow the same analysis and graphics tools to be used for viewing and correlating history data. The object-oriented design and C++ implementation details, as well as the current status of the Correlation Package, will be presented.
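
    The stepping-while-measuring structure of a correlation experiment reduces to a simple loop. In this Python sketch, the control-system accessors are hypothetical placeholders; they stand in for the CDEV calls the real package uses, whose API is not reproduced here.

```python
import time
import numpy as np

def correlation_scan(set_param, read_channels, values, settle_s=0.5):
    """Step one accelerator parameter and record many monitor channels at
    each step -- the basic shape of a correlation-plot experiment.
    `set_param` and `read_channels` are injected callables standing in for
    real control-system access."""
    rows = []
    for v in values:
        set_param(v)             # step the knob
        time.sleep(settle_s)     # let the machine settle
        rows.append([v, *read_channels()])
    return np.array(rows)        # column 0: stepped value; rest: readings

# Example with dummy callables in place of real device access:
readings = correlation_scan(lambda v: None,
                            lambda: [np.random.normal(), np.random.normal()],
                            values=np.linspace(0.0, 1.0, 11), settle_s=0.0)
print(readings.shape)            # (11, 3)
```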

  7. The Panchromatic High-Resolution Spectroscopic Survey of Local Group Star Clusters. I. General data reduction procedures for the VLT/X-shooter UVB and VIS arm

    NARCIS (Netherlands)

    Schönebeck, Frederik; Puzia, Thomas H.; Pasquali, Anna; Grebel, Eva K.; Kissler-Patig, Markus; Kuntschner, Harald; Lyubenova, Mariya; Perina, Sibilla

    2014-01-01

    Aims: Our dataset contains spectroscopic observations of 29 globular clusters in the Magellanic Clouds and the Milky Way, performed with VLT/X-shooter over eight full nights. To derive robust results, instrument and pipeline systematics have to be well understood and properly modeled. We aim at a

  8. Development of the online data reduction system and feasibility studies of 6-layer tracking for the Belle II pixel detector

    Energy Technology Data Exchange (ETDEWEB)

    Muenchow, David

    2015-04-24

    The Belle II experiment, the upgrade of the Belle experiment, at KEK (High Energy Accelerator Research Organization) in Tsukuba, Japan, will be built to answer fundamental questions that are not covered by the Standard Model of particle physics. For this reason, decays should be observed with high precision. To be able to measure all decay products with a very accurate vertex resolution, it was decided to add a Pixel Detector (PXD) with an inner radius of only 14 mm in short distance around the beam (outer radius 12.5 mm). This increases the vertex resolution, and it is possible to improve the reconstruction efficiency and accuracy. Because of the short distance to the interaction point, we expect a background-induced occupancy of up to 3% on the pixel detector. This generates an expected data rate of about 20 GB/s and exceeds the bandwidth limitations of the data storage. Based on hits in the outer detectors, back projections of particle tracks are performed and Regions of Interest (ROI) on the PXD sensors are calculated. Based on those ROIs the data are reduced. In this thesis I present my development of the ROI-based data reduction algorithm as well as my feasibility studies of a future 6-layer tracking. Online Data Reduction for Belle II: A first test with the whole DAQ integration and prototype sensors of PXD and SVD was performed at DESY. For the verification of the ROI selection logic, a full recording of input and output data was included. With this setup I recorded 1.2×10^6 events containing in total 4.8×10^8 hits. The occupancy of originally ∼0.80% was reduced with my ROI selection logic by a factor of 6.9 to ∼0.12% by rejecting all hits outside any ROI. In addition I investigated the ROI positioning and found a distance between ROI center and hit of 17.624±0.029, with a main offset direction of π/2 and 3π/2. With a more accurate position of the ROIs, their size could be reduced, which would optimize the
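
    The ROI selection itself is conceptually simple: a hit survives only if it falls inside at least one rectangle projected back onto the pixel sensor. Below is a minimal numpy sketch; the coordinate conventions are assumptions, not the actual Belle II data format.

```python
import numpy as np

def roi_filter(hits, rois):
    """Keep only pixel hits that fall inside at least one Region of Interest.

    hits : (n, 2) array of (row, col) pixel coordinates
    rois : list of (row_min, row_max, col_min, col_max) rectangles projected
           from outer-detector track candidates back onto the sensor
    """
    keep = np.zeros(len(hits), dtype=bool)
    for r0, r1, c0, c1 in rois:
        inside = ((hits[:, 0] >= r0) & (hits[:, 0] <= r1) &
                  (hits[:, 1] >= c0) & (hits[:, 1] <= c1))
        keep |= inside          # union over all ROIs
    return hits[keep]

hits = np.array([[10, 20], [100, 200], [55, 60]])
rois = [(0, 50, 0, 50), (50, 60, 55, 65)]
print(roi_filter(hits, rois))   # the first and third hits survive
```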

  9. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one...
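
    Radial averaging, the core of the reduction chain described here, is compact to express with numpy's bincount. The sketch below assumes raw photon counts and an illustrative flat-detector geometry; it is not BioXTAS RAW's actual code.

```python
import numpy as np

def radial_average(image, mask, center, pixel_size, det_dist, wavelength):
    """Azimuthally average a 2D SAXS pattern into I(q) with Poisson errors.
    Geometry arguments (pixel size, detector distance, wavelength) need only
    be in mutually consistent units."""
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - center[0], y - center[1])          # radius in pixels
    theta = 0.5 * np.arctan(r * pixel_size / det_dist)  # half scattering angle
    q = 4.0 * np.pi * np.sin(theta) / wavelength
    rbin = r.astype(int).ravel()
    ok = mask.astype(bool).ravel()                      # True = usable pixel
    counts = np.bincount(rbin[ok], weights=image.ravel()[ok])
    npix = np.bincount(rbin[ok])
    good = npix > 0                                     # skip empty rings
    q_mean = np.bincount(rbin[ok], weights=q.ravel()[ok])[good] / npix[good]
    intensity = counts[good] / npix[good]
    error = np.sqrt(counts[good]) / npix[good]          # Poisson error on mean
    return q_mean, intensity, error
```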

  10. [General principles of database storage operations with emphasis on OLAP reports].

    Science.gov (United States)

    Borkowski, Włodzimierz; Mielniczuk, Hanna

    2004-01-01

    The article presents general principles and features of data warehouses, in particular OLAP reports. The data warehouse was built using Oracle tools, and the repository was filled with death records from the Central Office of Statistics. Various features relevant to epidemiological analyses are discussed and illustrated, such as pivoting and rotating dimensions, drilling through hierarchical data, and reduction of dimensions. The possibility of creating epidemiology-specific indicators is shown. The need to implement data warehouses and OLAP reports in Polish healthcare is discussed. Compared with the traditional manner of analyzing and presenting epidemiological facts, OLAP reports offer new perspectives.
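
    The pivot and roll-up operations mentioned above can be illustrated outside Oracle as well. Below is a small pandas sketch over a hypothetical death-record fact table (all column names and values are invented for illustration):

```python
import pandas as pd

# Toy death-record fact table (hypothetical columns and values).
deaths = pd.DataFrame({
    "year":   [2001, 2001, 2001, 2002, 2002, 2002],
    "region": ["A", "A", "B", "A", "B", "B"],
    "cause":  ["C00", "I20", "I20", "C00", "C00", "I20"],
    "n":      [10, 25, 17, 12, 9, 21],
})

# One OLAP-style view: causes as rows, years as columns, aggregated over
# regions (a "roll-up" of the region dimension), with grand totals.
cube = deaths.pivot_table(index="cause", columns="year",
                          values="n", aggfunc="sum", margins=True)
print(cube)
```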

  11. The neutron porosity tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1988-01-01

    The report contains a review of available information on neutron porosity tools with the emphasis on dual thermal-neutron-detector porosity tools and epithermal-neutron-detector porosity tools. The general principle of such tools is discussed and theoretical models are very briefly reviewed. Available data on tool designs are summarized with special regard to the source-detector distance. Tool operational data, porosity determination and correction of measurements are briefly discussed. (author) 15 refs
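
    In a dual-detector tool, porosity is typically derived from the near-to-far count-rate ratio through a tool-specific calibration curve. The exponential form and the constants in this Python sketch are purely illustrative assumptions, not values taken from the report.

```python
import math

def porosity_from_ratio(near_counts, far_counts, a=1.0, b=0.15):
    """Dual-detector porosity estimate from the near/far count-rate ratio.
    Hypothetical calibration: ratio = a * exp(phi / b), so phi = b*ln(ratio/a).
    Real tools use empirically fitted curves plus environmental corrections."""
    ratio = near_counts / far_counts
    return b * math.log(ratio / a)

print(f"porosity ~ {porosity_from_ratio(5200.0, 1300.0):.2f}")  # toy numbers
```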

  12. Modelling the Common Agricultural Policy with the Modular Agricultural GeNeral Equilibrium Tool (MAGNET). Effects of the 2014-2020 CAP financial agreement on welfare, trade, factor and product markets

    OpenAIRE

    BOULANGER PIERRE; PHILIPPIDIS GEORGE

    2013-01-01

    This JRC report presents methodological development of the Modular Applied GeNeral Equilibrium Tool (MAGNET), a global computable general equilibrium (CGE) model, for representing the Common Agricultural Policy (CAP). Using original data on European Union (EU) domestic support, it examines some likely macroeconomic effects of the expected CAP budget over the period 2014-2020. Results suggest that agreed budget cuts, in constant price, have limited impacts on EU and world markets, given the br...

  13. Exploration, analysis and explanation of 'employee satisfaction' as an organization development and general improvement tool for the it sector of Pakistan

    International Nuclear Information System (INIS)

    Ahsan, A.; Kiani, H.S.; Khurshid, O.

    2011-01-01

    Pakistan's IT industry is currently one of the top performers among the industrial sectors within Pakistan. As per the findings of Ahsan (2008), despite the fact that Pakistan's IT industry is competitive with respect to other industries within Pakistan, its true potential is yet to be unfolded. Ahsan (2008) states that Pakistan's so-called competitive IT industry has to be brought in line with international performers (particularly the South Asian economies). A simple proof of this statement can be obtained from the fact that Pakistan's general economy is 1/5 of the Indian economy. The same proportion should hold for the IT sectors of the two economies, which, unfortunately, is not the case, because Pakistan's IT sector is currently 1/27 of the Indian IT sector. Ahsan (2008) believes that a partial reason for this unwanted difference may be the revenue models, business practices and political situations of the two countries. Beyond these reasons, Ahsan (2008) believes that several soft issues are also responsible for this industrial difference, 'motivation' being one such important factor. The role of motivation as an imperative soft issue for revitalizing the workforce can also be reproduced in the discussion concerning the role of 'basic employee satisfaction' as an organisation's productivity and quality enhancement tool. Employees, being an integral asset of organizations, affect the accomplishment of organizational objectives. The impact of employee satisfaction in the software industry of Pakistan is relatively little known but plays a significant role. This paper analyzes the major causes of employee satisfaction and the impact of employee satisfaction on quality and productivity dimensions, particularly in IT organizations in Pakistan. This research presents an analysis of 'Employee Satisfaction' for the IT sector of Pakistan. The study not only explores but also presents a detailed explanation and analysis of the subject area for the IT industry of Pakistan by

  14. Risk factors and trends in attempting or committing suicide in Dutch general practice in 1983–2009 and tools for early recognition.

    NARCIS (Netherlands)

    Donker, G.A.; Wolters, I.; Schellevis, F.

    2010-01-01

    Background: Many patients visit their general practitioner (GP) before attempting or committing suicide. This study analyses determinants and trends of suicidal behaviour to enhance early recognition of risk factors in general practice. Method: Analysis of trends, patient and treatment

  15. Improvements in Precise and Accurate Isotope Ratio Determination via LA-MC-ICP-MS by Application of an Alternative Data Reduction Protocol

    Science.gov (United States)

    Fietzke, J.; Liebetrau, V.; Guenther, D.; Frische, M.; Zumholz, K.; Hansteen, T. H.; Eisenhauer, A.

    2008-12-01

    An alternative approach for the evaluation of isotope ratio data using LA-MC-ICP-MS will be presented. In contrast to previously applied methods, it is based on the simultaneous responses of all analyte isotopes of interest and the relevant interferences, without performing a conventional background correction. Significant improvements in precision and accuracy can be achieved when applying this new method, and will be discussed based on the results of two first methodical applications: a) radiogenic and stable Sr isotopes in carbonates; b) stable chlorine isotopes of pyrohydrolytic extracts. In carbonates, an external reproducibility of the 87Sr/86Sr ratios of about 19 ppm (RSD) was achieved, an improvement of about a factor of 5. For recent and sub-recent marine carbonates a mean radiogenic strontium isotope ratio 87Sr/86Sr of 0.709170±0.000007 (2SE) was determined, which agrees well with the value of 0.7091741±0.0000024 (2SE) reported for modern sea water [1,2]. Stable chlorine isotope ratios were determined by ablating pyrohydrolytic extracts, with a reproducibility of about 0.05‰ (RSD). For the basaltic reference materials JB-1a and JB-2, chlorine isotope ratios were determined relative to SMOC (standard mean ocean chlorinity): δ37Cl(JB-1a) = (-0.99±0.06) ‰ and δ37Cl(JB-2) = (-0.60±0.03) ‰ (SD), respectively, in accordance with published data [3]. The described strategies for data reduction are considered to be generally applicable to all isotope ratio measurements using LA-MC-ICP-MS. [1] J.M. McArthur, D. Rio, F. Massari, D. Castradori, T.R. Bailey, M. Thirlwall, S. Houghton, Palaeogeo. Palaeoclim. Palaeoeco., 2006, 242 (126), doi: 10.1016/j.palaeo.2006.06.004 [2] J. Fietzke, V. Liebetrau, D. Guenther, K. Guers, K. Hametner, K. Zumholz, T.H. Hansteen and A. Eisenhauer, J. Anal. At. Spectrom., 2008, 23, 955-961, doi:10.1039/B717706B [3] J. Fietzke, M. Frische, T.H. Hansteen and A. Eisenhauer, J. Anal. At. Spectrom., 2008, 23, 769-772, doi:10.1039/B718597A
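
    One way to use the simultaneous responses of all isotopes without an explicit background correction is to take the ratio from the slope of analyte versus reference intensities over the whole time series, blank included: any constant background ends up in the intercept. The Python sketch below illustrates that general idea with toy numbers; it is not the authors' exact protocol.

```python
import numpy as np

def isotope_ratio_slope(i_analyte, i_reference):
    """Estimate an isotope ratio as the slope of analyte vs. reference beam
    intensities over an entire laser-ablation time series (blank included),
    so no separate background subtraction is needed."""
    slope, intercept = np.polyfit(i_reference, i_analyte, 1)
    return slope

# Toy example: true ratio 0.71, constant backgrounds on both channels.
rng = np.random.default_rng(7)
ref = np.concatenate([rng.normal(0.01, 0.001, 200),     # gas blank
                      rng.normal(5.0, 0.05, 800)])      # ablation signal
ana = 0.71 * ref + 0.002 + rng.normal(0.0, 0.003, 1000) # analyte channel
print(f"ratio ~ {isotope_ratio_slope(ana, ref):.4f}")
```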

  16. Two-Year Community: Tools for Success--A Study of the Resources and Study Habits of General Chemistry I Students at Two Community Colleges

    Science.gov (United States)

    Bruck, Laura B.; Bruck, Aaron D.

    2018-01-01

    Recruitment and retention in the sciences is both difficult and crucial, especially in the community college setting. In this study, the resources used by General Chemistry I students at two different public, predominantly two-year colleges in two states were studied via surveys for a semester. Data were analyzed with respect to student attitudes…

  17. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Brun, R.; Couet, O.; Vandoni, C.E.; Zanarini, P.

    1990-01-01

    Visualization of scientific data, although a fashionable term in the world of computer graphics, is not a new invention; it is hundreds of years old. With the advent of computer graphics, the visualization of scientific data has become a well understood and widely used technology, with hundreds of applications in the most diverse fields, ranging from media applications to truly scientific ones. In the present paper, we discuss the design concepts of systems for the visualization of scientific data, in particular in the specific field of High Energy Physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to High Energy Physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period of time, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, in which man-machine interaction and graphics play a key role (PAW - Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  18. Nutritional screening tools application in a general hospital: a comparative study

    Directory of Open Access Journals (Sweden)

    Janaína Damasceno Bezerra

    2012-05-01

    Full Text Available Introduction: There are many nutritional screening tools, and it becomes difficult to choose which one is best to use in clinical nutrition practice. Objective: To compare five nutritional screening tools (MST, NRS-2002, MUST, MNA and MNA-SF) in hospitalized adults and elderly. Materials and Methods: A cross-sectional study was performed, with the application of the nutritional screening tools to adult and elderly patients within the first 48 hours of hospitalization. The occurrence of nutritional risk was compared between adult and elderly patients. Statistical analyses were performed using descriptive data and a non-parametric test (Mann-Whitney). Results: We evaluated 77 patients, 51 (66.2%) adults and 26 (33.8%) elderly, aged 53.6 years (standard deviation of 17.9), with female predominance (53.2%). The main reasons for hospitalization were neoplasia and nephrolithotripsy. Overall, one quarter of patients were at nutritional risk. Nutritional risk in adults was detected with similar frequency by MUST and MST; however, it was underestimated by NRS-2002. The MNA and MNA-SF, exclusively for the elderly, also had similar results in detecting nutritional risk. In relation to application time, the MNA was the instrument with the longest application time. Conclusion: Considering the higher detection of patients at nutritional risk, the ease of use and the shorter application time, we suggest, respectively, MUST and MNA-SF for use in adult and elderly patients admitted to this hospital.

  19. Human bones obtained from routine joint replacement surgery as a tool for studies of plutonium, americium and 90Sr body-burden in the general public

    Energy Technology Data Exchange (ETDEWEB)

    Mietelski, Jerzy W., E-mail: jerzy.mietelski@ifj.edu.pl [Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences, Radzikowskiego 152, 31-342 Cracow (Poland); Golec, Edward B. [Traumatology and Orthopaedic Clinic, 5th Military Clinical Hospital and Polyclinic, Independent Public Healthcare Facility, Wroclawska 1-3, 30-901 Cracow (Poland); Orthopaedic Rehabilitation Department, Chair of Clinical Rehabilitation, Faculty of Motor Rehabilitation of the Bronislaw Czech's Academy of Physical Education, Cracow (Poland); Department of Physical Therapy Basics, Faculty of Physical Therapy, Administration College, Bielsko-Biala (Poland); Tomankiewicz, Ewa [Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences, Radzikowskiego 152, 31-342 Cracow (Poland); Golec, Joanna [Orthopaedic Rehabilitation Department, Chair of Clinical Rehabilitation, Faculty of Motor Rehabilitation of the Bronislaw Czech's Academy of Physical Education, Cracow (Poland); Physical Therapy Department, Institute of Physical Therapy, Faculty of Health Science, Jagiellonian University, Medical College, Cracow (Poland); Nowak, Sebastian [Traumatology and Orthopaedic Clinic, 5th Military Clinical Hospital and Polyclinic, Independent Public Healthcare Facility, Wroclawska 1-3, 30-901 Cracow (Poland); Orthopaedic Rehabilitation Department, Chair of Clinical Rehabilitation, Faculty of Motor Rehabilitation of the Bronislaw Czech's Academy of Physical Education, Cracow (Poland); Szczygiel, Elzbieta [Physical Therapy Department, Institute of Physical Therapy, Faculty of Health Science, Jagiellonian University, Medical College, Cracow (Poland); Brudecki, Kamil [Henryk Niewodniczanski Institute of Nuclear Physics, Polish Academy of Sciences, Radzikowskiego 152, 31-342 Cracow (Poland)

    2011-06-15

    The paper presents a new sampling method for studying in-body radioactive contamination by bone-seeking radionuclides, such as 90Sr, 239+240Pu, 238Pu, 241Am and selected gamma-emitters, in human bones. The presented results were obtained for samples retrieved from routine surgeries, namely knee or hip joint replacements with implants, performed on individuals from Southern Poland. This allowed the collection of representative sets of general-public samples. The applied analytical radiochemical procedure for the bone matrix is described in detail. Due to the low concentrations of 238Pu, the ratio of Pu isotopes which might be used for Pu source identification was obtained only as upper limits, which cannot confirm any origin of Pu other than global fallout (for example, Chernobyl). Calculated concentrations of radioisotopes are comparable to the existing data from post-mortem studies on human bones retrieved from autopsies or exhumations. Human bones removed during knee or hip joint surgery provide a simple and ethical way of obtaining samples for plutonium, americium and 90Sr in-body contamination studies in the general public. - Highlights: > Surgery for joint replacement as a novel sampling method for studying in-body radioactive contamination. > The proposed way of sampling does not raise ethical doubts. > It is a convenient way of collecting human bone samples from the global population. > The applied analytical radiochemical procedure for the bone matrix is described in detail. > Opposite patient-age correlation trends were found for 90Sr (negative) and Pu, Am (positive).

  20. Supervised learning of tools for content-based search of image databases

    Science.gov (United States)

    Delanoy, Richard L.

    1996-03-01

    A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.

  1. The potential of Virtual Reality as anxiety management tool: a randomized controlled study in a sample of patients affected by Generalized Anxiety Disorder

    Directory of Open Access Journals (Sweden)

    Gorini Alessandra

    2008-05-01

    Full Text Available Abstract Background Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by a constant and unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point out the necessity to find new efficient strategies to treat it. Together with the cognitive-behavioural treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation that it is hard to learn. To overcome this limitation we propose the use of virtual reality (VR) to facilitate the relaxation process by visually presenting key relaxing images to the subjects. The visual presentation of a virtual calm scenario can facilitate patients' practice and mastery of relaxation, making the experience more vivid and real than the one that most subjects can create using their own imagination and memory, and triggering a broad empowerment process within the experience induced by a high sense of presence. According to these premises, the aim of the present study is to investigate the advantages of using a VR-based relaxation protocol in reducing anxiety in patients affected by GAD. Methods/Design The trial is based on a randomized controlled study, including three groups of 25 patients each (for a total of 75 patients): (1) the VR group, (2) the non-VR group and (3) the waiting list (WL) group. Patients in the VR group will be taught to relax using a VR relaxing environment and audio-visual mobile narratives; patients in the non-VR group will be taught to relax using the same relaxing narratives proposed to the VR group, but without the VR support; and patients in the WL group will not receive any kind of relaxation training. Psychometric and psychophysiological outcomes will serve as quantitative dependent variables, while subjective reports of participants will be used as qualitative dependent variables. Conclusion We argue that the use of VR for relaxation

  2. From Forecasters to the General Public: A Communication Tool to Understand Decision-making Challenges in Weather-related Early Warning Systems

    Science.gov (United States)

    Terti, G.; Ruin, I.; Kalas, M.; Lorini, V.; Sabbatini, T.; i Alonso, A. C.

    2017-12-01

    New technologies are currently adopted worldwide to improve weather forecasts and the communication of the corresponding warnings to end-users. The "EnhANcing emergency management and response to extreme WeatHER and climate Events" (ANYWHERE) project is an innovation action that aims at developing and implementing a European decision-support platform for weather-related risks integrating cutting-edge forecasting technology. The initiative is built in a collaborative manner where researchers, developers, potential users and other stakeholders meet frequently to define needs, capabilities and challenges. In this study, we propose a role-playing game to test the added value of the ANYWHERE platform for i) the decision-making process and the choice of warning levels under uncertainty, ii) the management of the official emergency response and iii) the crisis communication and triggering of protective actions at different levels of the warning system (from hazard detection to citizen response). The designed game serves as an interactive communication tool. Here, flood- and flash-flood-focused simulations seek to enhance participants' understanding of the complexities and challenges embedded in the various levels of the decision-making process under the threat of weather disasters (e.g., forecasting/warnings, official emergency actions, self-protection). We also facilitate collaboration and coordination between the participants, who belong to different national or local agencies/authorities across Europe. The game is first applied and tested at ANYWHERE's workshop in Helsinki (September 2017), where about 30-50 people, including researchers, forecasters, civil protection staff and representatives of related companies, are anticipated to play the simulation. The main idea is to provide the players with a virtual case study that well represents realistic uncertainties and dilemmas embedded in real-time forecasting-warning processes. At the final debriefing step the participants are

  3. Pneumocafé project: an inquiry on current COPD diagnosis and management among General Practitioners in Italy through a novel tool for professional education.

    Science.gov (United States)

    Sanguinetti, Claudio M; De Benedetto, Fernando; Donner, Claudio F; Nardini, Stefano; Visconti, Alberto

    2014-01-01

    Symptoms of COPD are frequently disregarded by patients and also by general practitioners (GPs) in the early stages of the disease, which consequently is diagnosed when already at an advanced grade of severity. Underdiagnosis and undertreatment of COPD and scarce use of spirometry are widely recurrent, while better knowledge of the disease and wider use of spirometry would be critical to diagnose more patients still neglected, to do so at an earlier stage, and to properly treat established COPD. The aim of the Pneumocafé project is to improve, through an innovative approach, the diagnosis and management of COPD at the primary care level, increasing the awareness of issues pertaining to early diagnosis, adequate prevention and correct treatment of the disease. Pneumocafé is based on informal meetings between GPs of various geographical zones of Italy and their reference respiratory specialist (RS), aimed at discussing current practice in comparison with the suggestions of official guidelines, analyzing the actual problems in diagnosing and managing COPD patients, and sharing possible solutions at the community level. In these meetings RSs faced many issues including patho-physiological mechanisms of bronchial obstruction, the significance of clinical symptoms, patients' phenotyping, and the clinical approach to diagnosis and long-term treatment, also reinforcing the importance of a timely diagnosis, proper long-term treatment and compliance with treatment. At the end of each meeting GPs had to fill in a questionnaire arranged by the scientific board of the project that included 18 multiple-choice questions concerning their approach to COPD management. The results of the analysis of these questionnaires are presented here. 1,964 questionnaires were returned from 49 RSs. 1,864 of the questionnaires received (94.91% of the total) were properly compiled and form the object of the present analysis. The 49 RSs, 37 males and 12 females, were distributed all over the Italian country

  4. Improved Data Reduction Algorithm for the Needle Probe Method Applied to In-Situ Thermal Conductivity Measurements of Lunar and Planetary Regoliths

    Science.gov (United States)

    Nagihara, S.; Hedlund, M.; Zacny, K.; Taylor, P. T.

    2013-01-01

    The needle probe method (also known as the 'hot wire' or 'line heat source' method) is widely used for in-situ thermal conductivity measurements on soils and marine sediments on the earth. Variants of this method have also been used (or planned) for measuring regolith on the surfaces of extra-terrestrial bodies (e.g., the Moon, Mars, and comets). In the near-vacuum condition on the lunar and planetary surfaces, the measurement method used on the earth cannot simply be duplicated, because the thermal conductivity of the regolith can be approximately 2 orders of magnitude lower. In addition, the planetary probes have much greater diameters, due to engineering requirements associated with robotic deployment on extra-terrestrial bodies. All of these factors mean that the planetary probes require a much longer measurement time, several tens of (if not over a hundred) hours, while a conventional terrestrial needle probe needs only 1 to 2 minutes. The long measurement time complicates the surface operation logistics of the lander. It also negatively affects the accuracy of the thermal conductivity measurement, because the cumulative heat loss along the probe is no longer negligible. The present study improves the data reduction algorithm of the needle probe method, shortening the measurement time on planetary surfaces by an order of magnitude. The main difference between the new scheme and the conventional one is that the former uses the exact mathematical solution to the thermal model on which the needle probe measurement theory is based, while the latter uses an approximate solution that is valid only for large times. The present study demonstrates the benefit of the new data reduction technique by applying it to data from a series of needle probe experiments carried out in a vacuum chamber on JSC-1A lunar regolith simulant. The use of the exact solution has some disadvantage, however, in requiring three additional parameters, but two of them (the diameter and the
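
    The two reduction schemes contrasted above can be sketched from standard line-heat-source theory (the record does not include the authors' code, so the function names and the use of scipy here are assumptions):

```python
import numpy as np
from scipy.special import expi

def k_large_time(t, dT, q):
    # Conventional reduction: for large t the line-source model gives
    # dT(t) ~ (q / (4 pi k)) * ln(t) + C, so the slope of dT versus
    # ln(t) yields k.  t: time (s), dT: temperature rise (K),
    # q: heater power per unit length (W/m).
    slope, _ = np.polyfit(np.log(t), dT, 1)
    return q / (4.0 * np.pi * slope)

def dT_exact(t, r, k, alpha, q):
    # Exact line-source solution (exponential integral Ei), valid at
    # all times but requiring extra parameters: radial distance r (m)
    # and thermal diffusivity alpha (m^2/s) -- the trade-off the
    # abstract mentions.
    return -(q / (4.0 * np.pi * k)) * expi(-r**2 / (4.0 * alpha * t))
```

    Fitting dT_exact to the whole record, instead of waiting for the logarithmic asymptote behind k_large_time, is what allows the measurement to be cut short.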

  5. Generalized Free-Surface Effect and Random Vibration Theory: a new tool for computing moment magnitudes of small earthquakes using borehole data

    Science.gov (United States)

    Malagnini, Luca; Dreger, Douglas S.

    2016-07-01

    Although optimal, computing the moment tensor solution is not always a viable option for the calculation of the size of an earthquake, especially for small events (say, below Mw 2.0). Here we show an alternative approach to the calculation of the moment-rate spectra of small earthquakes, and thus of their scalar moments, that uses a network-based calibration of crustal wave propagation. The method works best when applied to a relatively small crustal volume containing both the seismic sources and the recording sites. In this study we present the calibration of the crustal volume monitored by the High-Resolution Seismic Network (HRSN), along the San Andreas Fault (SAF) at Parkfield. After the quantification of the attenuation parameters within the crustal volume under investigation, we proceed to the spectral correction of the observed Fourier amplitude spectra for the 100 largest events in our data set. Multiple estimates of seismic moment for all events (1811 events in total) are obtained by calculating the ratio of rms-averaged spectral quantities based on the peak values of the ground velocity in the time domain, as observed in narrowband-filtered time-series. The mathematical operations allowing the described spectral ratios are obtained from Random Vibration Theory (RVT). Due to the optimal conditions of the HRSN, in terms of signal-to-noise ratios, our network-based calibration allows the accurate calculation of seismic moments down to Mw < 0. However, because the HRSN is equipped only with borehole instruments, we define a frequency-dependent Generalized Free-Surface Effect (GFSE), to be used instead of the usual free-surface constant F = 2. Our spectral corrections at Parkfield need a different GFSE for each side of the SAF, which can be quantified by means of the analysis of synthetic seismograms. The importance of the GFSE of borehole instruments increases with decreasing earthquake size, because for smaller earthquakes the bandwidth available
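
    The RVT step named in the abstract can be illustrated with textbook formulas (Parseval's theorem plus a Cartwright-Longuet-Higgins-style peak factor); this generic sketch is not the authors' calibration code, and all names are assumptions:

```python
import numpy as np

def expected_peak(freq, amp, duration):
    # Random Vibration Theory: the rms of a record follows from its
    # Fourier amplitude spectrum via Parseval's theorem, and the
    # expected peak-to-rms ratio (the peak factor) from the expected
    # number of zero crossings in the record.
    w = 2.0 * np.pi * freq
    m0 = 2.0 * np.trapz(amp**2, freq)            # spectral moment 0
    m2 = 2.0 * np.trapz(w**2 * amp**2, freq)     # spectral moment 2
    rms = np.sqrt(m0 / duration)
    n = (duration / np.pi) * np.sqrt(m2 / m0)    # zero crossings
    ln2n = 2.0 * np.log(n)
    peak_factor = np.sqrt(ln2n) + 0.5772 / np.sqrt(ln2n)
    return peak_factor * rms
```

    Comparing such predicted peaks with the peaks observed in narrowband-filtered seismograms is what ties the spectral model to the time-domain measurements.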

  6. Benchmark calculations of power distribution within fuel assemblies. Phase 2: comparison of data reduction and power reconstruction methods in production codes

    International Nuclear Information System (INIS)

    2000-01-01

    Systems loaded with plutonium in the form of mixed-oxide (MOX) fuel show somewhat different neutronic characteristics compared with those using conventional uranium fuels. In order to maintain adequate safety standards, it is essential to accurately predict the characteristics of MOX-fuelled systems and to further validate both the nuclear data and the computation methods used. A computation benchmark on power distribution within fuel assemblies, comparing the different techniques used in production codes for fine flux prediction in systems partially loaded with MOX fuel, was carried out at an international level. It first addressed the numerical schemes for pin power reconstruction, then investigated the global performance including cross-section data reduction methods. This report provides the detailed results of this second phase of the benchmark. The analysis of the results revealed that basic data still need to be improved, primarily for higher plutonium isotopes and minor actinides. (author)

  7. Tools and their uses

    CERN Document Server

    1973-01-01

    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  8. Comparing the ISO-recommended and the cumulative data-reduction algorithms in S-on-1 laser damage test by a reverse approach method

    Science.gov (United States)

    Zorila, Alexandru; Stratan, Aurel; Nemes, George

    2018-01-01

    We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.
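
    For orientation, the standard-style reduction being compared works roughly as sketched below (a minimal illustration under assumed binning choices; the ISO procedure and the cumulative variants differ in how sites are pooled, which is exactly what the paper evaluates):

```python
import numpy as np

def damage_threshold(fluences, damaged, nbins=10):
    # Bin the tested sites by fluence, estimate a damage probability
    # per bin, and extrapolate the rising flank of the probability
    # curve linearly down to P = 0: the crossing is the threshold.
    fluences = np.asarray(fluences, dtype=float)
    damaged = np.asarray(damaged, dtype=bool)
    edges = np.linspace(fluences.min(), fluences.max(), nbins + 1)
    centers, probs = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (fluences >= lo) & (fluences < hi)
        if sel.any():
            centers.append(0.5 * (lo + hi))
            probs.append(damaged[sel].mean())
    centers, probs = np.array(centers), np.array(probs)
    flank = (probs > 0.0) & (probs < 1.0)    # the rising part
    slope, intercept = np.polyfit(centers[flank], probs[flank], 1)
    return -intercept / slope                # fluence where P hits 0
```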

  9. Tool Storage Problem Solved!

    Science.gov (United States)

    Klenke, Andrew M.; Dell, Tim W.

    2007-01-01

    Graduates of the automotive technology program at Pittsburg State University (PSU) generally enter the workforce in some type of automotive management role. As a result, the program does not require students to purchase their own tools, and it does not have room for all 280 majors to roll around a personal tool chest. Each instructor must maintain…

  10. Case and Administrative Support Tools

    Data.gov (United States)

    U.S. Environmental Protection Agency — Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool used to help...

  11. Generalized Superconductivity. Generalized Levitation

    International Nuclear Information System (INIS)

    Ciobanu, B.; Agop, M.

    2004-01-01

    In recent papers, gravitational superconductivity has been described. We introduce the concept of generalized superconductivity, observing that any nongeodesic motion and, in particular, motion in an electromagnetic field, can be transformed into a geodesic motion by a suitable choice of the connection. In the present paper, the gravitoelectromagnetic London equations have been obtained from the generalized Helmholtz vortex theorem using the generalized local equivalence principle. In this context, the gravitoelectromagnetic Meissner effect and, implicitly, gravitoelectromagnetic levitation are given. (authors)

  12. General Relativity

    CERN Document Server

    Straumann, Norbert

    2013-01-01

    This book provides a completely revised and expanded version of the previous classic edition ‘General Relativity and Relativistic Astrophysics’. In Part I the foundations of general relativity are thoroughly developed, while Part II is devoted to tests of general relativity and many of its applications. Binary pulsars – our best laboratories for general relativity – are studied in considerable detail. An introduction to gravitational lensing theory is included as well, so as to make the current literature on the subject accessible to readers. Considerable attention is devoted to the study of compact objects, especially to black holes. This includes a detailed derivation of the Kerr solution, Israel’s proof of his uniqueness theorem, and a derivation of the basic laws of black hole physics. Part II ends with Witten’s proof of the positive energy theorem, which is presented in detail, together with the required tools on spin structures and spinor analysis. In Part III, all of the differential geomet...

  13. General general game AI

    OpenAIRE

    Togelius, Julian; Yannakakis, Georgios N.; 2016 IEEE Conference on Computational Intelligence and Games (CIG)

    2016-01-01

    Arguably the grand goal of artificial intelligence research is to produce machines with general intelligence: the capacity to solve multiple problems, not just one. Artificial intelligence (AI) has investigated the general intelligence capacity of machines within the domain of games more than any other domain given the ideal properties of games for that purpose: controlled yet interesting and computationally hard problems. This line of research, however, has so far focuse...

  14. Exploring students' perceptions on the use of significant event analysis, as part of a portfolio assessment process in general practice, as a tool for learning how to use reflection in learning.

    Science.gov (United States)

    Grant, Andrew J; Vermunt, Jan D; Kinnersley, Paul; Houston, Helen

    2007-03-30

    Portfolio learning enables students to collect evidence of their learning. Component tasks making up a portfolio can be devised that relate directly to intended learning outcomes. Reflective tasks can stimulate students to recognise their own learning needs. Assessment of portfolios using a rating scale relating to intended learning outcomes offers high content validity. This study evaluated a reflective portfolio used during a final-year attachment in general practice (family medicine). Students were asked to evaluate the portfolio (which used significant event analysis as a basis for reflection) as a learning tool. The validity and reliability of the portfolio as an assessment tool were also measured. 81 final-year medical students completed reflective significant event analyses as part of a portfolio created during a three-week attachment (clerkship) in general practice (family medicine). As well as two reflective significant event analyses, each portfolio contained an audit and a health needs assessment. Portfolios were marked three times: by the student's GP teacher, by the course organiser and by another teacher in the university department of general practice. Inter-rater reliability between pairs of markers was calculated. A questionnaire enabled the students' experience of portfolio learning to be determined. The observed benefits to learning from reflection were limited. Students said that they thought more about the patients they wrote up in significant event analyses, but information as to the nature and effect of this was not forthcoming. Moderate inter-rater reliability (Spearman's Rho .65) was found between pairs of departmental raters dealing with larger numbers (20-60) of portfolios. Inter-rater reliability of marking involving GP tutors who only marked 1-3 portfolios was very low. Students rated highly their mentoring relationship with their GP teacher but found the portfolio tasks time-consuming. The inter-rater reliability observed in this study should

  15. Science in General Education

    Science.gov (United States)

    Read, Andrew F.

    2013-01-01

    General education must develop in students an appreciation of the power of science, how it works, why it is an effective knowledge generation tool, and what it can deliver. Knowing what science has discovered is desirable but less important.

  16. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools are available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  17. Tools of radio astronomy

    CERN Document Server

    Wilson, Thomas L; Hüttemeister, Susanne

    2013-01-01

    This 6th edition of “Tools of Radio Astronomy”, the most used introductory text in radio astronomy, has been revised to reflect the current state of this important branch of astronomy. This includes the use of satellites, low radio frequencies, the millimeter/sub-mm universe, the Cosmic Microwave Background and the increased importance of mm/sub-mm dust emission. Several derivations and presentations of technical aspects of radio astronomy and receivers, such as receiver noise, the Hertz dipole and  beam forming have been updated, expanded, re-worked or complemented by alternative derivations. These reflect advances in technology. The wider bandwidths of the Jansky-VLA and long wave arrays such as LOFAR and mm/sub-mm arrays such as ALMA required an expansion of the discussion of interferometers and aperture synthesis. Developments in data reduction algorithms have been included. As a result of the large amount of data collected in the past 20 years, the discussion of solar system radio astronomy, dust em...

  18. REPLICATION TOOL AND METHOD OF PROVIDING A REPLICATION TOOL

    DEFF Research Database (Denmark)

    2016-01-01

    The invention relates to a replication tool (1, 1a, 1b) for producing a part (4) with a microscale textured replica surface (5a, 5b, 5c, 5d). The replication tool (1, 1a, 1b) comprises a tool surface (2a, 2b) defining a general shape of the item. The tool surface (2a, 2b) comprises a microscale...... energy directors on flange portions thereof uses the replication tool (1, 1a, 1b) to form an item (4) with a general shape as defined by the tool surface (2a, 2b). The formed item (4) comprises a microscale textured replica surface (5a, 5b, 5c, 5d) with a lateral arrangement of polydisperse microscale...

  19. Authoring Tools

    Science.gov (United States)

    Treviranus, Jutta

    Authoring tools that are accessible and that enable authors to produce accessible Web content play a critical role in web accessibility. Widespread use of authoring tools that comply with the W3C Authoring Tool Accessibility Guidelines (ATAG) would ensure that even authors who are neither knowledgeable about nor particularly motivated to produce accessible content do so by default. The principles and techniques of ATAG are discussed. Some examples of accessible authoring tools are described, including authoring tool content management components such as TinyMCE. Considerations for creating an accessible collaborative environment are also covered. As part of providing accessible content, the debate between system-based personal optimization and one universally accessible site configuration is presented. The issues and potential solutions to address the accessibility crisis presented by the advent of rich internet applications are outlined. This challenge must be met to ensure that a large segment of the population is able to participate in the move toward the web as a two-way communication mechanism.

  20. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness, the primary carbides provide...... resistance against abrasive wear and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore some steel types contain cobalt. Addition of alloying elements...... serves primarily two purposes: (i) to improve the hardenability and (ii) to provide harder and thermally more stable carbides than cementite. Assuming proper heat treatment, the properties of a tool steel depend on which alloying elements are added and their respective concentrations....

  1. Management Tools

    Science.gov (United States)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle program that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  2. General aviation air traffic pattern safety analysis

    Science.gov (United States)

    Parker, L. C.

    1973-01-01

    A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage retrieval and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that pattern procedures observed can affect the ability of pilots to see and avoid each other.

  3. Authoring Issues beyond Tools

    Science.gov (United States)

    Spierling, Ulrike; Szilas, Nicolas

    Authoring is still considered a bottleneck in successful Interactive Storytelling and Drama. The claim for intuitive authoring tools is high, especially for tools that allow storytellers and artists to define dynamic content that can be run with an AI-based story engine. We explored two concrete authoring processes in depth, using various Interactive Storytelling prototypes, and have provided feedback from the practical steps. The result is a presentation of general issues in authoring Interactive Storytelling, rather than of particular problems with a specific system that could be overcome by 'simply' designing the right interface. Priorities for future developments are also outlined.

  4. Design tools

    Science.gov (United States)

    Anton TenWolde; Mark T. Bomberg

    2009-01-01

    Overall, despite the lack of exact input data, the use of design tools, including models, is much superior to simply following rules of thumb, and a moisture analysis should be standard procedure for any building envelope design. Exceptions can only be made for buildings in the same climate, with similar occupancy, and similar envelope construction. This chapter...

  5. Generalized functions

    CERN Document Server

    Gelfand, I M; Graev, M I; Vilenkin, N Y; Pyatetskii-Shapiro, I I

    Volume 1 is devoted to basics of the theory of generalized functions. The first chapter contains main definitions and most important properties of generalized functions as functionals on the space of smooth functions with compact support. The second chapter talks about the Fourier transform of generalized functions. In Chapter 3, definitions and properties of some important classes of generalized functions are discussed; in particular, generalized functions supported on submanifolds of lower dimension, generalized functions associated with quadratic forms, and homogeneous generalized functions are studied in detail. Many simple basic examples make this book an excellent place for a novice to get acquainted with the theory of generalized functions. A long appendix presents basics of generalized functions of complex variables.

  6. Meeting Generalized Others

    DEFF Research Database (Denmark)

    Strøbæk, Pernille Solveig; Willert, Søren

    2014-01-01

    Following George Herbert Mead, we contend that work-related organizational behavior requires continued negotiation of meaning – using linguistic, behavioral, and social tools. The meaning structures of the Generalized Other(s) of a particular employing organization provide the regulatory framework...

  7. General Editorial

    Indian Academy of Sciences (India)

    General Editorial on Publication Ethics, by R Ramaswamy. Resonance – Journal of Science Education, Volume 19, Issue 1, January 2014, pp. 1-2.

  8. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

    RSP Tooling{trademark} is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes the work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  9. Generalized product

    OpenAIRE

    Greco, Salvatore; Mesiar, Radko; Rindone, Fabio

    2014-01-01

    Aggregation functions on [0,1] with annihilator 0 can be seen as a generalized product on [0,1]. We study the generalized product on the bipolar scale [–1,1], stressing the axiomatic point of view. Based on newly introduced bipolar properties, such as the bipolar increasingness, bipolar unit element, bipolar idempotent element, several kinds of generalized bipolar product are introduced and studied. A special stress is put on bipolar semicopulas, bipolar quasi-copulas and bipolar copulas.

  10. Tool for Military Logistics Planning of Peace Support Operations: The OTAS Planning Tool

    NARCIS (Netherlands)

    Merrienboer, S.A. van

    1998-01-01

    Within the research group Operations Research Studies Army of the TNO Physics and Electronics Laboratory the OTAS planning tool is developed for the Royal Netherlands Armed Forces. This paper gives a general and brief description of the tool.

  11. GENERAL SURGERY

    African Journals Online (AJOL)

    pain or discomfort. While most BLTs can ... of this study was to assess the spectrum of hepatic resections for BLTs in an ... Demographic data, operative management and morbidity and mortality using the ..... Royal Infirmary in Edinburgh.21. As in other .... tools and techniques for parenchymal liver transection. S Afr J. Surg.

  12. Generalized G-theory

    International Nuclear Information System (INIS)

    Sladkowski, J.

    1991-01-01

    Various attempts to formulate the fundamental physical interactions in the framework of unified geometric theories have recently achieved considerable success (Kaluza, 1921; Klein, 1926; Trautmann, 1970; Cho, 1975). Symmetries of the spacetime and of so-called internal spaces seem to play a key role in investigating both the fundamental interactions and the abundance of elementary particles. The author presents a category-theoretic description of a generalization of the G-theory concept and its application to geometric compactification and dimensional reduction. The main reasons for using categories and functors as tools are the clarity and the level of generality one can obtain

  13. General relativity

    International Nuclear Information System (INIS)

    Kenyon, I.R.

    1990-01-01

    General relativity is discussed in this book at a level appropriate to undergraduate students of physics and astronomy. It describes concepts and experimental results, and provides a succinct account of the formalism. A brief review of special relativity is followed by a discussion of the equivalence principle and its implications. Other topics covered include the concepts of curvature and the Schwarzschild metric, tests of the general theory, black holes and their properties, gravitational radiation and methods for its detection, the impact of general relativity on cosmology, and the continuing search for a quantum theory of gravity. (author)

  14. Case and Administrative Support Tools

    Science.gov (United States)

    Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool used to help reduce office administrative labor costs while increasing employee effectiveness. CAST supports business functions which rely on and store Privacy Act sensitive data (PII). Specific business processes included in CAST (and respective PII) are: -Civil Rights Cast Tracking (name, partial medical history, summary of case, and case correspondence). -Employment Law Case Tracking (name, summary of case). -Federal Tort Claims Act Incident Tracking (name, summary of incidents). -Ethics Program Support Tools and Tracking (name, partial financial history). -Summer Honors Application Tracking (name, home address, telephone number, employment history). -Workforce Flexibility Initiative Support Tools (name, alternative workplace phone number). -Resource and Personnel Management Support Tools (name, partial employment and financial history).

  15. General LTE Sequence

    OpenAIRE

    Billal, Masum

    2015-01-01

    In this paper, we characterize sequences which maintain the same property described in the Lifting the Exponent Lemma. The Lifting the Exponent Lemma is a very powerful tool in olympiad number theory and has recently become very popular. We generalize it to all sequences that maintain a property like it, i.e. if p^{\alpha} || a_k and p^{\beta} || n, then p^{\alpha+\beta} || a_{nk}.
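
    For reference, the classical lemma being generalized states, for an odd prime p with p | x - y and p not dividing xy (a standard statement, not quoted from the paper):

```latex
\[
  v_p\!\left(x^{n}-y^{n}\right) \;=\; v_p(x-y) + v_p(n),
\]
% and the sequence property the paper characterizes:
\[
  p^{\alpha}\parallel a_{k} \quad\text{and}\quad p^{\beta}\parallel n
  \;\Longrightarrow\; p^{\alpha+\beta}\parallel a_{nk}.
\]
```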

  16. The histogramming tool hparse

    International Nuclear Information System (INIS)

    Nikulin, V.; Shabratova, G.

    2005-01-01

    A general-purpose package aimed at simplifying histogramming in data analysis is described. The proposed dedicated language for writing histogramming scripts provides an effective and flexible tool for defining complicated histogram sets. The script is more transparent and much easier to maintain than the corresponding C++ code. In TTree analysis it can be a good complement to the TTreeViewer class: the TTreeViewer is used for choosing the required histogram/cut set, while hparse enables one to generate code for systematic analysis
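
    The record does not show hparse's script syntax, but the repetitive booking-and-filling code such a script generator replaces looks like the following (a PyROOT illustration; the file, tree, branch names and the cut are hypothetical):

```python
import ROOT

# Book histograms and fill them from a TTree by hand -- the sort of
# boilerplate that a dedicated histogramming script would generate.
f = ROOT.TFile.Open("events.root")     # hypothetical input file
tree = f.Get("events")                 # hypothetical TTree name

h_pt = ROOT.TH1F("h_pt", "p_{T};p_{T} [GeV];entries", 100, 0.0, 10.0)
h_eta = ROOT.TH1F("h_eta", "#eta;#eta;entries", 60, -3.0, 3.0)

for event in tree:
    if event.pt > 0.5:                 # hypothetical cut
        h_pt.Fill(event.pt)
        h_eta.Fill(event.eta)
```

    Multiplying such blocks over dozens of histograms and cut combinations is what makes a dedicated script language easier to maintain.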

  17. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: while most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially......

  18. General problems

    International Nuclear Information System (INIS)

    2005-01-01

    This article presents general problems such as natural disasters, consequences of global climate change, public health, the danger of criminal actions, and the availability of information about environmental problems

  19. Generalized Recovery

    DEFF Research Database (Denmark)

    Jensen, Christian Skov; Lando, David; Pedersen, Lasse Heje

    We characterize when physical probabilities, marginal utilities, and the discount rate can be recovered from observed state prices for several future time periods. Our characterization makes no assumptions of the probability distribution, thus generalizing the time-homogeneous stationary model...

  20. General Conformity

    Science.gov (United States)

    The General Conformity requirements ensure that the actions taken by federal agencies in nonattainment and maintenance areas do not interfere with a state’s plans to meet national standards for air quality.

  1. Tornado detection data reduction and analysis

    Science.gov (United States)

    Davisson, L. D.

    1977-01-01

    Data processing and analysis was provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.

  2. UniPOPS: Unified data reduction suite

    Science.gov (United States)

    Maddalena, Ronald J.; Garwood, Robert W.; Salter, Christopher J.; Stobie, Elizabeth B.; Cram, Thomas R.; Morgan, Lorrie; Vance, Bob; Hudson, Jerome

    2015-03-01

    UniPOPS, a suite of programs and utilities developed at the National Radio Astronomy Observatory (NRAO), reduced data from the observatory's single-dish telescopes: the Tucson 12-m, the Green Bank 140-ft, and archived data from the Green Bank 300-ft. The primary reduction programs, 'line' (for spectral-line reduction) and 'condar' (for continuum reduction), used the People-Oriented Parsing Service (POPS) as the command line interpreter. UniPOPS unified previous analysis packages and provided new capabilities; development of UniPOPS continued within the NRAO until 2004 when the 12-m was turned over to the Arizona Radio Observatory (ARO). The submitted code is version 3.5 from 2004, the last supported by the NRAO.

  3. Combined Acquisition/Processing For Data Reduction

    Science.gov (United States)

    Kruger, Robert A.

    1982-01-01

    Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval and processing. The acquisition component requires the greatest data handling rates. By coupling together the acquisition with some online hardwired processing, data rates and capacities for short term storage can be reduced. Furthermore, long term storage requirements can be reduced further by appropriate processing and editing of image data contained in short term memory. The net result could be reduced performance requirements for mass storage, processing and communication systems. Reduced amounts of data also should speed later data analysis and diagnostic decision making.

  4. Generalized polygons

    CERN Document Server

    Van Maldeghem, Hendrik

    1998-01-01

    Generalized Polygons is the first book to cover, in a coherent manner, the theory of polygons from scratch. In particular, it fills elementary gaps in the literature and gives an up-to-date account of current research in this area, including most proofs, which are often unified and streamlined in comparison to the versions generally known. Generalized Polygons will be welcomed both by the student seeking an introduction to the subject as well as the researcher who will value the work as a reference. In particular, it will be of great value for specialists working in the field of generalized polygons (which are, incidentally, the rank 2 Tits-buildings) or in fields directly related to Tits-buildings, incidence geometry and finite geometry. The approach taken in the book is of geometric nature, but algebraic results are included and proven (in a geometric way!). A noteworthy feature is that the book unifies and generalizes notions, definitions and results that exist for quadrangles, hexagons, octagons - in the ...

  5. General conclusions

    International Nuclear Information System (INIS)

    Tubiana, M.

    1993-01-01

    In conclusion, a general consensus emerged on a number of points, which the author endeavours to summarize in this article: doctors are an excellent channel for passing on information to the public; doctors feel that they do not know enough about the subject, and training in radiobiology and radiation protection is a necessity for them; communication between doctors and the general public is poor in this field; and research should be encouraged in numerous areas such as the carcinogenic effect of low doses of radiation, pedagogy and risk perception

  6. Tool path in torus tool CNC machining

    Directory of Open Access Journals (Sweden)

    XU Ying

    2016-10-01

    Full Text Available This paper is about tool paths in torus tool CNC machining. The mathematical model of the torus tool is established. The tool path planning algorithm is determined through calculation of the cutter location, boundary discretization, calculation of adjacent tool paths and so on; according to the conversion formula, the cutter contact points are converted to cutter location points, and these points are then fitted to a tool path. Finally, the path planning algorithm is implemented in Matlab: the cutter location points for the torus tool are calculated and then fitted to a tool path. For comparison, another tool path for the same free-form surface data is simulated using UG software. Comparing the two tool paths shows that using the torus tool is more efficient.
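
    The paper's conversion formula is not reproduced in the record; the sketch below uses the standard toroidal-cutter geometry for three-axis machining (names and conventions are assumptions). It reduces to the ball-end formula when R = r and to the flat-end formula when r = 0:

```python
import numpy as np

def cc_to_cl(cc, n, R, r, a=np.array([0.0, 0.0, 1.0])):
    # Convert a cutter-contact (CC) point to the cutter-location (CL)
    # point (tool tip) for a torus tool.
    #   cc: contact point on the surface
    #   n : unit outward surface normal at cc (not parallel to a)
    #   R : tool radius (axis to outer cutting edge)
    #   r : corner (torus tube) radius
    #   a : unit tool-axis direction
    # The torus tube centre sits at cc + r*n; step back to the axis by
    # (R - r) along the horizontal part of n, then down by r along a.
    n_h = n - np.dot(n, a) * a          # horizontal component of n
    u = n_h / np.linalg.norm(n_h)       # radial unit vector
    return cc + r * n - (R - r) * u - r * a
```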

  7. GENERAL ARTICLE

    Indian Academy of Sciences (India)

    Supersymmetry, by Akshay Kulkarni and P Ramadevi. General Article, Resonance – Journal of Science Education, Volume 8, Issue 2, February 2003, pp. 28-41. Author affiliation: Physics Department, Indian Institute of Technology, Mumbai 400 076, India.

  8. General indicators

    International Nuclear Information System (INIS)

    2003-01-01

    This document summarizes the main 2002 energy indicators for France. A first table lists the evolution of general indicators between 1973 and 2002: energy bill, price of imported crude oil, energy independence, primary and final energy consumption. The main 2002 results are detailed separately for natural gas, petroleum and coal (consumption, imports, exports, production, stocks, prices). (J.S.)

  9. Generalized Recovery

    DEFF Research Database (Denmark)

    Jensen, Christian Skov; Lando, David; Pedersen, Lasse Heje

    We characterize when physical probabilities, marginal utilities, and the discount rate can be recovered from observed state prices for several future time periods. We make no assumptions of the probability distribution, thus generalizing the time-homogeneous stationary model of Ross (2015). Recov...

  10. GENERAL SURGERY

    African Journals Online (AJOL)

    Department of Surgery, University of Cape Town Health Sciences Faculty, Groote Schuur Hospital, Observatory, Cape Town,. South Africa ... included all district, regional and tertiary hospitals in the nine provinces. Clinics and so-called ..... large contingency of senior general surgeons from countries such as Cuba, who have ...

  11. GENERAL SURGERY

    African Journals Online (AJOL)

    effect of fatigue on patient safety, and owing to increasing emphasis on lifestyle issues .... increasing emphasis on an appropriate work-life balance in professional life.10 ... experience, were the most negative about the EWTD in general.3,13 ...

  12. GENERAL SURGERY

    African Journals Online (AJOL)

    in the endoscopy room. GENERAL SURGERY. T du Toit, O C Buchel, S J A Smit. Department of Surgery, University of the Free State, Bloemfontein, ... The lack of video instrumentation in developing countries: Redundant fibre-optic instruments (the old "eye scope") are still being used. This instrument brings endoscopists ...

  13. General Assembly

    CERN Multimedia

    Staff Association

    2016-01-01

    5th April, 2016 – Ordinary General Assembly of the Staff Association! In the first semester of each year, the Staff Association (SA) invites its members to attend and participate in the Ordinary General Assembly (OGA). This year the OGA will be held on Tuesday, April 5th 2016 from 11:00 to 12:00 in BE Auditorium, Meyrin (6-2-024). During the Ordinary General Assembly, the activity and financial reports of the SA are presented and submitted for approval to the members. This is the occasion to get a global view on the activities of the SA, its financial management, and an opportunity to express one’s opinion, including taking part in the votes. Other points are listed on the agenda, as proposed by the Staff Council. Who can vote? Only “ordinary” members (MPE) of the SA can vote. Associated members (MPA) of the SA and/or affiliated pensioners have a right to vote on those topics that are of direct interest to them. Who can give his/her opinion? The Ordinary General Asse...

  14. GENERAL SURGERY

    African Journals Online (AJOL)

    could cripple the global economy. Greater attention ... Africa and 5.7 general surgeons per 100 000 in the US.12 One of the key ... 100 000 insured population working in the private sector, which is comparable with the United States (US).

  15. Necklaces: Generalizations

    Indian Academy of Sciences (India)

    IAS Admin

    A q-ary necklace of length n is an equivalence class of q-coloured strings of length n under rotation. In this article, we study various generalizations and derive analytical expressions to count the number of these generalized necklaces.

  16. Generalized Recovery

    DEFF Research Database (Denmark)

    Lando, David; Pedersen, Lasse Heje; Jensen, Christian Skov

    We characterize when physical probabilities, marginal utilities, and the discount rate can be recovered from observed state prices for several future time periods. We make no assumptions of the probability distribution, thus generalizing the time-homogeneous stationary model of Ross (2015...... our model empirically, testing the predictive power of the recovered expected return and other recovered statistics....

  17. Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  18. Tools. Unit 9: A Core Curriculum of Related Instruction for Apprentices.

    Science.gov (United States)

    New York State Education Dept., Albany. Bureau of Occupational and Career Curriculum Development.

    The tool handling unit is presented to assist apprentices to acquire a general knowledge on the use of various basic tools. The unit consists of seven modules: (1) introduction to hand tools and small power tools; (2) measuring tools: layout and measuring tools for woodworking; (3) measuring tools: feeler gauge, micrometer, and torque wrench; (4)…

  19. Generalizing entanglement

    Science.gov (United States)

    Jia, Ding

    2017-12-01

    The expected indefinite causal structure in quantum gravity poses a challenge to the notion of entanglement: If two parties are in an indefinite causal relation of being causally connected and not, can they still be entangled? If so, how does one measure the amount of entanglement? We propose to generalize the notions of entanglement and entanglement measure to address these questions. Importantly, the generalization opens the path to study quantum entanglement of states, channels, networks, and processes with definite or indefinite causal structure in a unified fashion, e.g., we show that the entanglement distillation capacity of a state, the quantum communication capacity of a channel, and the entanglement generation capacity of a network or a process are different manifestations of one and the same entanglement measure.

  20. Useful design tools?

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole

    2005-01-01

    vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...... of a new sustainable settlement. The use of design tools is discussed in relation to innovation and stakeholder participation, and it is stressed that the usefulness of design tools is context dependent....

  1. General topology

    CERN Document Server

    Willard, Stephen

    2004-01-01

    Among the best available reference introductions to general topology, this volume is appropriate for advanced undergraduate and beginning graduate students. Its treatment encompasses two broad areas of topology: "continuous topology," represented by sections on convergence, compactness, metrization and complete metric spaces, uniform spaces, and function spaces; and "geometric topology," covered by nine sections on connectivity properties, topological characterization theorems, and homotopy theory. Many standard spaces are introduced in the related problems that accompany each section (340

  2. Ludic Educational Game Creation Tool

    DEFF Research Database (Denmark)

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos

    2015-01-01

    This paper presents initial findings and ongoing work on the game creation tool, a core component of the IOLAOS (IOLAOS in ancient Greece was a divine hero famed for helping with some of Heracles's labors.) platform, a general open authorable framework for educational and training games. The game creation tool features a web editor, where the game narrative can be manipulated according to specific needs. Moreover, this tool is applied to creating an educational game according to a reference scenario, namely teaching schoolchildren road safety. A ludic approach is used both in game creation and play.... Helping children stay safe and preventing serious injury on the roads is crucial. In this context, this work presents an augmented version of the IOLAOS architecture including an enhanced game creation tool and a new multimodality module. In addition it presents a case study for creating educational games...

  3. Generalized polygons

    CERN Document Server

    Maldeghem, Hendrik

    1998-01-01

    This book is intended to be an introduction to the fascinating theory of generalized polygons for both the graduate student and the specialized researcher in the field. It gathers together a lot of basic properties (some of which are usually referred to in research papers as belonging to folklore) and very recent and sometimes deep results. I have chosen a fairly strict geometrical approach, which requires some knowledge of basic projective geometry. Yet, it enables one to prove some typically group-theoretical results such as the determination of the automorphism groups of certain Moufang polygons. As such, some basic group-theoretical knowledge is required of the reader. The notion of a generalized polygon is a relatively recent one. But it is one of the most important concepts in incidence geometry. Generalized polygons are the building bricks of Tits buildings. They are the prototypes and precursors of more general geometries such as partial geometries, partial quadrangles, semi-partial geometries, near...

  4. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  5. General chemistry

    International Nuclear Information System (INIS)

    Kwon, Yeong Sik; Lee, Dong Seop; Ryu, Haung Ryong; Jang, Cheol Hyeon; Choi, Bong Jong; Choi, Sang Won

    1993-07-01

    The book concentrates on the latest general chemistry and is divided into twenty-three chapters. It deals with basic concepts and stoichiometry, the nature of gases, the structure of atoms, quantum mechanics, symbols and the electronic structure of ions and molecules, chemical thermodynamics, the nature of solids, changes of state and liquids, properties of solutions, chemical equilibrium, solutions and acids-bases, equilibria of aqueous solutions, electrochemistry, chemical reaction speed, molecular spectroscopy, hydrogen, oxygen and water, metallic atoms of groups IA, IIA and IIIA, carbon and the group IVA atoms, nonmetal atoms and the inert gases, transition metals, lanthanons and actinoids, nuclear properties and radioactivity, and biochemistry and environmental chemistry.

  6. General relativity

    International Nuclear Information System (INIS)

    Gourgoulhon, Eric

    2013-01-01

    The author proposes a course on general relativity. He first presents the geometrical framework by addressing, presenting and discussing the following notions: relativistic space-time, the metric tensor, world lines, observers, the principle of equivalence and geodesics. In the next part, he addresses gravitational fields with spherical symmetry: presentation of the Schwarzschild metric, radial light geodesics, gravitational spectral shift (Einstein effect), orbits of material objects, and photon trajectories. The next parts address the Einstein equation, black holes, gravitational waves, and cosmological solutions. Appendices propose a discussion of the relationship between relativity and GPS, some problems and their solutions, and Sage codes

  7. General principles

    International Nuclear Information System (INIS)

    Hutchison, J.M.S.; Foster, M.A.

    1987-01-01

    NMR characteristics are not unique - T1 values of tumour tissues overlap with those from multiple sclerosis plaques or from areas of inflammation. Despite this, NMR imaging is an extremely powerful tool for the diagnostician and for other medical uses such as following the course of treatment or planning surgery or radiotherapy. Magnetic resonance imaging (MRI) is often used solely as an anatomical technique similar to X-ray CT. This is certainly an appropriate use for it, and it has certain advantages over X-ray CT, such as the greater ease with which sagittal and coronal sections can be obtained (or other views by suitable manipulation of the gradients). NMR is also less bothered by bone-related artefacts. There are disadvantages in terms of resolution (although this is improving) and of speed of acquisition of the image. The NMR signal, however, derives from a complex interaction of biophysical properties and, if properly used, can yield a considerable amount of information about its origin. The NMR image is capable of much more manipulation than that obtained by X-ray methods and, particularly with the addition of spectroscopy to the repertoire, the authors expect in vivo NMR examinations to yield much metabolic and biophysical information in addition to providing a demonstration of the anatomy of the body

  8. Generalized Multiphoton Quantum Interference

    Directory of Open Access Journals (Sweden)

    Max Tillmann

    2015-10-01

    Nonclassical interference of photons lies at the heart of optical quantum information processing. Here, we exploit tunable distinguishability to reveal the full spectrum of multiphoton nonclassical interference. We investigate this in theory and experiment by controlling the delay times of three photons injected into an integrated interferometric network. We derive the entire coincidence landscape and identify transition matrix immanants as ideally suited functions to describe the generalized case of input photons with arbitrary distinguishability. We introduce a compact description by utilizing a natural basis that decouples the input state from the interferometric network, thereby providing a useful tool for even larger photon numbers.
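    The permanent/immanant description above admits a compact numerical illustration. The sketch below is not from the paper: the balanced three-port "tritter" matrix and the choice of input/output ports are hypothetical examples. It computes a three-fold coincidence probability for fully indistinguishable photons, for which the transition amplitude is the permanent of a submatrix of the interferometer unitary; immanants generalize this to partially distinguishable photons.

```python
# Illustrative sketch (not from the paper): for indistinguishable photons,
# the n-photon coincidence amplitude at chosen output ports is the permanent
# of the corresponding submatrix of the interferometer unitary.
import itertools
import numpy as np

def permanent(m: np.ndarray) -> complex:
    """Permanent via explicit sum over permutations (fine for small n)."""
    n = m.shape[0]
    return sum(
        np.prod([m[i, p[i]] for i in range(n)])
        for p in itertools.permutations(range(n))
    )

# Hypothetical 3x3 unitary (a balanced tritter); photons enter ports 0,1,2
# and a coincidence is detected at output ports 0,1,2.
w = np.exp(2j * np.pi / 3)
tritter = np.array([[1, 1, 1], [1, w, w**2], [1, w**2, w**4]]) / np.sqrt(3)
sub = tritter[np.ix_([0, 1, 2], [0, 1, 2])]  # rows: inputs, cols: outputs
p_coincidence = abs(permanent(sub)) ** 2
print(f"three-fold coincidence probability: {p_coincidence:.4f}")
```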

  9. The power tool

    International Nuclear Information System (INIS)

    HAYFIELD, J.P.

    1999-01-01

    POWER Tool--Planning, Optimization, Waste Estimating and Resourcing tool, a hand-held field estimating unit and relational database software tool for optimizing disassembly and final waste form of contaminated systems and equipment

  10. Digital Tectonic Tools

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due

    2005-01-01

    Tectonics has been an inherent part of the architectural field since the Greek temples, while the digital media is new to the field. This paper is built on the assumption that in the intermediate zone between the two there is a lot to be learned about architecture in general and the digital media in particular. A model of the aspects in the term tectonics – representation, ontology and culture – will be presented and used to discuss the current digital tools' ability in tectonics. Furthermore it will be discussed what a digital tectonic tool is and could be, and how a connection between the digital and the tectonic could become a part of the architectural education.

  11. Preset pivotal tool holder

    Science.gov (United States)

    Asmanes, Charles

    1979-01-01

    A tool fixture is provided for precise pre-alignment of a radiused-edge cutting tool in a tool holder relative to a fixed reference pivot point established on said holder, about which the tool holder may be selectively pivoted relative to the fixture base member to change the contact point of the tool cutting edge with a workpiece while maintaining precisely the same tool cutting radius relative to the reference pivot point.

  12. FASTBUS simulation tools

    International Nuclear Information System (INIS)

    Dean, T.D.; Haney, M.J.

    1991-10-01

    A generalized model of a FASTBUS master is presented. The model is used with simulation tools to aid in the specification, design, and production of FASTBUS slave modules. The model provides a mechanism to interact with the electrical schematics and software models to predict performance. The model is written in the IEEE std 1076-1987 hardware description language VHDL. A model of the ATC logic is also presented. VHDL was chosen to provide portability to various platforms and simulation tools. The models, in conjunction with most commercially available simulators, will perform all of the transactions specified in IEEE std 960-1989. The models may be used to study the behavior of electrical schematics and other software models and detect violations of the FASTBUS protocol. For example, a hardware design of a slave module could be studied, protocol violations detected and corrected before committing money to prototype development. The master model accepts a stream of high level commands from an ASCII file to initiate FASTBUS transactions. The high level command language is based on the FASTBUS standard routines listed in IEEE std 1177-1989. Using this standard-based command language to direct the model of the master, hardware engineers can simulate FASTBUS transactions in the language used by physicists and programmers to operate FASTBUS systems. 15 refs., 6 figs
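    To make the command-driven design concrete, here is a minimal sketch, in Python rather than VHDL, of the idea of driving a master model from a stream of high-level ASCII commands. The command names FB_READ and FB_WRITE are hypothetical stand-ins, not the actual routine names defined in IEEE Std 1177-1989.

```python
# Minimal sketch of an ASCII-command-driven master model (hypothetical
# command names; illustrative only, not the FASTBUS standard routines).
from typing import Callable, Dict, List

def fb_read(args: List[str]) -> None:
    primary, secondary = (int(a, 0) for a in args)
    print(f"read  primary=0x{primary:02x} secondary=0x{secondary:02x}")

def fb_write(args: List[str]) -> None:
    primary, secondary, data = (int(a, 0) for a in args)
    print(f"write primary=0x{primary:02x} secondary=0x{secondary:02x} data=0x{data:08x}")

DISPATCH: Dict[str, Callable[[List[str]], None]] = {
    "FB_READ": fb_read,
    "FB_WRITE": fb_write,
}

def run_script(lines: List[str]) -> None:
    """Parse each line into a command plus arguments, then dispatch it."""
    for line in lines:
        line = line.split("#", 1)[0].strip()   # drop comments and blanks
        if not line:
            continue
        cmd, *args = line.split()
        DISPATCH[cmd.upper()](args)

run_script([
    "FB_WRITE 0x2A 0x00 0xDEADBEEF  # load a register",
    "FB_READ  0x2A 0x00",
])
```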

  13. Generalizing quasinormality

    Directory of Open Access Journals (Sweden)

    John Cossey

    2015-03-01

    Quasinormal subgroups have been studied for nearly 80 years. In finite groups, questions concerning them invariably reduce to p-groups, and here they have the added interest of being invariant under projectivities, unlike normal subgroups. However, it has been shown recently that certain groups, constructed by Berger and Gross in 1982, of an important universal nature with regard to the existence of core-free quasinormal subgroups generally, have remarkably few such subgroups. Therefore, in order to overcome this misfortune, a generalization of the concept of quasinormality will be defined. It could be the beginning of a lengthy undertaking. But some of the initial findings are encouraging, in particular the fact that this larger class of subgroups also remains invariant under projectivities of finite p-groups, thus connecting group and subgroup lattice structures.

  14. General report

    International Nuclear Information System (INIS)

    Nicklisch, F.

    1984-01-01

    Growing complexity of technical matter has meant that technical expertise is called upon in more and more legal proceedings. The technical expert is, in general terms, the mediator between technology and the law; he is also entrusted with the task of pointing up the differences in approach and in the nature of authority in these two areas and thus paving the way for mutual understanding. The evaluation of the technical expert's opinion is one of the cardinal problems bound up with the role of the expert in legal procedure. After the presentation of the expert's opinion, the judge is supposed to possess so much specialised knowledge that he can assess the opinion itself in scientific and technical respects and put his finger on any errors the expert may have made. This problem can only be solved via an assessment opinion. First of all, the opinion can be assessed indirectly via evaluation of the credentials and the neutrality and independence of the expert. In direct terms, the opinion can be subjected to a certain - albeit restricted - scrutiny as to whether it is generally convincing, as far as the layman is competent to judge. This interpretation alone makes it possible to classify and integrate legally the technical standards and regulations, which represent expert statements on scientific and technical matters based on the knowledge and experience gained in a given area. They are designed to reflect prevailing opinion among leading representatives of the profession and can thus themselves be regarded as expert opinions. As a rule, these opinions will have such weight that - other than in exceptional cases - they will not be invalidated in procedure by deviating opinions from individual experts. (orig./HSCH) [de

  15. Program plan recognition for year 2000 tools

    NARCIS (Netherlands)

    A. van Deursen (Arie); S. Woods; A. Quilici

    1997-01-01

    There are many commercial tools that address various aspects of the Year 2000 problem. None of these tools, however, makes any documented use of plan-based techniques for automated concept recovery. This implies a general perception that plan-based techniques are not useful for this

  16. Dutch Risk Assessment tools

    NARCIS (Netherlands)

    Venema, A.

    2015-01-01

    The ‘Risico- Inventarisatie- en Evaluatie-instrumenten’ is the name for the Dutch risk assessment (RA) tools. A RA tool can be used to perform a risk assessment including an evaluation of the identified risks. These tools were among the first online risk assessment tools developed in Europe. The

  17. New QC 7 tools

    International Nuclear Information System (INIS)

    1982-03-01

    This book covers the new QC 7 tools, including TQC and the use of the new QC 7 tools for improvement: what the QC way of thinking is; what the new QC 7 tools are, such as the KJ method, the PDPC method, the arrow diagram method, and the matrix diagram method; applications of the new QC 7 tools, such as the fields in which they apply and their application to policy management; the methodology of the new QC 7 tools, including related techniques such as the KJ method, matrix data analysis, and the PDPC method; and education in and introduction of the new QC 7 tools.

  18. Tools to improve Angra 1/2 general training program

    International Nuclear Information System (INIS)

    Barroso, Haroldo Jr.

    2003-01-01

    Since Brazil restarted the Angra 2 construction in 1995, as a result of studies of future energy consumption, the Training Department of Eletronuclear has developed the training program for site personnel. This new situation has demanded additional efforts and new routines. The following paragraphs describe significant aspects of this effort. Most of them are now under discussion in the Training Department, and some alternative solutions are being adopted in order to face the new challenges. (author)

  19. Data Reduction of Laser Ablation Split-Stream (LASS) Analyses Using Newly Developed Features Within Iolite: With Applications to Lu-Hf + U-Pb in Detrital Zircon and Sm-Nd +U-Pb in Igneous Monazite

    Science.gov (United States)

    Fisher, Christopher M.; Paton, Chad; Pearson, D. Graham; Sarkar, Chiranjeeb; Luo, Yan; Tersmette, Daniel B.; Chacko, Thomas

    2017-12-01

    A robust platform to view and integrate multiple data sets collected simultaneously is required to realize the utility and potential of the Laser Ablation Split-Stream (LASS) method. This capability, until now, has been unavailable, and practitioners have had to laboriously process each data set separately, making it challenging to take full advantage of the benefits of LASS. We describe a new program for handling multiple mass spectrometric data sets collected simultaneously, designed specifically for the LASS technique, by which a laser aerosol is split into two or more separate "streams" to be measured on separate mass spectrometers. New features within Iolite (https://iolite-software.com) enable loading, synchronizing, viewing, and reducing two or more data sets acquired simultaneously, as multiple DRSs (data reduction schemes) can be run concurrently. While this version of Iolite accommodates any combination of simultaneously collected mass spectrometer data, we demonstrate the utility using case studies where U-Pb and Lu-Hf isotope composition of zircon, and U-Pb and Sm-Nd isotope composition of monazite, were analyzed simultaneously, in crystals showing complex isotopic zonation. These studies demonstrate the importance of being able to view and integrate simultaneously acquired data sets, especially for samples with complicated zoning and decoupled isotope systematics, in order to extract accurate and geologically meaningful isotopic and compositional data. This contribution provides instructions and examples for handling simultaneously collected laser ablation data. An instructional video is also provided. The updated Iolite software will help to fully develop the applications of both LASS and multi-instrument mass spectrometric measurement capabilities.
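    As an illustration of the core synchronization task described above (a sketch of the general approach, not Iolite's implementation), the following Python fragment places two simultaneously acquired streams with different sampling rates and clock offsets onto a common time axis before integrating the same ablation interval in both; all signal values are invented.

```python
# Sketch: align two simultaneously acquired mass-spectrometer streams
# (different rates and clock offsets) on a shared time axis, then
# integrate the same ablation window in both. Data are synthetic.
import numpy as np

t_upb = np.arange(0.00, 30.0, 0.50)            # U-Pb instrument, 2 Hz
t_hf  = np.arange(0.13, 30.0, 0.25)            # Lu-Hf instrument, 4 Hz
sig_upb = np.exp(-t_upb / 20.0) + 0.01 * np.random.default_rng(0).normal(size=t_upb.size)
sig_hf  = np.exp(-t_hf  / 20.0) + 0.01 * np.random.default_rng(1).normal(size=t_hf.size)

# Resample both onto a shared axis by linear interpolation.
t_common = np.arange(0.5, 29.5, 0.5)
upb_on_common = np.interp(t_common, t_upb, sig_upb)
hf_on_common  = np.interp(t_common, t_hf,  sig_hf)

# Integrate the same ablation interval in both streams.
window = (t_common >= 5.0) & (t_common <= 25.0)
print(f"mean U-Pb signal : {upb_on_common[window].mean():.4f}")
print(f"mean Lu-Hf signal: {hf_on_common[window].mean():.4f}")
```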

  20. Non-commutative tools for topological insulators

    International Nuclear Information System (INIS)

    Prodan, Emil

    2010-01-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.

  1. Ootw Tool Requirements in Relation to JWARS

    Energy Technology Data Exchange (ETDEWEB)

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical modeling and simulation (M&S) tools, and which should be left for independent development.

  2. Grid sleeve bulge tool

    International Nuclear Information System (INIS)

    Phillips, W.D.; Vaill, R.E.

    1980-01-01

    An improved grid sleeve bulge tool is designed for securing control rod guide tubes to sleeves brazed in a fuel assembly grid. The tool includes a cylinder having an outer diameter less than the internal diameter of the control rod guide tubes. The walls of the cylinder are cut in an axial direction along its length to provide several flexible tines or ligaments. These tines are similar to a fork except they are spaced in a circumferential direction. The end of each alternate tine is equipped with a semispherical projection which extends radially outwardly from the tine surface. A ram or plunger of generally cylindrical configuration and about the same length as the cylinder is designed to fit in and move axially within the cylinder and thereby force the tined projections outwardly when the ram is pulled into the cylinder. The ram surface includes axially extending grooves and plane surfaces which are complementary to the inner surfaces formed on the tines of the cylinder. As the cylinder is inserted into a control rod guide tube, and the projections on the cylinder placed in a position just below or above a grid strap, the ram is pulled into the cylinder, thus moving the tines and the projections thereon outwardly into contact with the sleeve, to plastically deform both the sleeve and the control rod guide tube, and thereby form four bulges which extend outwardly from the sleeve surface and beyond the outer periphery of the grid peripheral strap. This process is then repeated at the points above the grid to also provide for outwardly projecting surfaces, the result being that the grid is accurately positioned on and mechanically secured to the control rod guide tubes which extend the length of a fuel assembly

  3. Nanocomposites for Machining Tools

    Directory of Open Access Journals (Sweden)

    Daria Sidorenko

    2017-10-01

    Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the elaboration of composite materials for machining tools can reduce the financial and time costs for development of new tools, with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance.

  4. Tool grinding machine

    Science.gov (United States)

    Dial, Sr., Charles E.

    1980-01-01

    The present invention relates to an improved tool grinding mechanism for grinding single point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface with this displacement being monitored so that any variation in the grinding of the cutting surface such as caused by crystal orientation or tool thickness may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  5. Improved tool grinding machine

    Science.gov (United States)

    Dial, C.E. Sr.

    The present invention relates to an improved tool grinding mechanism for grinding single point diamond cutting tools to precise roundness and radius specifications. The present invention utilizes a tool holder which is longitudinally displaced with respect to the remainder of the grinding system due to contact of the tool with the grinding surface with this displacement being monitored so that any variation in the grinding of the cutting surface such as caused by crystal orientation or tool thicknesses may be compensated for during the grinding operation to assure the attainment of the desired cutting tool face specifications.

  6. 19 CFR 10.502 - General definitions.

    Science.gov (United States)

    2010-04-01

    ... States or Singapore, including: (1) Fuel and energy; (2) Tools, dies, and molds; (3) Spare parts and... under the rules of origin set out in SFTA Chapter Three (Rules of Origin) and General Note 25, HTSUS; (l...

  7. Tools of online Marketing

    OpenAIRE

    Hossain, M. S.; Rahman, M. F.

    2017-01-01

    Online marketing is the most crucial issue in the modern marketing era, but no previous research had identified the tools of internet marketing before this study; it was the first study in the field of online marketing tools. This research was descriptive in nature, and it attempted to identify the major tools of internet marketing from the concepts of traditional marketing tools. The worldwide network known as the Internet can exchange information between use...

  8. OOTW COST TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    HARTLEY, D.S.III; PACKARD, S.L.

    1998-09-01

    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for OOTWs. It also recommends modifications to be included in future versions of COST and the development of an OOTW mission planning tool to supply valid input for costing.

  9. Recruitment of general practices

    DEFF Research Database (Denmark)

    Riis, Allan; Jensen, Cathrine Elgaard; Maindal, Helle Terkildsen

    2016-01-01

    Introduction: Health service research often involves the active participation of healthcare professionals. However, their ability and commitment to research varies. This can cause recruitment difficulties and thereby prolong the study period and inflate budgets. Solberg has identified seven R-factors as determinants for successfully recruiting healthcare professionals: relationships, reputation, requirements, rewards, reciprocity, resolution, and respect. Method: This is a process evaluation of the seven R-factors. We applied these factors to guide the design of our recruitment strategy as well as to make adjustments when recruiting general practices in a guideline implementation study. In the guideline implementation study, we studied the effect of outreach visits, quality reports, and new patient stratification tools for low back pain patients. Results: During a period of 15 months, we recruited 60 practices

  10. Pro Tools HD

    CERN Document Server

    Camou, Edouard

    2013-01-01

    An easy-to-follow guide for using Pro Tools HD 11 effectively. This book is ideal for anyone who already uses Pro Tools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  11. Nanocomposites for Machining Tools

    DEFF Research Database (Denmark)

    Sidorenko, Daria; Loginov, Pavel; Mishnaevsky, Leon

    2017-01-01

    Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials...

  12. Java Radar Analysis Tool

    Science.gov (United States)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.
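    The 3D-viewing capability attributed to JRAT boils down to a standard rotate-and-project operation. The sketch below (illustrative Python, not JRAT source; the radar-return coordinates are invented) rotates a point cloud to an arbitrary yaw/pitch viewing angle and projects it onto the screen plane.

```python
# Sketch of the rotate-and-project step behind an arbitrary 3D viewing
# angle (illustrative only; coordinates are invented).
import numpy as np

def view(points: np.ndarray, yaw_deg: float, pitch_deg: float) -> np.ndarray:
    """Rotate Nx3 points about z (yaw) then x (pitch); return Nx2 projection."""
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
    rx = np.array([[1.0, 0.0,           0.0           ],
                   [0.0, np.cos(pitch), -np.sin(pitch)],
                   [0.0, np.sin(pitch),  np.cos(pitch)]])
    rotated = points @ (rx @ rz).T     # apply yaw, then pitch
    return rotated[:, :2]              # orthographic projection onto screen

# Hypothetical radar-return positions (east, north, altitude in km).
returns = np.array([[10.0, 2.0, 8.5], [10.4, 2.1, 8.2], [11.0, 2.3, 7.6]])
print(view(returns, yaw_deg=30.0, pitch_deg=60.0))
```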

  13. Perception as a Tool

    Directory of Open Access Journals (Sweden)

    Jovana Komnenič

    2014-03-01

    The article presents a project of providing guidelines on art education for the blind and visually impaired, which was entitled Perception as a Tool and presented at the Berlin Biennale on 6 October 2010. It focuses on potential aspects of art education with regard to people with special needs and seeks to discover what happens with art if we cannot see it. This approach to art education combines elements of conventional tours of exhibitions and involves the participants through play. The methods that were used in our work included establishing dramatic tension and insecurity in the group as well as mutual trust by relying on different resources, including sensory perception, personal biography and different forms of knowledge and skills. A major part of the project is finding hidden, invisible or forgotten stories that are not directly linked to the exhibition, alongside the aspects directly related to the exhibition. Such a generally inclusive approach enabled us to formulate political questions on the issue of 'invisibility'.

  14. Pickering tool management system

    International Nuclear Information System (INIS)

    Wong, E.H.; Green, A.H.

    1997-01-01

    Tools were being deployed in the station with no process in effect to ensure that they were maintained in good repair so as to effectively support the performance of maintenance activities. Today's legal requirements require that all employers have a process in place to ensure that tools are maintained in a safe condition; this is specified in the Ontario Health and Safety Act. The Pickering Tool Management System has been chosen as the process at Pickering ND to manage tools. Tools are identified by number etching and bar codes. The system is a Windows application installed on several file servers

  15. Machine tool structures

    CERN Document Server

    Koenigsberger, F

    1970-01-01

    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  16. An Interactive Visual Analytics Tool for NASA's General Mission Analysis Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of any spacecraft trajectory design process is to identify a path that transfers a vehicle from its point of origin to some specific destination in the...

  17. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    Science.gov (United States)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source nor sense of unified style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive is prohibitive to new researchers entering the field, as well as a remaining obstacle for established groups hoping to contribute in a comparable manner to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundation of these software tools and libraries exists within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.
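    Among the listed capabilities, limb darkening is the easiest to illustrate compactly. The sketch below implements the textbook quadratic limb-darkening law I(mu)/I(1) = 1 - u1(1 - mu) - u2(1 - mu)^2; the coefficients are invented examples, and this is not ExoCTK code.

```python
# Quadratic limb-darkening law (textbook form; example coefficients only):
#   I(mu)/I(1) = 1 - u1*(1 - mu) - u2*(1 - mu)**2, with mu = cos(theta)
import numpy as np

def quadratic_ld(mu: np.ndarray, u1: float, u2: float) -> np.ndarray:
    """Stellar intensity relative to disk centre as a function of mu."""
    return 1.0 - u1 * (1.0 - mu) - u2 * (1.0 - mu) ** 2

mu = np.linspace(0.05, 1.0, 5)           # disk edge -> disk centre
print(quadratic_ld(mu, u1=0.40, u2=0.25))
```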

  18. Description of Ethical Bio-Technology Assessment Tools for Agriculture and Food Production. Interim Report Ethical Bio-TA Tools

    NARCIS (Netherlands)

    Beekman, V.

    2004-01-01

    The objective of the 'Ethical Bio-TA Tools' project is to develop and improve tools for the ethical assessment of new technologies in agriculture and food production in general and modern biotechnologies in particular. The developed tools need to be designed for various purposes and contexts. They

  19. General Relativity in (1 + 1) Dimensions

    Science.gov (United States)

    Boozer, A. D.

    2008-01-01

    We describe a theory of gravity in (1 + 1) dimensions that can be thought of as a toy model of general relativity. The theory should be a useful pedagogical tool, because it is mathematically much simpler than general relativity but shares much of the same conceptual structure; in particular, it gives a simple illustration of how gravity arises…

  20. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will 'raise the interoperability bar' as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase 'code is king' underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  1. A Bayesian Network as a tool to measure Supply Chain Resilience

    NARCIS (Netherlands)

    Wagenberg, van C.P.A.; Aramyan, L.H.; Lauwere, de C.C.; Gielen-Meuwissen, M.P.M.; Timmer, M.J.; Willems, D.J.M.

    2018-01-01

    Resilience frameworks and tools are generally qualitative. The Diagnostic Tool presented in this paper provides a quantitative tool in the area of supply chain resilience. It is among the first tools to quantify the complex concept of resilience. We apply the tool to a sustainable pork chain in the
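    To make the quantification idea concrete, here is a minimal sketch of how a discrete Bayesian network yields a resilience probability by summing over parent states. The two-parent structure and all probabilities are invented for illustration; the paper's actual network for the pork chain is far richer.

```python
# Toy discrete Bayesian network (invented structure and probabilities):
# Redundancy (R) and Flexibility (F) are parents of Resilience.
P_R = {True: 0.6, False: 0.4}                 # P(redundant supplier base)
P_F = {True: 0.5, False: 0.5}                 # P(flexible logistics)
P_RES = {                                     # P(resilient | R, F)
    (True, True): 0.90, (True, False): 0.65,
    (False, True): 0.55, (False, False): 0.20,
}

# Marginal P(resilient): sum the joint probability over parent states.
p_resilient = sum(
    P_R[r] * P_F[f] * P_RES[(r, f)]
    for r in (True, False)
    for f in (True, False)
)
print(f"P(chain is resilient) = {p_resilient:.3f}")
```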

  2. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    The two tools are separate and intended for different documentation purposes, yet they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which - in a systematic way - makes an XML language available as named functions in Scheme. Finally, the Scheme Elucidator is able to integrate SchemeDoc resources as part of an internal documentation resource.

  3. Generalized fractional supersymmetry associated to different species of anyons

    International Nuclear Information System (INIS)

    Douari, Jamila; Abdus Salam International Centre for Theoretical Physics, Trieste; Hassouni, Yassine

    2001-01-01

    We consider multiple species of anyons characterized by different statistical parameters. First, we redefine the anyonic algebra and then generalize this definition by constructing the anyonic superalgebra. Finally, we use these tools to generalize the fractional supersymmetry already discussed. (author)

  4. Practical Implementation of Sustainable Urban Management Tools

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Jensen, Jesper Ole; Hoffmann, Birgitte

    2006-01-01

    The paper discusses how to promote the use of decision support tools for urban sustainable development. The interest in decision support tools based on indicators is increasing among practitioners and researchers. The research has so far focused on indicator types and systems of indicators and goals for urban sustainability, whereas less focus has been on the context of implementation and even less on what we can learn from practical experiences about the usefulness of urban sustainable indicator tools. This paper explores the practical implementation of urban sustainable management tools. It is generally agreed that in order to make indicators and other sustainability management tools work, it is necessary that they are integrated in the relevant urban organisational levels, in a way that creates commitment to the subsequent goals. This includes involvement of organisations, individuals and other...

  5. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  6. Instant Spring Tool Suite

    CERN Document Server

    Chiang, Geoff

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite, using well-defined sections for the different parts of Spring. Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  7. Chimera Grid Tools

    Science.gov (United States)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  8. Easy QC 7 tools

    International Nuclear Information System (INIS)

    1981-04-01

    This book explains the methods of the QC 7 tools, the mindset for using them, and the effects of applying them. It describes graphs; the Pareto diagram, including how to draw it and how to use it; the characteristic (cause-and-effect) diagram; check sheets, covering their purpose and subjects, goals and types, and points of use; histograms and their application; stratification; scatter plots; control charts; and methods of promotion and improvement, with practical cases of QC tool use.
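    Of the seven tools, the Pareto diagram is the most readily illustrated in code. The sketch below (invented defect counts) performs the underlying Pareto analysis: sort categories by frequency and accumulate percentages to expose the "vital few" causes.

```python
# Pareto analysis behind the Pareto diagram (defect counts are invented).
defects = {"scratches": 48, "misalignment": 21, "cracks": 9,
           "discoloration": 7, "other": 5}

total = sum(defects.values())
cumulative = 0
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{cause:15s} {count:3d}  cumulative {100 * cumulative / total:5.1f}%")
```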

  9. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  10. Easy QC 7 tools

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-04-15

    This book explains the methods of the QC 7 tools, the mindset for using them, and the effects of applying them. It describes graphs; the Pareto diagram, including how to draw it and how to use it; the characteristic (cause-and-effect) diagram; check sheets, covering their purpose and subjects, goals and types, and points of use; histograms and their application; stratification; scatter plots; control charts; and methods of promotion and improvement, with practical cases of QC tool use.

  11. Qlikview Audit Tool (QLIKVIEW) -

    Data.gov (United States)

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  12. TFV as a strategic tool

    DEFF Research Database (Denmark)

    Bonke, Sten; Bertelsen, Sven

    2011-01-01

    The paper investigates the use of the Transformation-Flow-Value theory as a strategic tool in the development of the project production firm. When producing products such as ships, focus on value more than on cost may be the best approach, but in service industries such as construction, focus on flow may often be a far better approach than just looking at the costs. The paper presents a simple, general financial model to support this argument and not least to assist the reader in conducting similar analyses in his own company.

  13. PROMOTION OF PRODUCTS AND ANALYSIS OF MARKET OF POWER TOOLS

    Directory of Open Access Journals (Sweden)

    Sergey S. Rakhmanov

    2014-01-01

    The article describes the general situation of the power tool market, both in Russia and in the world. It offers a comparative analysis of competitors, an analysis of the structure of the power tool market, and an assessment of the competitiveness of some major product lines. It also analyses the promotion methods used by companies selling tools, with a competitive analysis of the product range of Bosch, the leader in its segment of power tools available on the Russian market.

  14. The Innsbruck/ESO sky models and telluric correction tools*

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

    While ground-based astronomical observatories just have to correct for the line-of-sight integral of these effects, the Čerenkov telescopes use the atmosphere as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus, the possible application of the method for Čerenkov telescopes.

  15. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization; Detailed Design; Management; Construction; and Commissioning. (orig.) [de

  16. Data Tools and Apps

    Science.gov (United States)

    Our statistics highlight trends in household statistics from multiple surveys, covering topics such as employment and payroll, business ownership, and work from home. Business Dynamics Statistics: this tool shows tabulations on establishments, firms, and employment with

  17. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in 't Veld, M. M.A.; Boogaard, S.A.A. van den

    2007-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation -meetings- by means of a set of

  18. Maailma suurim tool

    Index Scriptorium Estoniae

    2000-01-01

    AS Tartu Näitused, the Tartu Kunstikool art school, and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in pavilion I of the Tartu fair centre on 9-11 March. 2,000 chairs will be exhibited, from which a TOP 12 will be selected. The world's largest chair is planned to be erected on the grounds of the fair centre. At the same time, pavilion II hosts the twin fairs 'Sisustus 2000' ('Furnishings 2000') and 'Büroo 2000' ('Office 2000').

  19. Design mentoring tool.

    Science.gov (United States)

    2011-01-01

    In 2004 a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers in mentoring new engineers in the INDOT design process and to improve their technical competency. This approach saves se...

  20. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in't Veld, M.A.A.; Boogaard, S.A.A. van den

    2008-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation - meetings - by means of a set

  1. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  2. Expert tool use

    DEFF Research Database (Denmark)

    Thorndahl, Kathrine Liedtke; Ravn, Susanne

    2017-01-01

    According to some phenomenologists, a tool can be experienced as incorporated when, as a result of habitual use or deliberate practice, someone is able to manipulate it without conscious effort. In this article, we specifically focus on the experience of expert tool use in elite sport. Based on a case study of elite rope skipping, we argue that the phenomenological concept of incorporation does not suffice to adequately describe how expert tool users feel when interacting with their tools. By analyzing a combination of insights gained from participant observation of 11 elite rope skippers and autoethnographic material from one former elite skipper, we take some initial steps toward the development of a more nuanced understanding of the concept of incorporation; one that is able to accommodate the experiences of expert tool users. In sum, our analyses indicate that the possibility for experiencing...

  3. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm's leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  4. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.
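    The matching process described here can be sketched as a constrained assignment: choose force units whose capabilities cover the operation's tasks without exceeding availability. The sketch below (invented unit names, capabilities, and task list) uses a simple greedy cover to illustrate the idea; real force-design tools would of course be far more elaborate.

```python
# Toy greedy capability-to-task matcher (illustrative only; unit names,
# capabilities, and tasks are invented, not from the report).
units = {
    "engineer_bn":   {"route_clearing", "construction"},
    "medical_co":    {"medical_care"},
    "mp_co":         {"convoy_escort", "crowd_control"},
    "civil_affairs": {"liaison", "assessment"},
}
availability = {"engineer_bn": 2, "medical_co": 3, "mp_co": 2, "civil_affairs": 1}
tasks = {"route_clearing", "medical_care", "convoy_escort", "assessment"}

force = []
uncovered = set(tasks)
while uncovered:
    # Pick the available unit that covers the most still-uncovered tasks.
    best = max(
        (u for u in units if availability[u] > 0),
        key=lambda u: len(units[u] & uncovered),
        default=None,
    )
    if best is None or not units[best] & uncovered:
        break                      # no remaining unit helps
    force.append(best)
    availability[best] -= 1
    uncovered -= units[best]

print("force package:", force)
print("unmet tasks:", uncovered or "none")
```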

  5. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  6. Modern Canonical Quantum General Relativity

    Science.gov (United States)

    Thiemann, Thomas

    2008-11-01

    Preface; Notation and conventions; Introduction; Part I. Classical Foundations, Interpretation and the Canonical Quantisation Programme: 1. Classical Hamiltonian formulation of general relativity; 2. The problem of time, locality and the interpretation of quantum mechanics; 3. The programme of canonical quantisation; 4. The new canonical variables of Ashtekar for general relativity; Part II. Foundations of Modern Canonical Quantum General Relativity: 5. Introduction; 6. Step I: the holonomy-flux algebra P; 7. Step II: quantum *-algebra A; 8. Step III: representation theory of A; 9. Step IV: implementation and solution of the kinematical constraints; 10. Step V: implementation and solution of the Hamiltonian constraint; 11. Step VI: semiclassical analysis; Part III. Physical Applications: 12. Extension to standard matter; 13. Kinematical geometrical operators; 14. Spin foam models; 15. Quantum black hole physics; 16. Applications to particle physics and quantum cosmology; 17. Loop quantum gravity phenomenology; Part IV. Mathematical Tools and their Connection to Physics: 18. Tools from general topology; 19. Differential, Riemannian, symplectic and complex geometry; 20. Semianalytical category; 21. Elements of fibre bundle theory; 22. Holonomies on non-trivial fibre bundles; 23. Geometric quantisation; 24. The Dirac algorithm for field theories with constraints; 25. Tools from measure theory; 26. Elementary introduction to Gel'fand theory for Abelian C* algebras; 27. Bohr compactification of the real line; 28. Operator *-algebras and spectral theorem; 29. Refined algebraic quantisation (RAQ) and direct integral decomposition (DID); 30. Basics of harmonic analysis on compact Lie groups; 31. Spin network functions for SU(2); 32. Functional analytical description of classical connection dynamics; Bibliography; Index.

  7. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electric machines, ranging from generators to motors: the motor as a power source of the machine tool, and electric devices for machine tools such as switches in the main circuit, automatic devices, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part handles wiring diagrams, covering the basic electric circuits of machine tools and the wiring diagrams of machines such as milling machines, planers, and grinding machines. The third part introduces fault diagnosis of machines, giving practical solutions according to the fault diagnosis and the diagnostic method with voltage and resistance measurements using a tester.

  8. Machine tool evaluation

    International Nuclear Information System (INIS)

    Lunsford, B.E.

    1976-01-01

    Continued improvements in numerical control (NC) units and in the mechanical components used in the construction of today's machine tools necessitate the use of more precise instrumentation to calibrate and determine the capabilities of these systems. It is now necessary to calibrate most tape-control lathes to a tool-path positioning accuracy of ±300 microinches over the full slide travel and, on some special turning and boring machines, a capability of ±100 microinches must be achieved. The use of a laser interferometer to determine tool-path capabilities is described

  9. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  10. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  11. Tool Inventory and Replacement

    Science.gov (United States)

    Bear, W. Forrest

    1976-01-01

    Vocational agriculture teachers are encouraged to evaluate curriculum offerings and new trends in business and industry, and to develop a master tool purchase and replacement plan over a 3- to 5-year period. (HD)

  12. ATO Resource Tool -

    Data.gov (United States)

    Department of Transportation — Cru-X/ART is a shift management tool designed for?use by operational employees in Air Traffic Facilities.? Cru-X/ART is used for shift scheduling, shift sign in/out,...

  13. Water Budget Tool

    Science.gov (United States)

    If you're designing a new landscape or rethinking your current one, the WaterSense Water Budget Tool will tell you if you have designed a landscape that will use an appropriate amount of water for your climate.
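    A landscape water budget of this kind is, at its core, a comparison of design water demand against a climate-based allowance. The sketch below shows the general shape of such a calculation under assumed, simplified inputs: the evapotranspiration figures, plant factors, and 0.7 allowance factor are illustrative, not EPA's exact method.

```python
# Simplified landscape water-budget sketch (assumed formula/coefficients;
# not the WaterSense tool's exact method).
PEAK_ET0_IN = 7.0          # hypothetical peak-month reference ET (inches)
RAINFALL_IN = 1.0          # hypothetical peak-month effective rainfall (inches)
GAL_PER_SQFT_IN = 0.623    # gallons per square foot per inch of water

def monthly_gallons(area_sqft: float, plant_factor: float) -> float:
    """Water need for one zone: ET scaled by plant factor, less rainfall."""
    demand_in = max(PEAK_ET0_IN * plant_factor - RAINFALL_IN, 0.0)
    return demand_in * area_sqft * GAL_PER_SQFT_IN

zones = [(1200.0, 0.8), (600.0, 0.3)]   # (area, plant factor): turf, shrubs
design_need = sum(monthly_gallons(a, pf) for a, pf in zones)
allowance = monthly_gallons(sum(a for a, _ in zones), 0.7)  # budget baseline
print(f"design need {design_need:,.0f} gal vs allowance {allowance:,.0f} gal")
```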

  14. Neighborhood Mapping Tool

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  15. Sequence History Update Tool

    Science.gov (United States)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable time and effort savings. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the tool provides a more accurate archival record of the sequence commanding for MRO.

  16. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs, while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from the respective tool websites, tool developers, and author experience.

  17. Financing Alternatives Comparison Tool

    Science.gov (United States)

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It produces a comprehensive analysis that compares various financing options.

  18. Breastfeeding assessment tools

    International Nuclear Information System (INIS)

    Bizouerne, Cécile; Kerac, Marko; Macgrath, Marie

    2014-01-01

    Full text: Breastfeeding plays a major role in reducing the global burden of child mortality and under-nutrition. While many programmes aim to support breastfeeding and prevent feeding problems from occurring, interventions are also needed once problems have developed. In this situation, accurate assessment of a problem is critical to inform prognosis and enable tailored, appropriate treatment. The presentation will describe a review that aims to identify breastfeeding assessment tools/checklists for use in assessing malnourished infants in resource-poor settings. The literature review identified 24 breastfeeding assessment tools and 41 validation studies. The evidence underpinning most of the tools was of low quality and was mainly generated in high-income countries and hospital settings. The presentation will describe the main findings of the literature review and propose recommendations for improving existing tools in order to appropriately assess malnourished infants and enable early, appropriate intervention and treatment of malnutrition. (author)

  19. Personal Wellness Tools

    Science.gov (United States)

    With this tool, you can track key health trends related to the following: overall mood, mood disorder ...

  20. Cash Reconciliation Tool

    Data.gov (United States)

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create an...

  1. Chemical Data Access Tool

    Data.gov (United States)

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  2. Chatter and machine tools

    CERN Document Server

    Stone, Brian

    2014-01-01

    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  3. Learning Design Tools

    NARCIS (Netherlands)

    Griffiths, David; Blat, Josep; Garcia, Rocío; Vogten, Hubert; Kwong, KL

    2005-01-01

    Griffiths, D., Blat, J., Garcia, R., Vogten, H. & Kwong, KL. (2005). Learning Design Tools. In: Koper, R. & Tattersall, C., Learning Design: A Handbook on Modelling and Delivering Networked Education and Training (pp. 109-136). Berlin-Heidelberg: Springer Verlag.

  4. Clean Energy Finance Tool

    Science.gov (United States)

    State and local governments interested in developing a financing program can use this Excel tool to support energy efficiency and clean energy improvements for large numbers of buildings within their jurisdiction.

  5. Mapping Medicare Disparities Tool

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Minority Health has designed an interactive map, the Mapping Medicare Disparities Tool, to identify areas of disparities between subgroups of...

  6. Recovery Action Mapping Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  7. Friction stir welding tool

    Science.gov (United States)

    Tolle, Charles R.; Clark, Denis E.; Barnes, Timothy A. [Ammon, ID]

    2008-04-15

    A friction stir welding tool is described which includes a shank portion; a shoulder portion that is releasably engageable with the shank portion; and a pin that is releasably engageable with the shoulder portion.

  8. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Given the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise the physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added to HEP experiments as an afterthought. In the CMS offline project we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to use the CMS computing system effectively. The CMS web tools project aims to provide a consistent interface to all these tools.

  9. Quality management tool

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Tae Hun

    2011-09-15

    This book introduces the basic concepts of quality (characteristics, price, cost, and function); the basic concepts of quality management; the introduction and operation of quality management; quality assurance and claims, including the handling of claims on goods, standards, and quality assurance methods; the basic tools of quality management, such as the Pareto diagram, cause-and-effect (fishbone) diagram, check sheet, histogram, scatter diagram, graph, and stratification; the new seven tools of QC; quality function deployment; and measurement systems.

  10. Quality management tool

    International Nuclear Information System (INIS)

    Lee, Tae Hun

    2011-09-01

    This book introduces the basic concepts of quality (characteristics, price, cost, and function); the basic concepts of quality management; the introduction and operation of quality management; quality assurance and claims, including the handling of claims on goods, standards, and quality assurance methods; the basic tools of quality management, such as the Pareto diagram, cause-and-effect (fishbone) diagram, check sheet, histogram, scatter diagram, graph, and stratification; the new seven tools of QC; quality function deployment; and measurement systems.

  11. Stochastic tools in turbulence

    CERN Document Server

    Lumley, John L.

    2012-01-01

    Stochastic Tools in Turbulence discusses the available mathematical tools to describe stochastic vector fields to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly, on three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, lack of correlation. The book also explains the significance of the moments, the properties of the

  12. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasedena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and to interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Given the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise the physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added to HEP experiments as an afterthought. In the CMS offline project we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to use the CMS computing system effectively. The CMS web tools project aims to provide a consistent interface to all these tools.

  13. Tools used for hand deburring

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, L.K.

    1981-03-01

    This guide is designed to help in quick identification of those tools most commonly used to deburr hand size or smaller parts. Photographs and textual descriptions are used to provide rapid yet detailed information. The data presented include the Bendix Kansas City Division coded tool number, tool description, tool crib in which the tool can be found, the maximum and minimum inventory requirements, the cost of each tool, and the number of the illustration that shows the tool.

  14. Cluster development in the SA tooling industry

    Directory of Open Access Journals (Sweden)

    Von Leipzig, Konrad

    2015-11-01

    Full Text Available This paper explores the concept of clustering in general, analysing research and experiences in different countries and regions, and summarising factors leading to success or contributing to failure of specific cluster initiatives. Based on this, requirements for the establishment of clusters are summarised. Next, initiatives especially in the South African tool and die making (TDM industry are considered. Through a benchmarking approach, the strengths and weaknesses of individual local tool rooms are analysed, and conclusions are drawn particularly about South African characteristics of the industry. From these results, and from structured interviews with individual tool room owners, difficulties in the establishment of a South African tooling cluster are explored, and specific areas of concern are pointed out.

  15. GVS - GENERAL VISUALIZATION SYSTEM

    Science.gov (United States)

    Keith, S. R.

    1994-01-01

    The primary purpose of GVS (General Visualization System) is to support scientific visualization of data output by the panel method PMARC_12 (inventory number ARC-13362) on the Silicon Graphics Iris computer. GVS allows the user to view PMARC geometries and wakes as wire frames or as light shaded objects. Additionally, geometries can be color shaded according to phenomena such as pressure coefficient or velocity. Screen objects can be interactively translated and/or rotated to permit easy viewing. Keyframe animation is also available for studying unsteady cases. The purpose of scientific visualization is to allow the investigator to gain insight into the phenomena they are examining, therefore GVS emphasizes analysis, not artistic quality. GVS uses existing IRIX 4.0 image processing tools to allow for conversion of SGI RGB files to other formats. GVS is a self-contained program which contains all the necessary interfaces to control interaction with PMARC data. This includes 1) the GVS Tool Box, which supports color histogram analysis, lighting control, rendering control, animation, and positioning, 2) GVS on-line help, which allows the user to access control elements and get information about each control simultaneously, and 3) a limited set of basic GVS data conversion filters, which allows for the display of data requiring simpler data formats. Specialized controls for handling PMARC data include animation and wakes, and visualization of off-body scan volumes. GVS is written in C-language for use on SGI Iris series computers running IRIX. It requires 28Mb of RAM for execution. Two separate hardcopy documents are available for GVS. The basic document price for ARC-13361 includes only the GVS User's Manual, which outlines major features of the program and provides a tutorial on using GVS with PMARC_12 data. Programmers interested in modifying GVS for use with data in formats other than PMARC_12 format may purchase a copy of the draft GVS 3.1 Software Maintenance

  16. Free Access Does Not Necessarily Encourage Practitioners to Use Online Evidence Based Information Tools. A Review of: Buchan, H., Lourey, E., D’Este, C., & Sanson-Fisher, R. (2009). Effectiveness of strategies to encourage general practitioners to accept an offer of free access to online evidence-based information: A randomised controlled trial. Implementation Science, 4, article 68.

    Directory of Open Access Journals (Sweden)

    Heather Ganshorn

    2010-12-01

    Full Text Available Objectives – To determine which strategies were most effective for encouraging general practitioners (GPs) to sign up for free access to an online evidence-based information resource, and to determine whether those who accepted the offer differed in their sociodemographic characteristics from those who did not.
    Design – Descriptive marketing research study.
    Setting – Australia’s public healthcare system.
    Subjects – 14,000 general practitioners (GPs) from all regions of Australia.
    Methods – Subjects were randomly selected by Medicare Australia from its list of GPs that bill it for services. Medicare Australia had 18,262 doctors it deemed eligible; 14,000 of these were selected for a stratified random sample. Subjects were randomized to one of 7 groups of 2,000 each. Each group received a different letter offering two years of free access to BMJ Clinical Evidence, an evidence-based online information tool. Randomization was done electronically, and the seven groups were stratified by age group, gender, and location. The interventions given to each group differed as follows:
    • Group 1: Received a letter offering 2 years of free access, with no further demands on the recipient.
    • Group 2: Received a letter offering 2 years of free access, but on the condition that they complete an initial questionnaire and another one at 12 months, as well as allowing the publisher to provide de-personalized usage data to the researchers.
    • Group 3: Same as Group 2, but with the additional offer of an online tutorial to assist them with using the resource.
    • Group 4: Same as Group 2, but with an additional pamphlet with positive testimonials about the resource from Australian medical opinion leaders.
    • Group 5: Same as Group 2, but with an additional offer of professional development credits towards their required annual totals.
    • Group 6: Same as Group 2, but with an additional offer to be entered to win a prize of $500 towards registration at a

  17. A Generalization of the Alias Matrix

    DEFF Research Database (Denmark)

    Kulahci, Murat; Bisgaard, S.

    2006-01-01

    The investigation of aliases or biases is important for the interpretation of the results from factorial experiments. For two-level fractional factorials this can be facilitated through their group structure. For more general arrays the alias matrix can be used. This tool is traditionally based on the assumption that the error structure is that associated with ordinary least squares. For situations where that is not the case, we provide in this article a generalization of the alias matrix applicable under the generalized least squares assumptions. We also show that for the special case of split plot error structure, the generalized alias matrix simplifies to the ordinary alias matrix.
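
    For orientation, in standard textbook notation (not necessarily the article's): if the fitted model is y = X_1 beta_1 + eps while the true model also contains a term X_2 beta_2, the ordinary least squares estimate is biased according to E[\hat{\beta}_1] = \beta_1 + A \beta_2, with alias matrix

        \[
          A_{\mathrm{OLS}} = (X_1^{\top} X_1)^{-1} X_1^{\top} X_2 ,
        \]

    and under generalized least squares with error covariance \Sigma the natural analogue is

        \[
          A_{\mathrm{GLS}} = (X_1^{\top} \Sigma^{-1} X_1)^{-1} X_1^{\top} \Sigma^{-1} X_2 ,
        \]

    which reduces to the ordinary alias matrix when \Sigma is proportional to the identity; the article's exact expression may differ in notation.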

  18. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the process of analysis is broken down into five main stages, classified under the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool are analyzed.

  19. General Ultrasound Imaging

    Medline Plus

    Full Text Available ... News Physician Resources Professions Site Index A-Z General Ultrasound Ultrasound imaging uses sound waves to produce ... the limitations of General Ultrasound Imaging? What is General Ultrasound Imaging? Ultrasound is safe and painless, and ...

  20. General Nuclear Medicine

    Science.gov (United States)

    Nuclear medicine imaging uses small amounts of radioactive ... Nuclear medicine is a branch of medical imaging ...

  1. California General Plans

    Data.gov (United States)

    California Natural Resource Agency — We undertook the creation of the first-ever seamless statewide General Plan map for California. All county general plans and many city general plans were integrated into 1...

  2. Visual illusion of tool use recalibrates tactile perception

    Science.gov (United States)

    Miller, Luke E.; Longo, Matthew R.; Saygin, Ayse P.

    2018-01-01

    Brief use of a tool recalibrates multisensory representations of the user’s body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and with equal magnitude as physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest visual tool-use signals play a critical role in driving tool embodiment. PMID:28196765

  3. Computing generalized Langevin equations and generalized Fokker-Planck equations.

    Science.gov (United States)

    Darve, Eric; Solomon, Jose; Kia, Amirali

    2009-07-07

    The Mori-Zwanzig formalism is an effective tool to derive differential equations describing the evolution of a small number of resolved variables. In this paper we present its application to the derivation of generalized Langevin equations and generalized non-Markovian Fokker-Planck equations. We show how long-time-scale rates and metastable basins can be extracted from these equations. Numerical algorithms are proposed to discretize these equations. An important aspect is the numerical solution of the orthogonal dynamics equation, which is a partial differential equation in a high-dimensional space. We propose efficient numerical methods to solve this orthogonal dynamics equation. In addition, we present a projection formalism of the Mori-Zwanzig type that is applicable to discrete maps. Numerical applications are presented from the field of Hamiltonian systems.
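
    For orientation (a standard Mori-type form, not necessarily this paper's notation, and with sign conventions that vary across references), a generalized Langevin equation for a resolved variable A(t) reads

        \[
          \frac{dA}{dt} = \Omega\,A(t) - \int_0^t K(s)\,A(t-s)\,ds + F(t),
        \]

    where \Omega is the Markovian drift term, K is the memory kernel generated by the orthogonal dynamics, and F(t) is the fluctuating force, orthogonal to the resolved variable so that \langle F(t)\,A(0)\rangle = 0.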

  4. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
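
    A common one-parameter (Tsallis-type) generalization consistent with this description, shown here for orientation only (the paper's exact parameterization may differ), is

        \[
          \ln_{\tilde q}(x) = \frac{x^{\tilde q} - 1}{\tilde q},
          \qquad
          e_{\tilde q}(x) = (1 + \tilde q\,x)^{1/\tilde q},
        \]

    which recover the ordinary \ln x and e^x in the limit \tilde q \to 0.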

  5. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an engine condition monitoring system. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to a dataset that was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA algorithm in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
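
    Since the record hinges on GA-based search in noisy datasets, here is a minimal sketch of that technique; the population sizes, the bit-string encoding, and the noisy toy fitness function are all illustrative assumptions, not code from the project (which was written in C++).

        # Minimal genetic-algorithm search over bit strings with a noisy
        # toy fitness function; all parameters here are illustrative.
        import random

        POP_SIZE, N_GENES, N_GENERATIONS = 50, 16, 100
        MUTATION_RATE = 0.02

        def fitness(bits):
            # Toy objective plus noise, standing in for a score computed
            # from a noisy engine-parameter dataset.
            return sum(bits) + random.gauss(0, 0.5)

        def select(pop):
            # Tournament selection: best of three random individuals.
            return max(random.sample(pop, 3), key=fitness)

        def crossover(a, b):
            cut = random.randrange(1, N_GENES)
            return a[:cut] + b[cut:]

        def mutate(bits):
            return [1 - g if random.random() < MUTATION_RATE else g for g in bits]

        pop = [[random.randint(0, 1) for _ in range(N_GENES)] for _ in range(POP_SIZE)]
        for _ in range(N_GENERATIONS):
            pop = [mutate(crossover(select(pop), select(pop))) for _ in range(POP_SIZE)]
        print("best individual:", max(pop, key=fitness))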

  6. Seven Basic Tools of Quality Control: An Appropriate Tools for Solving Quality Problems in the Organizations

    OpenAIRE

    Neyestani, Behnam

    2017-01-01

    Dr. Kaoru Ishikawa was the first total quality management guru to be associated with the development and advocacy of the seven quality control (QC) tools for problem solving and process improvement in organizations. The seven old quality control tools are a set of QC tools that can be used to improve the performance of production processes, from the first step of producing a product or service to the last stage of production. So, the general purpose of this paper was to...

  7. New Conceptual Design Tools

    DEFF Research Database (Denmark)

    Pugnale, Alberto; Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    This paper aims to discuss recent approaches in using more and more frequently computer tools as supports for the conceptual design phase of the architectural project. The present state-of-the-art about software as conceptual design tool could be summarized in two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful and effective user-friendly applications in the world of building designers, that are more and more able to fit their specific requirements; on the other hand, some groups of expert users with a basic programming knowledge seem to deal with the problem of software as conceptual design tool by means of 'scripting', in other words by self-developing codes able to solve specific and well defined design problems. Starting with a brief historical recall and the discussion of relevant researches and practical experiences, this paper investigates...

  8. Assembly tool design

    International Nuclear Information System (INIS)

    Kanamori, Naokazu; Nakahira, Masataka; Ohkawa, Yoshinao; Tada, Eisuke; Seki, Masahiro

    1996-06-01

    The reactor core of the International Thermonuclear Experimental Reactor (ITER) is assembled from a number of large and asymmetric components within a tight tolerance in order to assure structural integrity under various loads and to provide tritium confinement. In addition, the assembly procedure should be compatible with remote operation, since the core structures will be activated by 14-MeV neutrons once operation starts and personnel access will then be prohibited. Accordingly, the assembly procedure and tool design are essential and should be developed from the beginning to facilitate remote operation. Under the ITER Design Task Agreement, the Japan Atomic Energy Research Institute (JAERI) has performed a design study to develop the assembly procedures and associated tool design for the ITER tokamak assembly. This report describes outlines of the assembly tools and the remaining issues identified in this design study. (author)

  9. The tools of cooperation and change.

    Science.gov (United States)

    Christensen, Clayton M; Marx, Matt; Stevenson, Howard H

    2006-10-01

    Employers can choose from lots of tools when they want to encourage employees to work together toward a new corporate goal. One of the rarest managerial skills is the ability to understand which tools will work in a given situation and which will misfire. Cooperation tools fall into four major categories: power, management, leadership, and culture. Choosing the right tool, say the authors, requires assessing the organization along two critical dimensions: the extent to which people agree on what they want and the extent to which they agree on cause and effect, or how to get what they want. The authors plot on a matrix where various organizations fall along these two dimensions. Employees represented in the lower-left quadrant of the model, for example, disagree strongly both about what they want and on what actions will produce which results. Those in the upper-right quadrant agree on both dimensions. Different quadrants call for different tools. When employees share little consensus on either dimension, for instance, the only methods that will elicit cooperation are "power tools" such as fiat, force, and threats. Yugoslavia's Josip Broz Tito wielded such devices effectively. So did Jamie Dimon, current CEO of J.P. Morgan Chase, during the bank's integration with Bank One. For employees who agree on what they want but not on how to get it--think of Microsoft in 1995--leadership tools, such as vision statements, are more appropriate. Some leaders are blessed with an instinct for choosing the right tools--Continental Airlines' Gordon Bethune, General Electric's Jack Welch, and IBM's Lou Gerstner are all examples. Others can use this framework to help select the most appropriate tools for their circumstances.

  10. CMS tracker visualization tools

    CERN Document Server

    Zito, G; Osborne, I; Regano, A

    2005-01-01

    This document reviews the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  11. CMS tracker visualization tools

    Energy Technology Data Exchange (ETDEWEB)

    Mennea, M.S. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Osborne, I. [Northeastern University, 360 Huntington Avenue, Boston, MA 02115 (United States); Regano, A. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Zito, G. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy)]. E-mail: giuseppe.zito@ba.infn.it

    2005-08-21

    This document reviews the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  12. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
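
    The similarity measure described here is waveform cross-correlation. A toy sketch of the computation follows; the synthetic traces and the idea of thresholding the peak normalized cross-correlation are illustrative assumptions, not Dendro Tool internals.

        # Pairwise waveform similarity via peak normalized cross-correlation.
        # Synthetic data; not Dendro Tool code.
        import numpy as np

        def max_normalized_xcorr(a, b):
            """Peak of the normalized cross-correlation of two traces."""
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return float(np.max(np.correlate(a, b, mode="full")))

        rng = np.random.default_rng(0)
        e1 = rng.standard_normal(500)
        e2 = np.roll(e1, 25)              # same "event", time-shifted
        e3 = rng.standard_normal(500)     # unrelated event
        events = [e1, e2, e3]

        n = len(events)
        sim = np.array([[max_normalized_xcorr(events[i], events[j])
                         for j in range(n)] for i in range(n)])
        print(np.round(sim, 2))  # e1/e2 correlate strongly; e3 does not
        # Events whose pairwise value exceeds a threshold (e.g. 0.8)
        # would be grouped into the same cluster.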

  13. CMS tracker visualization tools

    International Nuclear Information System (INIS)

    Mennea, M.S.; Osborne, I.; Regano, A.; Zito, G.

    2005-01-01

    This document reviews the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  14. Coach assessment tool

    OpenAIRE

    Härkönen, Niko; Klicznik, Roman

    2014-01-01

    The Coach Assessment Tool was created to assist coaches of all sports in their own development. The starting point for developing the tool is the fact that coaching clinics focus solely on the technical and tactical skills of the sport; coach education does little to teach the importance of the coach's behaviour towards the athletes. The question is how to teach the task at hand properly so as to increase the athlete's performance while taking the coach's behaviour into account. Nevertheless,...

  15. Cancer Data and Statistics Tools

    Science.gov (United States)

    Cancer statistics tools: United States Cancer Statistics – Data Visualizations. The ...

  16. Incident Information Management Tool

    CERN Document Server

    Pejovic, Vladimir

    2015-01-01

    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  17. Hypercard Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  18. Organisational skills and tools.

    Science.gov (United States)

    Wicker, Paul

    2009-04-01

    While this article mainly applies to practitioners who have responsibilities for leading teams or supervising practitioners, many of the skills and tools described here may also apply to students or junior practitioners. The purpose of this article is to highlight some of the main points about organisation, some of the organisational skills and tools that are available, and some examples of how these skills and tools can be used to make practitioners more effective at organising their workload. It is important to realise that organising work and doing work are two completely different things and shouldn't be mixed up. For example, it would be very difficult to start organising work in the middle of a busy operating list: the organisation of the work must come before the work starts and therefore preparation is often an important first step in organising work. As such, some of the tools and skills described in this article may need to be used hours or even days prior to the actual work taking place.

  19. Tools for Authentication

    International Nuclear Information System (INIS)

    White, G.

    2008-01-01

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work

  20. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session in the CHEP'94 conference. All the contributions to methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment.

  1. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  2. [Tools for assisting diagnosis].

    Science.gov (United States)

    Roux, Magali; Asset, Sonya; Medjebar, Samir

    2017-11-01

    Connected objects are revolutionising practices, fulfilling patients' need for autonomy and the need to deploy healthcare provision beyond healthcare facilities. This article illustrates how these tools can be used in the case of epilepsy. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  3. Requirements for enrichment tools

    NARCIS (Netherlands)

    Boer, A.; Winkels, R.; Trompper, M.

    2016-01-01

    This report gives a high level overview of requirements for Enrichment tools in the Openlaws.eu project. Openlaws.eu aims to initiate a platform and develop a vision for Big Open Legal Data (BOLD): an open framework for legislation, case law, and legal literature from across Europe.

  4. C-TOOL

    DEFF Research Database (Denmark)

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John

    2014-01-01

    Soil organic carbon (SOC) is a significant component of the global carbon (C) cycle. Changes in SOC storage affect atmospheric CO2 concentrations on decadal to centennial timescales. The C-TOOL model was developed to simulate farm- and regional-scale effects of management on medium- to long...
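
    C-TOOL itself is a multi-pool model; purely as a one-pool caricature of the dynamics such tools integrate (an illustrative assumption, not the C-TOOL equations), the SOC stock C(t) under a management-driven carbon input I(t) and first-order decomposition rate k obeys

        \[
          \frac{dC}{dt} = I(t) - k\,C(t),
        \]

    so a sustained change in input shifts the steady-state stock C* = I/k, which is the sense in which management alters SOC storage on decadal to centennial timescales.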

  5. Tools for Climate Services

    Science.gov (United States)

    Hartmann, H. C.

    2007-05-01

    Full realization of the socio-economic benefits from public investments in climate services remains incomplete because decision makers have difficulty: 1) interpreting individual products, 2) appropriately judging information credibility, and 3) linking different types of information, both conceptually and practically. Addressing these barriers is as important as improving the science leading to improved information. The challenge is creating flexible climate information products and tools that can accommodate unique user needs; the goal is a systemic change in the nature of information delivery and use. The underlying assumption is not that climate information is good and useful, and simply needs to be communicated effectively. Rather, a number of conditions must be met before decision makers can make informed choices about whether to use particular information in a specific situation. Several case studies, of varying success, illustrate user-centric strategies for developing decision support tools: a forecast evaluation tool, a climate information management system, and a hydrologic alert system. However, tools alone will not bridge the barriers to climate services; training and other capacity-building activities remain important.

  6. Sight Application Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  7. Tool for allocating river water

    NARCIS (Netherlands)

    Hellegers, P.J.G.J.

    2011-01-01

    In the catchment of the Inkomati River in southern Africa, parties from three countries lay claim to the river water. LEI, Alterra and the consultancy WaterWatch, together with local partners, developed a tool that shows stakeholders what a change in land use means for the availability of that water.

  8. The science writing tool

    Science.gov (United States)

    Schuhart, Arthur L.

    This is a two-part dissertation. The primary part is the text of a science-based composition rhetoric and reader called The Science Writing Tool. This textbook has seven chapters dealing with topics in Science Rhetoric. Each chapter includes a variety of examples of science writing, discussion questions, writing assignments, and instructional resources. The purpose of this text is to introduce lower-division college science majors to the role that rhetoric and communication play in the conduct of Science, and how these skills contribute to a successful career in Science. The text is designed as a "tool kit" for use by an instructor constructing a science-based composition course or a writing-intensive Science course. The second part of this dissertation reports on student reactions to draft portions of The Science Writing Tool text. In this report, students of English Composition II at Northern Virginia Community College-Annandale were surveyed about their attitudes toward course materials and topics included. The findings were used to revise and expand The Science Writing Tool.

  9. Big Data Visualization Tools

    OpenAIRE

    Bikakis, Nikos

    2018-01-01

    Data visualization is the presentation of data in a pictorial or graphical format, and a data visualization tool is the software that generates this presentation. Data visualization provides users with intuitive means to interactively explore and analyze data, enabling them to effectively identify interesting patterns, infer correlations and causalities, and supports sense-making activities.

  10. Rapid Tooling via Stereolithography

    OpenAIRE

    Montgomery, Eva

    2006-01-01

    Approximately three years ago, composite stereolithography (SL) resins were introduced to the marketplace, offering performance features beyond what traditional SL resins could offer. In particular, the high heat deflection temperatures and high stiffness of these highly filled resins have opened the door to several new rapid prototyping (RP) applications, including wind tunnel test modelling and, more recently, rapid tooling.

  11. Tools of the Trade

    Science.gov (United States)

    Porterfield, Kitty; Carnes, Meg

    2010-01-01

    People have known principals who are intellectual giants on issues of instruction and who have a great love of children but who stumble in leadership roles because they either do not or cannot communicate what they know. Good communication skills are among the most important tools a leader carries in his or her toolbox. Not only does good…

  12. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.:
    - What type of study needs to be carried out?
    - What phenomena need to be modeled?
    This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met:
    - An auditable quality assurance process complying with international development standards shall be developed and maintained.
    - A process of verification and validation (V and V) shall be implemented. This approach requires writing a report and/or executive summary of the V and V activities and defining a validated domain (the domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use).
    - Sufficient documentation shall be available.
    - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available.
    - Source codes corresponding to the software shall be archived appropriately.
    When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that:
    - the computer architecture of the tool does not include errors,
    - the numerical solver correctly represents the physical mathematical model,
    - equations are solved correctly.
    The functional verification can be demonstrated through certification or a report of quality assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  13. Generalized Cartan Calculus in general dimension

    Science.gov (United States)

    Wang, Yi-Nan

    2015-07-01

    We develop the generalized Cartan calculus for the groups SL(2) × ℝ+, SL(5), and SO(5,5). These are the underlying algebraic structures of d = 9, 7, 6 exceptional field theory, respectively. The algebraic identities are needed for the "tensor hierarchy" structure in exceptional field theory. The validity of Poincaré lemmas in this new differential geometry is also discussed. Finally, we explore some possible extensions of the generalized Cartan calculus beyond the exceptional series.
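
    For orientation, the classical identity that any generalized Cartan calculus must reproduce is Cartan's magic formula relating the Lie derivative, exterior derivative and interior product,

        \[
          \mathcal{L}_X = d \circ \iota_X + \iota_X \circ d,
        \]

    together with d^2 = 0, which is what underlies the Poincaré lemmas mentioned in the abstract.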

  14. RISK COMMUNICATION IN ACTION: THE TOOLS OF MESSAGE MAPPING

    Science.gov (United States)

    Risk Communication in Action: The Tools of Message Mapping is a workbook designed to guide risk communicators in crisis situations. The first part of the workbook reviews general guidelines for risk communication. The second part focuses on one of the most robust tools o...

  15. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts to obtain useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market have to offer management, based on the growing need to understand and predict global market trends.

  16. Extended Testability Analysis Tool

    Science.gov (United States)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, as well as the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies of sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
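
    All of the reports listed above derive from a test-versus-failure-mode detection matrix. The sketch below shows the two simplest derived quantities, detectability and pairwise isolability; the matrix and its values are invented for illustration and are not TEAMS or ETA Tool data structures.

        # Toy detectability/isolability over a detection matrix where
        # D[i][j] = 1 if test j detects failure mode i. Data invented.
        import numpy as np

        D = np.array([[1, 0, 1],
                      [1, 0, 0],
                      [1, 0, 0],
                      [0, 0, 0]])   # mode 3 is undetectable

        print("detectable modes:", np.where(D.any(axis=1))[0])

        # Two failure modes are isolable iff their test signatures differ.
        for i in range(D.shape[0]):
            for j in range(i + 1, D.shape[0]):
                if np.array_equal(D[i], D[j]):
                    print(f"modes {i} and {j} cannot be isolated from each other")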

  17. Spray-formed tooling

    Science.gov (United States)

    McHugh, K. M.; Key, J. F.

    The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.

  18. Developing new chemical tools for solvent extraction

    International Nuclear Information System (INIS)

    Moyer, B.A.; Baes, C.F.; Burns, J.H.; Case, G.N.; Sachleben, R.A.; Bryan, S.A.; Lumetta, G.J.; McDowell, W.J.; Sachleben, R.A.

    1993-01-01

    Prospects for innovation and for greater technological impact in the field of solvent extraction (SX) seem as bright as ever, despite the maturation of SX as an economically significant separation method and as an important technique in the laboratory. New industrial, environmental, and analytical problems provide compelling motivation for diversifying the application of SX, developing new solvent systems, and seeking improved properties. Toward this end, basic research must be dedicated to enhancing the tools of SX: physical tools for probing the basis of extraction and molecular tools for developing new SX chemistries. In this paper, the authors describe their progress in developing and applying the general tools of equilibrium analysis and of ion recognition in SX. Nearly half a century after the field of SX began in earnest, coordination chemistry continues to provide the impetus for important advancements in understanding SX systems and in controlling SX chemistry. In particular, the physical tools of equilibrium analysis, X-ray crystallography, and spectroscopy are elucidating the molecular basis of SX in unprecedented detail. Moreover, the principles of ion recognition are providing the molecular tools with which to achieve new selectivities and new applications

  19. Remote tool development for nuclear dismantling operations

    International Nuclear Information System (INIS)

    Craig, G.; Ferlay, J.C.; Ieracitano, F.

    2003-01-01

    Remote tool systems to undertake nuclear dismantling operations require careful design and development, not only to perform their given duty but to perform it safely within the constraints imposed by harsh environmental conditions. Framatome ANP NUCLEAR SERVICES has for a long time developed and qualified equipment to undertake specific maintenance operations of nuclear reactors. The tool development methodology from this activity has since been adapted to resolve some very challenging reactor dismantling operations, which are demonstrated in this paper. Each nuclear decommissioning project is a unique case, and technical characterisation data are generally incomplete. The development of the dismantling methodology and associated equipment is by and large an iterative process combining design and simulation with feasibility and validation testing. The first stage of the development process involves feasibility testing of industrial tools and examining the adaptations necessary to control and deploy the tool remotely with respect to the chosen methodology and environmental constraints. This results in a prototype tool and deployment system to validate the basic process. The second stage involves detailed design which integrates any remaining technical and environmental constraints. At the end of this stage, tools and deployment systems, operators and operating procedures are qualified on full-scale mock-ups. (authors)

  20. On generalized operator quasi-equilibrium problems

    Science.gov (United States)

    Kum, Sangho; Kim, Won Kyu

    2008-09-01

    In this paper, we will introduce the generalized operator equilibrium problem and the generalized operator quasi-equilibrium problem, which generalize the operator equilibrium problem due to Kazmi and Raouf [K.R. Kazmi, A. Raouf, A class of operator equilibrium problems, J. Math. Anal. Appl. 308 (2005) 554-564] to multi-valued and quasi-equilibrium problems. Using a Fan-Browder type fixed point theorem in [S. Park, Foundations of the KKM theory via coincidences of composites of upper semicontinuous maps, J. Korean Math. Soc. 31 (1994) 493-519] and an existence theorem of equilibrium for a 1-person game in [X.-P. Ding, W.K. Kim, K.-K. Tan, Equilibria of non-compact generalized games with L*-majorized preferences, J. Math. Anal. Appl. 164 (1992) 508-517] as basic tools, we prove new existence theorems for the generalized operator equilibrium problem and the generalized operator quasi-equilibrium problem, which include operator equilibrium problems.

  1. Nutrition screening tools: Does one size fit all? A systematic review of screening tools for the hospital setting

    NARCIS (Netherlands)

    van Bokhorst-de van der Schueren, M.A.E.; Guaitoli, P.R.; Jansma, E.P.; de Vet, H.C.W.

    2014-01-01

    Background & aims: Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. Methods: A systematic review of

  2. Keeping you safe by making machine tools safe

    CERN Multimedia

    2012-01-01

    CERN’s third safety objective for 2012 concerns the safety of equipment - and machine tools in particular. There are three prerequisites for ensuring that a machine tool can be used safely: (1) the machine tool must comply with Directive 2009/104/EC, (2) the layout of the workshop must be compliant, and (3) everyone who uses the machine tool must be trained. Provided these conditions are met, the workshop head can grant authorisation to use the machine tool. To fulfil this objective, an inventory of the machine tools must be drawn up and the people responsible for them identified. The HSE Unit's Safety Inspection Service produces compliance reports for the machine tools. In order to meet the third objective set by the Director-General, the section has doubled its capacity to carry out inspections: ...

  3. Teachers' Understanding of Algebraic Generalization

    Science.gov (United States)

    Hawthorne, Casey Wayne

    conceptualizations of the symbols. Finally, by comparing two teachers' understandings of student thinking in the classroom, I developed an instructional trajectory to describe steps along students' generalization processes. This emergent framework serves as an instructional tool for teachers' use in identifying significant connections in supporting students to develop understanding of algebraic symbols as representations that communicate the quantities perceived in the figure.

  4. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Abstract. Bollywood, or the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but also from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, and pubs and discothèques. A great deal of manpower, man-hours, creative talent, and money is put into building a quality feature film. Bollywood is an industry that continuously works towards providing the 7 billion population with something new. It is therefore important for the movie and production team to stand out and grab the attention of the maximum audience. Movie makers employ various tools and techniques today to market their movies, leaving no stone unturned. They roll out teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, movie releases, post-release follow-ups, and so on to pull viewers to the cineplex. Today's audience, which consists mainly of youth, wants photos, videos, meet-ups, gossip, debate, collaboration, and content creation. These requirements are best fulfilled through digital platforms. However, traditional media such as newspapers, radio, and television are not old school: they reach a mass audience and play a major role in effective marketing. This study aims to analyse these tools for their effectiveness. The objectives are fulfilled through a consumer survey. The study brings out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques such as factor analysis and statistical techniques such as the chi-square test, with data visualization using pie charts.
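
    As an illustration of the statistical step described above, a chi-square test of independence can check whether perceived effectiveness depends on the marketing tool. A minimal Python sketch with invented survey counts (scipy assumed; the study's actual data and categories are not reproduced here):

        # Hypothetical cross-tabulation: rows = marketing tool, columns =
        # respondents who did / did not find it effective. Counts are invented.
        import numpy as np
        from scipy.stats import chi2_contingency

        observed = np.array([
            [120,  80],   # theatrical trailer
            [ 90, 110],   # city tour
            [150,  50],   # social media teaser
        ])

        chi2, p, dof, expected = chi2_contingency(observed)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
        # A small p suggests effectiveness is not independent of the tool used.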

  5. Web Tools: The Second Generation

    Science.gov (United States)

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…

  6. Pneumatic soil removal tool

    International Nuclear Information System (INIS)

    Neuhaus, J.E.

    1992-01-01

    A soil removal tool is provided for removing radioactive soil, rock and other debris from the bottom of an excavation, while permitting the operator to be located outside of a containment for that excavation. The tool includes a fixed jaw, secured to one end of an elongate pipe, which cooperates with a movable jaw pivotably mounted on the pipe. Movement of the movable jaw is controlled by a pneumatic cylinder mounted on the pipe. The actuator rod of the pneumatic cylinder is connected to a collar which is slidably mounted on the pipe and forms part of the pivotable mounting assembly for the movable jaw. Air is supplied to the pneumatic cylinder through a handle connected to the pipe, under the control of an actuator valve mounted on the handle, to provide movement of the movable jaw. 3 figs

  7. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes: a building component library configured to store a plurality of building components; a modeling tool configured to access the library and create a model of a building under analysis using building spatial data and selected components from the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the optimized energy models against the baseline model and generate recommendations for substitute building components or modifications.
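
    A minimal sketch of the data flow described above, with invented class names and a crude percentage-based model of an energy conservation measure (none of this is the patent's actual API):

        from dataclasses import dataclass

        @dataclass
        class BuildingComponent:
            name: str
            annual_kwh: float  # baseline energy use attributed to this component

        def apply_ecm(components, savings_fraction):
            """Apply an energy conservation measure as a uniform fractional saving."""
            return [BuildingComponent(c.name, c.annual_kwh * (1.0 - savings_fraction))
                    for c in components]

        library = [BuildingComponent("HVAC", 52000.0), BuildingComponent("lighting", 18000.0)]
        baseline = sum(c.annual_kwh for c in library)
        optimized = sum(c.annual_kwh for c in apply_ecm(library, 0.15))
        print(f"baseline {baseline:.0f} kWh/yr -> optimized {optimized:.0f} kWh/yr")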

  8. Remote vehicle survey tool

    International Nuclear Information System (INIS)

    Armstrong, G.A.; Burks, B.L.; Kress, R.L.; Wagner, D.G.; Ward, C.R.

    1993-01-01

    The Remote Vehicle Survey Tool (RVST) is a color graphical display tool for viewing remotely acquired scientific data. The RVST displays the data in the form of a color two-dimensional world model map. The world model map allows movement of the remote vehicle to be tracked by the operator and the data from sensors to be graphically depicted in the interface. Linear and logarithmic meters, dual-channel oscilloscopes, and directional compasses are used to display sensor information. The RVST is user-configurable through ASCII text files. The operator can configure the RVST to work with any remote data acquisition system and teleoperated or autonomous vehicle. The modular design of the RVST and its ability to be quickly configured for varying system requirements make the RVST ideal for remote scientific data display in all environmental restoration and waste management programs.

  9. Pneumatic soil removal tool

    Science.gov (United States)

    Neuhaus, John E.

    1992-01-01

    A soil removal tool is provided for removing radioactive soil, rock and other debris from the bottom of an excavation, while permitting the operator to be located outside of a containment for that excavation. The tool includes a fixed jaw, secured to one end of an elongate pipe, which cooperates with a movable jaw pivotably mounted on the pipe. Movement of the movable jaw is controlled by a pneumatic cylinder mounted on the pipe. The actuator rod of the pneumatic cylinder is connected to a collar which is slidably mounted on the pipe and forms part of the pivotable mounting assembly for the movable jaw. Air is supplied to the pneumatic cylinder through a handle connected to the pipe, under the control of an actuator valve mounted on the handle, to provide movement of the movable jaw.

  10. Communication tools in Canada

    International Nuclear Information System (INIS)

    Cowper, D.

    1995-01-01

    This document deals with the means and tools that are used for communicating with elected representatives. First, messages need to be simple, few in number, and accurate. It is also advised to seek a first briefing when there is information to be given, not when help is needed. In addition, contacts should be made often enough to assure continued interest. (TEC)

  11. The GEDI Performance Tool

    Science.gov (United States)

    Hancock, S.; Armston, J.; Tang, H.; Patterson, P. L.; Healey, S. P.; Marselis, S.; Duncanson, L.; Hofton, M. A.; Kellner, J. R.; Luthcke, S. B.; Sun, X.; Blair, J. B.; Dubayah, R.

    2017-12-01

    NASA's Global Ecosystem Dynamics Investigation (GEDI) will mount a multi-track, full-waveform lidar on the International Space Station (ISS) that is optimised for the measurement of forest canopy height and structure. GEDI will use ten laser tracks, two 10 mJ "power beams" and eight 5 mJ "coverage beams", to produce global (51.5°S to 51.5°N) maps of above-ground biomass (AGB), canopy height, vegetation structure and other biophysical parameters. The mission has a requirement to generate a 1 km AGB map with 80% of pixels having ≤ 20% standard error or 20 Mg·ha⁻¹, whichever is greater. To assess performance against mission requirements, an end-to-end simulator has been developed. The simulator brings together tools to propagate the effects of measurement and sampling error on GEDI data products. It allows us to evaluate the impact of instrument performance, ISS orbits, processing algorithms and losses of data that may occur due to clouds, snow, leaf-off conditions, and areas with an insufficient signal-to-noise ratio (SNR). By evaluating the consequences of operational decisions on GEDI data products, this tool provides a quantitative framework for decision-making and mission planning. Here we demonstrate the performance tool by using it to evaluate the trade-off between measurement and sampling error on the 1 km AGB data product. Results demonstrate that the use of coverage beams during the day (the lowest GEDI SNR case) over very dense forests (>95% canopy cover) will result in some measurement bias. Omitting these low-SNR cases increased the sampling error. Through this an SNR threshold for a given expected canopy cover can be set. Other applications of the performance tool are also discussed, such as assessing the impact of decisions made in the AGB modelling and signal processing stages on the accuracy of final data products.

  12. Program Management Tool

    Science.gov (United States)

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil

    2007-01-01

    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, monthly reports, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Application (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity

  13. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  14. MICROCONTROLLER PIN CONFIGURATION TOOL

    OpenAIRE

    Bhaskar Joshi; F. Mohammed Rizwan; Dr. Rajashree Shettar

    2012-01-01

    Configuring a microcontroller with a large number of pins is tedious. The latest Infineon microcontrollers contain more than 200 pins, and each pin has classes of signals, so the complexity of the microcontroller is growing. Configuration involves looking into thousands of pages of user manuals, and it can take a user days to configure the microcontroller with the peripherals. We need an automated tool to configure the microcontroller so that the user can configure the microcontroller without having ...

  15. Generalized convexity, generalized monotonicity recent results

    CERN Document Server

    Martinez-Legaz, Juan-Enrique; Volle, Michel

    1998-01-01

    A function is convex if its epigraph is convex. This geometrical structure has very strong implications in terms of continuity and differentiability. Separation theorems lead to optimality conditions and duality for convex problems. A function is quasiconvex if its lower level sets are convex. Here again, the geometrical structure of the level sets implies some continuity and differentiability properties for quasiconvex functions. Optimality conditions and duality can be derived for optimization problems involving such functions as well. Over a period of about fifty years, quasiconvex and other generalized convex functions have been considered in a variety of fields including economics, management science, engineering, probability and applied sciences in accordance with the needs of particular applications. During the last twenty-five years, an increase of research activities in this field has been witnessed. More recently generalized monotonicity of maps has been studied. It relates to generalized convexity...

  16. Teaching Syllogistics Using E-learning Tools

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Sandborg-Petersen, Ulrik; Thorvaldsen, Steinar

    2016-01-01

    This paper is a study of various strategies for teaching syllogistics as part of a course in basic logic. It is a continuation of earlier studies involving practical experiments with students of Communication using the Syllog system, which makes it possible to develop e-learning tools and to do learning analytics based on log-data. The aim of the present paper is to investigate whether the Syllog e-learning tools can be helpful in logic teaching in order to obtain a better understanding of logic and argumentation in general and syllogisms in particular. Four versions of a course in basic logic involving different teaching methods will be compared.

  17. Dynamic principle for ensemble control tools.

    Science.gov (United States)

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
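
    As a concrete point of reference, one standard member of the family of tools discussed here is the deterministic Nosé-Hoover thermostat; for a one-dimensional particle of mass m coupled to a bath at temperature T it reads (textbook form, not the authors' general principle):

        \dot{q} = \frac{p}{m}, \qquad
        \dot{p} = F(q) - \xi\,p, \qquad
        \dot{\xi} = \frac{1}{Q}\left(\frac{p^{2}}{m} - k_{B}T\right),

    where \xi is the auxiliary thermostat variable and Q its coupling parameter; Langevin and Nosé-Hoover-Langevin schemes add stochastic forcing to these equations, and the dynamic principle proposed in the paper is designed to cover both kinds.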

  18. Fluid sampling tool

    Science.gov (United States)

    Garcia, A.R.; Johnston, R.G.; Martinez, R.K.

    1999-05-25

    A fluid sampling tool is described for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall. 6 figs.

  19. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
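
    A minimal sketch of the per-event calculation such a tool automates, in the spirit of the BAL-003-1 frequency response measure (MW per 0.1 Hz); field names, sign conventions, and the simple averaging below are simplifying assumptions, not the FRAT's exact algorithm:

        def frequency_response(p_pre_mw, p_post_mw, f_pre_hz, f_post_hz):
            """Frequency response of one under-frequency event, in MW per 0.1 Hz."""
            delta_p = p_post_mw - p_pre_mw      # change in MW output/interchange
            delta_f = f_post_hz - f_pre_hz      # e.g. 59.95 - 60.00 = -0.05 Hz
            return delta_p / (delta_f / 0.1)

        # (pre MW, post MW, pre Hz, post Hz) for two hypothetical events
        events = [(1200.0, 1250.0, 60.00, 59.95), (980.0, 1010.0, 60.00, 59.97)]
        responses = [frequency_response(*e) for e in events]
        print(f"baseline: {sum(responses) / len(responses):.1f} MW/0.1 Hz")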

  20. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers.While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
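
    A schematic of the hybrid cascading-outage loop described above. The dynamic and steady-state solver calls are stubbed as hypothetical helpers; the actual DCAT drives the PSS/E API, which is not reproduced here:

        def run_dynamics(outages):
            """Stub for the fast dynamic simulation of the initiating disturbance."""
            return ["line_12_overload_trip"]   # hypothetical protection actions

        def run_steady_state(outages):
            """Stub for the slower steady-state (power flow) stage."""
            return []                          # no further trips -> cascade settles

        def simulate_cascade(initiating_events, max_tiers=10):
            outages, history = list(initiating_events), []
            for tier in range(max_tiers):
                trips = run_dynamics(outages) if tier == 0 else run_steady_state(outages)
                if not trips:                  # sequence has settled
                    break
                history.append((tier, trips))
                outages.extend(trips)
            return history

        print(simulate_cascade(["bus_fault_A"]))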

  1. Fluid sampling tool

    Science.gov (United States)

    Garcia, Anthony R.; Johnston, Roger G.; Martinez, Ronald K.

    1999-05-25

    A fluid sampling tool for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall.

  2. Generalized symmetry algebras

    International Nuclear Information System (INIS)

    Dragon, N.

    1979-01-01

    The possible use of trilinear algebras as symmetry algebras for para-Fermi fields is investigated. The shortcomings of the examples are argued to be a general feature of such generalized algebras. (author)

  3. Academy of General Dentistry

    Science.gov (United States)

    News items: General Dentistry and American Family Physician Collaborate to Examine Oral Systemic Health (Nov 14, 2017); Academy of General Dentistry Foundation Celebrates 45 Years Raising Awareness for Oral Health (Oct 23, 2017).

  4. Generalized quantum groups

    International Nuclear Information System (INIS)

    Leivo, H.P.

    1992-01-01

    The algebraic approach to quantum groups is generalized to include what may be called an anyonic symmetry, reflecting the appearance of phases more general than ±1 under transposition. (author). 6 refs

  5. General Ultrasound Imaging

    Medline Plus

    What is General Ultrasound Imaging? What are the limitations of General Ultrasound Imaging? What are some common uses of the procedure? Ultrasound is safe and ... be heard with every heartbeat. ...

  6. Delphi General Ledger -

    Data.gov (United States)

    Department of Transportation — Delphi general ledger contains, but is not limited to, the following data elements: the United States Standard General Ledger (USSGL) chart of accounts; stores actual,...

  7. Generalized hypergeometric coherent states

    International Nuclear Information System (INIS)

    Appl, Thomas; Schiller, Diethard H

    2004-01-01

    We introduce a large class of holomorphic quantum states by choosing their normalization functions to be given by generalized hypergeometric functions. We call them generalized hypergeometric states in general, and generalized hypergeometric coherent states in particular, if they allow a resolution of unity. Depending on the domain of convergence of the generalized hypergeometric functions, we distinguish generalized hypergeometric states on the plane, the open unit disc and the unit circle. All states are eigenstates of suitably defined lowering operators. We then study their photon number statistics and phase properties as revealed by the Husimi and Pegg-Barnett phase distributions. On the basis of the generalized hypergeometric coherent states we introduce new analytic representations of arbitrary quantum states in Bargmann and Hardy spaces as well as generalized hypergeometric Husimi distributions and corresponding phase distributions
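
    For reference, the normalization functions in question are the standard generalized hypergeometric series (textbook definition, with (a)_n the Pochhammer symbol; the notation is not specific to this paper):

        {}_{p}F_{q}(a_{1},\dots,a_{p};\,b_{1},\dots,b_{q};\,z)
            = \sum_{n=0}^{\infty}
              \frac{(a_{1})_{n}\cdots(a_{p})_{n}}{(b_{1})_{n}\cdots(b_{q})_{n}}\,
              \frac{z^{n}}{n!},
        \qquad (a)_{n} = a(a+1)\cdots(a+n-1),

    whose domain of convergence (the whole complex plane, the open unit disc, or the unit circle) depends on p and q, matching the three families of states distinguished above.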

  8. A Tool for Simulating Rotating Coil Magnetometers

    CERN Document Server

    Bottura, L; Schnizer, P; Smirnov, N

    2002-01-01

    When investigating the quality of a magnetic measurement system, one observes difficulties in identifying the "trouble maker" of such a system, as different effects can yield similar influences on the measurement results. We describe a tool in this paper that allows one to investigate numerically the effects produced by different imperfections of components of such a system, including, but not limited to, vibration and movements of the rotating coil, influence of electrical noise on the system, and angular encoder imperfections. This system can simulate the deterministic and stochastic parts of those imperfections. We outline the physical models used, which are generally based on experience or first principles. Comparisons to analytical results are shown. The modular structure of the general design of this tool permits the inclusion of new modules for new devices and effects.

  9. Generalized Fourier transforms classes

    DEFF Research Database (Denmark)

    Berntsen, Svend; Møller, Steen

    2002-01-01

    The Fourier class of integral transforms with kernels $B(\omega r)$ has by definition inverse transforms with kernel $B(-\omega r)$. The space of such transforms is explicitly constructed. A slightly more general class of generalized Fourier transforms is introduced. From the general theory it follows that integral transforms with kernels which are products of a Bessel and a Hankel function, or which are of a certain general hypergeometric type, have inverse transforms of the same structure.
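
    Schematically, suppressing normalization and function-space details, the defining property of the class is a transform pair on the half-line of the form:

        \tilde{f}(\omega) = \int_{0}^{\infty} B(\omega r)\, f(r)\, dr,
        \qquad
        f(r) = \int_{0}^{\infty} B(-\omega r)\, \tilde{f}(\omega)\, d\omega,

    i.e. the inverse uses the same kernel with the sign of the argument reversed, as in the classical Fourier transform with B(x) proportional to e^{-ix}.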

  10. Generalized Fourier transforms classes

    DEFF Research Database (Denmark)

    Berntsen, Svend; Møller, Steen

    2002-01-01

    The Fourier class of integral transforms with kernels $B(\omega r)$ has by definition inverse transforms with kernel $B(-\omega r)$. The space of such transforms is explicitly constructed. A slightly more general class of generalized Fourier transforms is introduced. From the general theory...

  11. Forces in General Relativity

    Science.gov (United States)

    Ridgely, Charles T.

    2010-01-01

    Many textbooks dealing with general relativity do not demonstrate the derivation of forces in enough detail. The analyses presented herein demonstrate straightforward methods for computing forces by way of general relativity. Covariant divergence of the stress-energy-momentum tensor is used to derive a general expression of the force experienced…
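
    The identity at the heart of such derivations is the vanishing covariant divergence of the total stress-energy-momentum tensor. Splitting the tensor into a piece for the subsystem of interest and a piece for everything else gives a force density (a standard sketch of the approach, not the article's specific worked examples):

        \nabla_{\mu} T^{\mu\nu}_{\mathrm{total}} = 0
        \quad\Longrightarrow\quad
        f^{\nu} \equiv \nabla_{\mu} T^{\mu\nu}_{\mathrm{subsystem}}
                = -\,\nabla_{\mu} T^{\mu\nu}_{\mathrm{rest}},

    so the four-force density acting on the subsystem can be computed directly from the covariant divergence of its stress-energy-momentum tensor.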

  12. General Ultrasound Imaging

    Medline Plus

    ... be necessary. Your doctor will explain the exact reason why another exam is requested. Sometimes a follow-up ... Ultrasound provides real-time imaging, making it a good tool for guiding minimally invasive procedures such as ...

  13. Tools for Trigger Aware Analyses in ATLAS

    CERN Document Server

    Krasznahorkay, A; The ATLAS collaboration; Stelzer, J

    2010-01-01

    In order to search for rare processes, all four LHC experiments have to use advanced triggering methods for selecting and recording the events of interest. At the expected nominal LHC operating conditions only about 0.0005% of the collision events can be kept for physics analysis in ATLAS. Therefore the understanding and evaluation of the trigger performance is one of the most crucial parts of any physics analysis. ATLAS’s first level trigger is composed of custom-built hardware, while the second and third levels are implemented using regular PCs running reconstruction and selection algorithms. Because of this split, accessing the results of the trigger execution for the two stages is different. The complexity of the software trigger presents further difficulties in accessing the trigger data. To make the job of the physicists easier when evaluating the trigger performance, multiple general-use tools are provided by the ATLAS Trigger Analysis Tools group. The TrigDecisionTool, a general tool, is provided to...

  14. Reuse Tools to Support ADA Instantiation Construction

    Science.gov (United States)

    1990-06-01

    specification and body with embedded task shell instantiations, as well as an inter-task coordination procedure which controls task activation, execution, and ... which is the root of a frame hierarchy. The specification frame controls the hierarchy’s composition of the program and stores all its custom...

  15. Ceramic cutting tools materials, development and performance

    CERN Document Server

    Whitney, E Dow

    1994-01-01

    Interest in ceramics as a high-speed cutting tool material is based primarily on favorable material properties. As a class of materials, ceramics possess high melting points, excellent hardness and good wear resistance. Unlike most metals, hardness levels in ceramics generally remain high at elevated temperatures, which means that cutting tip integrity is relatively unaffected at high cutting speeds. Ceramics are also chemically inert against most work metals.

  16. Oxygen saturation after bronchography under general ...

    African Journals Online (AJOL)

    Thirty-six patients undergoing bronchography or bronchoscopy under general anaesthesia were continuously monitored by pulse oximetry for 5 hours after these procedures. Significant falls in oxygen saturation were observed in the first hour and were of most clinical relevance in patients with preexisting pulmonary ...

  17. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    Science.gov (United States)

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

    Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch of the surgical site, surgeons may be prone to exert larger forces and cause tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors on the surgical tools. This sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder the normal function of the tools; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method by estimating tool-tissue interaction forces using driving motors' current, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot which is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, thereby increasing surgical efficiency and safety.
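
    A minimal sketch of the sensorless idea: map measured motor current to joint torque through a motor and transmission model, then to jaw force through the tool geometry. All constants here are invented placeholders, not the paper's calibrated 3-DOF grasper model:

        K_T = 0.021         # motor torque constant, N*m/A (hypothetical)
        GEAR_RATIO = 20.0   # transmission ratio (hypothetical)
        ETA = 0.85          # transmission efficiency (hypothetical)
        MOMENT_ARM = 0.012  # effective jaw moment arm, m (hypothetical)

        def grasp_force_from_current(current_a, friction_torque=0.004):
            """Estimate jaw force (N) from measured motor current (A)."""
            joint_torque = K_T * current_a * GEAR_RATIO * ETA - friction_torque
            return max(joint_torque, 0.0) / MOMENT_ARM

        for i_a in (0.2, 0.5, 0.8):
            print(f"{i_a:.1f} A -> {grasp_force_from_current(i_a):.1f} N")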

  18. C++ software quality in the ATLAS experiment: tools and experience

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00236968; The ATLAS collaboration; Kluth, Stefan; Seuster, Rolf; Snyder, Scott; Obreshkov, Emil; Roe, Shaun; Sherwood, Peter; Stewart, Graeme

    2017-01-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  19. C++ software quality in the ATLAS experiment: tools and experience

    Science.gov (United States)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  20. C++ Software Quality in the ATLAS Experiment: Tools and Experience

    CERN Document Server

    Kluth, Stefan; The ATLAS collaboration; Obreshkov, Emil; Roe, Shaun; Seuster, Rolf; Snyder, Scott; Stewart, Graeme

    2016-01-01

    The ATLAS experiment at CERN uses about six million lines of code and currently has about 420 developers whose background is largely from physics. In this paper we explain how the C++ code quality is managed using a range of tools from compile-time through to run time testing and reflect on the great progress made in the last year largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other tools including cppcheck, Include-What-You-Use and run-time 'sanitizers' are also discussed.

  1. The Resilience of Analog Tools in Creative Work Practices

    DEFF Research Database (Denmark)

    Borum, Nanna; Petersson, Eva; Frimodt-Møller, Søren

    2014-01-01

    This paper discusses the use of digital and analog tools, respectively, in a creative industry. The research was done within the EU-funded research project IdeaGarden, which explores digital platforms for creative collaboration. The findings in a case study of LEGO® Future Lab, one of LEGO Group's largest innovation departments, show a preference for analog tools over digital ones in the creative process. This points towards a general need for tangible tools in the creative work process, a need that has consequences for the development of new digital tools for creative collaboration.

  2. On a Generalized Hankel Type Convolution of Generalized Functions

    Indian Academy of Sciences (India)

    Generalized Hankel type transformation; Parseval relation; generalized ... The classical generalized Hankel type convolution is defined and extended to a class of generalized functions. ...

  3. Generally covariant gauge theories

    International Nuclear Information System (INIS)

    Capovilla, R.

    1992-01-01

    A new class of generally covariant gauge theories in four space-time dimensions is investigated. The field variables are taken to be a Lie algebra valued connection 1-form and a scalar density. Modulo an important degeneracy, complex [euclidean] vacuum general relativity corresponds to a special case in this class. A canonical analysis of the generally covariant gauge theories with the same gauge group as general relativity shows that they describe two degrees of freedom per space point, qualifying therefore as a new set of neighbors of general relativity. The modification of the algebra of the constraints with respect to the general relativity case is computed; this is used in addressing the question of how general relativity stands out from its neighbors. (orig.)

  4. ASTROS: A multidisciplinary automated structural design tool

    Science.gov (United States)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  5. Three novel software tools for ASDEX Upgrade

    International Nuclear Information System (INIS)

    Martinov, S.; Löbhard, T.; Lunt, T.; Behler, K.; Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A.; Lüddecke, K.; Merkel, R.; Neu, G.; ASDEX Upgrade Team; MPCDF Garching

    2016-01-01

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third little tool allows the web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, they have, with time and growing completeness, shaped up as valuable helpers for visualizing acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  6. Three novel software tools for ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Martinov, S. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Löbhard, T. [Conovum GmbH & Co. KG, Nymphenburger Straße 13, D-80335 München (Germany); Lunt, T. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Behler, K., E-mail: karl.behler@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Lüddecke, K. [Unlimited Computer Systems GmbH, Seeshaupterstr. 15, D-82393 Iffeldorf (Germany); Merkel, R.; Neu, G.; ASDEX Upgrade Team [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); MPCDF Garching [Max Planck Computing and Data Facility, Boltzmannstr. 2, D-85748 Garching (Germany)

    2016-11-15

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third little tool allows the web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, they have, with time and growing completeness, shaped up as valuable helpers for visualizing acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  7. APT: Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including image histogram, aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
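
    A minimal sketch of the core calculation APT visualizes: sum the pixels inside a circular aperture and subtract a local sky level estimated from the surrounding annulus. The image and geometry below are invented, and no FITS handling or uncertainty estimate is shown:

        import numpy as np

        def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
            yy, xx = np.indices(image.shape)
            r = np.hypot(xx - x0, yy - y0)
            aperture = r <= r_ap
            annulus = (r >= r_in) & (r <= r_out)
            sky = np.median(image[annulus])        # robust local background
            net = image[aperture].sum() - sky * aperture.sum()
            return net, sky

        rng = np.random.default_rng(0)
        img = rng.normal(100.0, 5.0, (64, 64))     # flat sky plus noise
        img[30:34, 30:34] += 250.0                 # injected fake source
        net, sky = aperture_photometry(img, 31.5, 31.5, 5.0, 8.0, 12.0)
        print(f"net counts {net:.0f}, sky {sky:.1f} counts/pixel")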

  8. Mentorship Programs for Faculty Development in Academic General Pediatric Divisions

    Directory of Open Access Journals (Sweden)

    Jennifer Takagishi

    2011-01-01

    Discussion. General pediatric division chiefs acknowledge the benefits of mentoring relationships, and some have programs in place. Many need tools to create them. Pediatric societies could facilitate this critical area of professional development.

  9. Application of the dynamical interpretation of general relativity

    International Nuclear Information System (INIS)

    Deumens, E.

    1981-01-01

    The paper argues that the gravitational field seen in the fully dynamical way, described here, is a useful tool for understanding some fundamental results in a coherent general relativistic way. (author)

  10. Computational Aeroacoustics Using the Generalized Lattice Boltzmann Equation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall objective of the proposed project is to develop a generalized lattice Boltzmann (GLB) approach as a potential computational aeroacoustics (CAA) tool for...
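
    For context, the single-relaxation-time (BGK) lattice Boltzmann equation that generalized formulations build on reads (textbook form):

        f_{i}(\mathbf{x} + \mathbf{c}_{i}\,\Delta t,\; t + \Delta t) - f_{i}(\mathbf{x}, t)
            = -\frac{1}{\tau}\left[ f_{i}(\mathbf{x}, t) - f_{i}^{\mathrm{eq}}(\mathbf{x}, t) \right],

    where the f_i are particle distribution functions along the discrete velocities c_i and \tau is the relaxation time. Generalized (multiple-relaxation-time) schemes replace the scalar 1/\tau with a collision matrix, improving the stability and dispersion behaviour that matter for acoustics.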

  11. Establishment of Guidance Levels in General Radiography and Mammography

    International Nuclear Information System (INIS)

    2010-04-01

    Coordinated project report IAEA ARCAL LXXV-RLA/9/048: Pilot Exercise for Developing and Setting Reference Levels in General Radiography and Mammography as a Tool for Optimizing Radiation Protection and Reducing Patient Exposure in Latin America

  12. Forensic analysis of video steganography tools

    Directory of Open Access Journals (Sweden)

    Thomas Sloan

    2015-05-01

    Steganography is the art and science of concealing information in such a way that only the sender and intended recipient of a message should be aware of its presence. Digital steganography has been used in the past on a variety of media including executable files, audio, text, games and, notably, images. Additionally, there is increasing research interest towards the use of video as a media for steganography, due to its pervasive nature and diverse embedding capabilities. In this work, we examine the embedding algorithms and other security characteristics of several video steganography tools. We show how all feature basic and severe security weaknesses. This is potentially a very serious threat to the security, privacy and anonymity of their users. It is important to highlight that most steganography users have perfectly legal and ethical reasons to employ it. Some common scenarios would include citizens in oppressive regimes whose freedom of speech is compromised, people trying to avoid massive surveillance or censorship, political activists, whistle blowers, journalists, etc. As a result of our findings, we strongly recommend ceasing any use of these tools, and to remove any contents that may have been hidden, and any carriers stored, exchanged and/or uploaded online. For many of these tools, carrier files will be trivial to detect, potentially compromising any hidden data and the parties involved in the communication. We finish this work by presenting our steganalytic results, that highlight a very poor current state of the art in practical video steganography tools. There is unfortunately a complete lack of secure and publicly available tools, and even commercial tools offer very poor security. We therefore encourage the steganography community to work towards the development of more secure and accessible video steganography tools, and make them available for the general public. The results presented in this work can also be seen as a useful

  13. Highcrop picture tool

    OpenAIRE

    Fog, Erik

    2013-01-01

    Pictures give different impulses than words and numbers. With images, you can easily spot new opportunities. The Highcrop tool allows for optimization of the organic arable farm based on picture cards. The picture cards are designed to make it easier and more inspiring to go close to the details of production. By using the picture cards you can spot the areas where there is a possibility to optimize the production system for better results in the future. Highcrop picture cards can be used to:...

  14. Jupiter Environment Tool

    Science.gov (United States)

    Sturm, Erick J.; Monahue, Kenneth M.; Biehl, James P.; Kokorowski, Michael; Ngalande, Cedrick,; Boedeker, Jordan

    2012-01-01

    The Jupiter Environment Tool (JET) is a custom UI plug-in for STK that provides an interface to Jupiter environment models for visualization and analysis. Users can visualize the different magnetic field models of Jupiter through various rendering methods, which are fully integrated within STK's 3D Window. This allows users to take snapshots and make animations of their scenarios with magnetic field visualizations. Analytical data can be accessed in the form of custom vectors. Given these custom vectors, users have access to magnetic field data in custom reports, graphs, access constraints, coverage analysis, and anywhere else vectors are used within STK.

  15. ANT Advanced Neural Tool

    Energy Technology Data Exchange (ETDEWEB)

    Labrador, I.; Carrasco, R.; Martinez, L.

    1996-07-01

    This paper describes a practical introduction to the use of Artificial Neural Networks. Artificial neural nets are often used as an alternative to the traditional symbolic manipulation and first-order logic used in Artificial Intelligence, due to the high degree of difficulty of solving problems that cannot be handled by programmers using algorithmic strategies. As a particular case of neural net, a Multilayer Perceptron developed by programming in the C language on the OS9 real-time operating system is presented. A detailed description of the program structure and practical use is included. Finally, several application examples that have been treated with the tool are presented, along with some suggestions about hardware implementations. (Author) 15 refs.

  16. ANT Advanced Neural Tool

    International Nuclear Information System (INIS)

    Labrador, I.; Carrasco, R.; Martinez, L.

    1996-01-01

    This paper describes a practical introduction to the use of Artificial Neural Networks. Artificial neural nets are often used as an alternative to the traditional symbolic manipulation and first-order logic used in Artificial Intelligence, due to the high degree of difficulty of solving problems that cannot be handled by programmers using algorithmic strategies. As a particular case of neural net, a Multilayer Perceptron developed by programming in the C language on the OS9 real-time operating system is presented. A detailed description of the program structure and practical use is included. Finally, several application examples that have been treated with the tool are presented, along with some suggestions about hardware implementations. (Author) 15 refs

  17. Contamination Analysis Tools

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    This talk presents 3 different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser, and allows for data analysis of QCM log files; (2) Java RGA extractor: can load in multiple SRS .ana files and extract pressure vs. time data; (3) C++ Contamination Simulation code: 3D particle tracing code for modeling transport of dust particulates and molecules. Uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  18. Balancing the tools

    DEFF Research Database (Denmark)

    Leroyer, Patrick

    2009-01-01

    The purpose of this article is to describe the potential of a new combination of functions in lexicographic tools for tourists. So far lexicography has focused on the communicative information needs of tourists, i.e. helping tourists decide what to say in a number of specific tourist situations, wherever and whenever needed. It is demonstrated how this type of objective knowledge, which is conventionally represented in tourist guides and on tourist web sites, could benefit from being arranged in a lexicographic design.

  19. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  20. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each makes different tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI Studio (Johns Hopkins University), DTIPrep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each of the tools are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools are also discussed.

  1. How general are general source conditions?

    International Nuclear Information System (INIS)

    Mathé, Peter; Hofmann, Bernd

    2008-01-01

    Error analysis of regularization methods in Hilbert spaces is based on smoothness assumptions in terms of source conditions. In the traditional setup, i.e. when smoothness is given in a power scale, we see that not all elements in the underlying Hilbert space possess smoothness on this scale. Our main result asserts that this can be overcome by turning to general source conditions defined in terms of index functions. We conclude with some consequences.

  2. Heat exchanger tube tool

    International Nuclear Information System (INIS)

    Gugel, G.

    1976-01-01

    Certain types of heat exchangers have tubes opening through a tube sheet to a manifold having an access opening offset from alignment with the tube ends. A tool is provided for inserting a device, such as one for inspection or repair, in such instances. The tool is formed by a flexible guide tube insertable through the access opening, having an inner end provided with a connector for connection with the opening of the tube in which the device is to be inserted, and an outer end which remains outside of the chamber, the guide tube having adequate length for this arrangement. A flexible transport hose for internally transporting the device slides inside the guide tube. This hose is long enough to slide through the guide tube, into the heat-exchanger tube, and through the latter to the extent required for the use of the device. The guide tube must be bent to reach the end of the heat-exchanger tube, and the latter may be constructed with a bend; the hose carries anti-friction elements at interspaced locations along its length to make it possible for the hose to negotiate such bends while sliding to the location where the device is required.

  3. Hydraulic release oil tool

    International Nuclear Information System (INIS)

    Mims, M.G.; Mueller, M.D.; Ehlinger, J.C.

    1992-01-01

    This patent describes a hydraulic release tool. It comprises a setting assembly; a coupling member for coupling to drill string or petroleum production components, the coupling member having a plurality of sockets for receiving the dogs in the extended position and attaching the coupling member to the setting assembly, whereby the setting assembly couples to the coupling member by engagement of the dogs in the sockets and releases from and disengages the coupling member upon movement of the piston from its setting to its release position in response to a pressure in the body exceeding the predetermined pressure; and a relief port from outside the body into its bore, and means to prevent communication between the relief port and the bore of the body axially of the piston when the piston is in the setting position and to establish such communication upon movement of the piston from the setting position to the release position, thereby reducing the pressure in the body bore axially of the piston, whereby the reduction of the pressure signals that the tool has released the coupling member.

  4. Motif enrichment tool.

    Science.gov (United States)

    Blatti, Charles; Sinha, Saurabh

    2014-07-01

    The Motif Enrichment Tool (MET) provides an online interface that enables users to find major transcriptional regulators of their gene sets of interest. MET searches the appropriate regulatory region around each gene and identifies which transcription factor DNA-binding specificities (motifs) are statistically overrepresented. Motif enrichment analysis is currently available for many metazoan species including human, mouse, fruit fly, planaria and flowering plants. MET also leverages high-throughput experimental data such as ChIP-seq and DNase-seq from ENCODE and ModENCODE to identify the regulatory targets of a transcription factor with greater precision. The results from MET are produced in real time and are linked to a genome browser for easy follow-up analysis. Use of the web tool is free and open to all, and there is no login requirement. ADDRESS: http://veda.cs.uiuc.edu/MET/.
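
    MET's core statistic is motif overrepresentation in a gene set. The abstract does not state the test used; a common choice for this kind of enrichment analysis is the hypergeometric tail probability, sketched below with SciPy. The counts, and the use of the hypergeometric test itself, are illustrative assumptions rather than MET's documented method.

        from scipy.stats import hypergeom

        def motif_enrichment_pvalue(n_universe, n_with_motif, n_geneset, n_overlap):
            """P(X >= n_overlap) when drawing n_geneset genes from a universe of
            n_universe genes, of which n_with_motif carry the motif."""
            # sf(k - 1) gives the upper-tail probability P(X >= k)
            return hypergeom.sf(n_overlap - 1, n_universe, n_with_motif, n_geneset)

        # Example: 20,000 genes, motif near 1,500 of them, a 200-gene set with 35 hits
        print(motif_enrichment_pvalue(20000, 1500, 200, 35))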

  5. OMIRIS educational tool

    International Nuclear Information System (INIS)

    Dumont, X.; Artus, J.C.; Gonin, M.; Bidard, F.; Hickman, B.

    2004-01-01

    The biological effects of ionizing radiation are one of the most important issues for workers exposed to such radiation in nuclear plants. To deliver information to workers about this topic, the 'Utility Medical Work Officers' wanted an educational tool. This tool, named 'OMIRIS', was prepared under the authority of the federation of professors in radiology, radiobiology and radioprotection and the industrial nuclear partners in France (ANDRA, AREVA/COGEMA and FRAMATOME-ANP, CEA and EDF). The use of animation techniques helps to present this complex topic in a simple way, using an interactive, pleasant and comprehensible form. Detailed information is given about five themes: 1) the various sources of ionizing radiation, whether natural, medical, industrial or military; 2) the various types of exposure, whether internal or external, their characteristics (duration, target organs and radiological toxicity) and the different means of protection; 3) the concept of dose, the importance of dose rate and the reference values of doses; 4) the biological effects on the human organism, notions of dose threshold, and the aims, results and limitations of epidemiological surveys; and 5) the regulation based on radiological protection studies.

  6. Treatment Deployment Evaluation Tool

    International Nuclear Information System (INIS)

    M. A. Rynearson; M. M. Plum

    1999-01-01

    The U.S. Department of Energy (DOE) is responsible for the final disposition of legacy spent nuclear fuel (SNF). As a response, DOE's National Spent Nuclear Fuel Program (NSNFP) has been given the responsibility for the disposition of DOE-owned SNF. Many treatment technologies have been identified to treat some forms of SNF so that the resulting treated product is acceptable by the disposition site. One of these promising treatment processes is the electrometallurgical treatment (EMT) currently in development; a second is an Acid Wash Decladding process. The NSNFP has been tasked with identifying possible strategies for the deployment of these treatment processes in the event that a treatment path is deemed necessary. To support the siting studies of these strategies, economic evaluations are being performed to identify the least-cost deployment path. This model (tool) was developed to consider the full scope of costs, technical feasibility, process material disposition, and schedule attributes over the life of each deployment alternative. Using standard personal computer (PC) software, the model was developed as a comprehensive technology economic assessment tool using a Life-Cycle Cost (LCC) analysis methodology. Model development was planned as a systematic, iterative process of identifying and bounding the required activities to dispose of SNF. To support the evaluation process, activities are decomposed into lower-level, easier-to-estimate activities. Sensitivity studies can then be performed on these activities, defining cost issues and testing results against the originally stated problem.

  7. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains:    Several new parts such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; Additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...
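
    The book centers on option pricing with tree, finite-difference and Monte Carlo methods. As a flavor of the tree methods it covers, here is a minimal Cox-Ross-Rubinstein binomial tree for a European call; all parameter values below are illustrative assumptions, and the book's own treatment is far more general.

        import math

        def crr_european_call(S0, K, r, sigma, T, n):
            """Price a European call on a Cox-Ross-Rubinstein binomial tree."""
            dt = T / n
            u = math.exp(sigma * math.sqrt(dt))   # up factor
            d = 1.0 / u                           # down factor
            p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
            disc = math.exp(-r * dt)
            # payoffs at maturity, then backward induction to the root
            values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
            for _ in range(n):
                values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                          for j in range(len(values) - 1)]
            return values[0]

        print(crr_european_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))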

  8. SIRTF Tools for DIRT

    Science.gov (United States)

    Pound, M. W.; Wolfire, M. G.; Amarnath, N. S.

    2004-07-01

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS {http://dustem.astro.umd.edu}) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 5 years. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. We are adding new functionality to DIRT to support new missions like SIRTF and SOFIA. A new Instrument module allows for plotting of the model points convolved with the spatial and spectral responses of the selected instrument. This lets users better fit data from specific instruments. Currently, we have implemented modules for the Infrared Array Camera (IRAC) and Multiband Imaging Photometer (MIPS) on SIRTF. The models are based on the dust radiation transfer code of Wolfire & Cassinelli (1986) which accounts for multiple grain sizes and compositions. The model outputs are averaged over the instrument bands using the same weighting (νFν = constant) as the SIRTF data pipeline which allows the SIRTF data products to be compared directly with the model database. This work was supported in part by a NASA AISRP grant NAG 5-10751 and the SIRTF Legacy Science Program provided by NASA through an award issued by JPL under NASA contract 1407.

  9. Unsupervised Learning and Generalization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Larsen, Jan

    1996-01-01

    The concept of generalization is defined for a general class of unsupervised learning machines. The generalization error is a straightforward extension of the corresponding concept for supervised learning, and may be estimated empirically using a test set or by statistical means, in close analogy with supervised learning. The empirical and analytical estimates are compared for principal component analysis and for K-means clustering based density estimation.
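
    The record's central idea, estimating an unsupervised generalization error empirically on a test set, can be sketched for K-means clustering as follows. Using average distortion on held-out data as the error measure is one of several possible choices and is an assumption here, as is the plain Lloyd's algorithm.

        import numpy as np

        def kmeans_fit(X, k, n_iter=50, seed=0):
            """Plain Lloyd's algorithm; returns the cluster centers."""
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(n_iter):
                labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = X[labels == j].mean(axis=0)
            return centers

        def distortion(X, centers):
            """Mean squared distance to the nearest center: the training error on
            the training set, an empirical generalization estimate on a test set."""
            return np.min(((X[:, None, :] - centers) ** 2).sum(-1), axis=1).mean()

        rng = np.random.default_rng(1)
        X = rng.normal(size=(600, 2))
        X_train, X_test = X[:400], X[400:]
        centers = kmeans_fit(X_train, k=5)
        print("train:", distortion(X_train, centers), "test:", distortion(X_test, centers))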

  10. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    Science.gov (United States)

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  11. Generalization of concurrence vectors

    International Nuclear Information System (INIS)

    Yu Changshui; Song Heshan

    2004-01-01

    In this Letter, based on the generalization of concurrence vectors for bipartite pure states, constructed by employing tensor products of generators of the corresponding rotation groups, we generalize concurrence vectors to the case of mixed states. A new criterion for the separability of multipartite pure states is presented, for which we define a concurrence vector; we then generalize this vector to the case of multipartite mixed states and obtain a good measure of free entanglement.

  12. General quantum variational calculus

    Directory of Open Access Journals (Sweden)

    Artur M. C. Brito da Cruz

    2018-02-01

    We develop a new variational calculus based on the general quantum difference operator recently introduced by Hamza et al. In particular, we obtain optimality conditions for generalized variational problems where the Lagrangian may depend on the endpoint conditions and a real parameter, for the basic and isoperimetric problems, with and without fixed boundary conditions. Our results provide a generalization of previous results obtained for the $q$- and Hahn calculi.

  13. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In this paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single-particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics.

  14. Generalized quasi variational inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Noor, M.A. [King Saud Univ., Riyadh (Saudi Arabia)

    1996-12-31

    In this paper, we establish the equivalence between the generalized quasi variational inequalities and the generalized implicit Wiener-Hopf equations, using essentially the projection technique. This equivalence is used to suggest and analyze a number of new iterative algorithms for solving generalized quasi variational inequalities and the related complementarity problems. Convergence criteria are also considered. The results proved in this paper represent a significant improvement and refinement of the previously known results.

  15. Ge(Li) data reduction using small computers

    Science.gov (United States)

    Mcdermott, W. E.

    1972-01-01

    The advantages and limitations of using a small computer to analyze Ge(Li) radiation spectra are studied. The computer has to: (1) find the spectrum peaks, (2) determine the count rate in the photopeaks, and (3) relate the count rate to known gamma transitions to find the amount of each radionuclide present. Results show that tasks one and two may be done by the computer but task three must be done by an experimenter or a larger computer.
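
    As a rough illustration of tasks one and two, locating photopeaks and estimating their count rates, here is a minimal sketch using SciPy's peak finder on a synthetic spectrum. The prominence threshold, window width and linear continuum estimate are arbitrary assumptions; a real Ge(Li) analysis would fit peak shapes and subtract background far more carefully.

        import numpy as np
        from scipy.signal import find_peaks

        def photopeak_areas(counts, prominence=50, window=5):
            """Locate candidate photopeaks and estimate net areas by summing counts
            around each peak and subtracting a crude linear continuum."""
            peaks, _ = find_peaks(counts, prominence=prominence)
            areas = []
            for p in peaks:
                lo, hi = max(p - window, 0), min(p + window + 1, len(counts))
                gross = counts[lo:hi].sum()
                background = 0.5 * (counts[lo] + counts[hi - 1]) * (hi - lo)
                areas.append((p, gross - background))
            return areas

        # Synthetic spectrum: decaying continuum plus two Gaussian photopeaks
        ch = np.arange(1024)
        model = 200 * np.exp(-ch / 400.0)
        for center, amp in [(300, 400), (700, 250)]:
            model += amp * np.exp(-0.5 * ((ch - center) / 3.0) ** 2)
        counts = np.random.default_rng(0).poisson(model)
        print(photopeak_areas(counts))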

  16. Less is more : data reduction in wireless sensor networks

    NARCIS (Netherlands)

    Masoum, Alireza

    2018-01-01

    Wireless sensor networks are monitoring systems consisting of many small, low-cost and low-power devices called sensor nodes. A large number of sensor nodes are deployed in an environment to monitor a physical phenomenon, execute light processes on collected data, and send either raw data or

  17. Computerized data reduction techniques for nadir viewing remote sensors

    Science.gov (United States)

    Tiwari, S. N.; Gormsen, Barbara B.

    1985-01-01

    Computer resources have been developed for the analysis and reduction of MAPS experimental data from the OSTA-1 payload. The MAPS Research Project is concerned with the measurement of the global distribution of mid-tropospheric carbon monoxide. The measurement technique for the MAPS instrument is based on a non-dispersive gas filter radiometer operating in the nadir viewing mode. The MAPS experiment has two passive remote sensing instruments: the prototype instrument, which is used to measure tropospheric air pollution from aircraft platforms, and the third-generation (OSTA) instrument, which is used to measure carbon monoxide in the mid and upper troposphere from space platforms. Extensive effort was also expended in support of the MAPS/OSTA-3 shuttle flight. Specific capabilities and resources developed are discussed.

  18. Data Reduction Algorithm Using Nonnegative Matrix Factorization with Nonlinear Constraints

    Science.gov (United States)

    Sembiring, Pasukat

    2017-12-01

    Processing of data with very large dimensions has been a hot topic in recent decades. Various techniques have been proposed to extract the desired information or structure. Non-negative Matrix Factorization (NMF), which operates on non-negative data, has become one of the popular methods for reducing dimensionality. The main strength of this method is the non-negativity constraint: it models an object as a combination of non-negative basic parts, so as to provide a physical interpretation of the object's construction. NMF is a dimension reduction method that has been used widely for numerous applications including computer vision, text mining, pattern recognition, and bioinformatics. The mathematical formulation of NMF is not a convex optimization problem, and various types of algorithms have been proposed to solve it. The Alternating Nonnegative Least Squares (ANLS) framework is a block coordinate descent approach that has proven theoretically reliable and empirically efficient. This paper proposes a new algorithm to solve the NMF problem based on the ANLS framework. The algorithm inherits the convergence property of the ANLS framework for NMF formulations with nonlinear constraints.
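
    The paper's algorithm itself is not reproduced in this record, but the ANLS framework it builds on alternates two convex nonnegative least-squares subproblems. A minimal sketch of plain ANLS (without the paper's nonlinear constraints) using SciPy's NNLS solver:

        import numpy as np
        from scipy.optimize import nnls

        def anls_nmf(V, k, n_iter=30, seed=0):
            """Factor V >= 0 as W @ H with W, H >= 0 by alternating nonnegative
            least squares: fix W and solve for H column-wise, then fix H and
            solve for W row-wise."""
            rng = np.random.default_rng(seed)
            m, n = V.shape
            W = rng.random((m, k))
            H = rng.random((k, n))
            for _ in range(n_iter):
                for j in range(n):            # H subproblem: min ||W h - v_j||
                    H[:, j], _ = nnls(W, V[:, j])
                for i in range(m):            # W subproblem: min ||H^T w - v_i||
                    W[i, :], _ = nnls(H.T, V[i, :])
            return W, H

        V = np.abs(np.random.default_rng(1).normal(size=(20, 15)))
        W, H = anls_nmf(V, k=4)
        print("reconstruction error:", np.linalg.norm(V - W @ H))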

  19. Big data reduction framework for value creation in sustainable enterprises

    OpenAIRE

    Rehman, Muhammad Habib ur; Chang, Victor; Batool, Aisha; Teh, Ying Wah

    2016-01-01

    Value creation is a major sustainability factor for enterprises, in addition to profit maximization and revenue generation. Modern enterprises collect big data from various inbound and outbound data sources. The inbound data sources handle data generated from the results of business operations, such as manufacturing, supply chain management, marketing, and human resource management, among others. Outbound data sources handle customer-generated data which are acquired directly or indirectly fr...

  20. Data Reduction Algorithms for Store Separation Grid Testing

    Science.gov (United States)

    2014-08-01

    The PES consists of a 17-blade centrifugal compressor driven by a 2.6 megawatt motor and operates at a constant speed of 10,840 rpm. This extracts...1.4 with a fixed nozzle. The total pressure can be varied from 30 kPa to 200 kPa. Flow is generated by a two-stage axial-flow compressor powered by

  1. User-friendly software for SANS data reduction and analysis

    International Nuclear Information System (INIS)

    Biemann, P.; Haese-Seiller, M.; Staron, P.

    1999-01-01

    At the Geesthacht Neutron Facility (GeNF), new software is being developed for the reduction of two-dimensional small-angle neutron scattering (SANS) data. The main motivation for this work was to create software for users of our SANS facilities that is easy to use. Another motivation was to provide users with software they can also use at their home institutes. Therefore, the software is implemented on a personal computer running WINDOWS. The program reads raw data from an area detector in binary or ASCII format and produces ASCII files containing the scattering curve. The cross section can be averaged over the whole area of the detector or over user-defined sectors only. Scripts can be created for processing large numbers of files. (author)
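
    The central reduction step described, averaging a two-dimensional detector image into a one-dimensional scattering curve over the whole detector or over angular sectors, can be sketched in a few lines of NumPy. The bin widths, beam-center handling and sector convention below are illustrative assumptions, not details of the GeNF software.

        import numpy as np

        def sector_average(image, center, n_bins=100, sector=None):
            """Average detector counts in radial bins; optionally keep only pixels
            whose azimuthal angle lies inside sector=(phi_min, phi_max) radians."""
            ny, nx = image.shape
            y, x = np.indices((ny, nx))
            dx, dy = x - center[0], y - center[1]
            r = np.hypot(dx, dy)
            mask = np.ones_like(image, dtype=bool)
            if sector is not None:
                phi = np.arctan2(dy, dx)
                mask = (phi >= sector[0]) & (phi <= sector[1])
            bins = np.linspace(0, r[mask].max(), n_bins + 1)
            which = np.digitize(r[mask], bins) - 1
            sums = np.bincount(which, weights=image[mask], minlength=n_bins)
            counts = np.bincount(which, minlength=n_bins)
            with np.errstate(invalid="ignore"):
                profile = sums / counts       # NaN where a bin has no pixels
            return 0.5 * (bins[:-1] + bins[1:]), profile[:n_bins]

        img = np.random.default_rng(0).poisson(10, size=(128, 128)).astype(float)
        radii, curve = sector_average(img, center=(64, 64), sector=(0, np.pi / 4))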

  2. Big Data Reduction and Optimization in Sensor Monitoring Network

    Directory of Open Access Journals (Sweden)

    Bin He

    2014-01-01

    Wireless sensor networks (WSNs) are increasingly being utilized to monitor the structural health of underground subway tunnels, showing many promising advantages over traditional monitoring schemes. Meanwhile, as the network size increases, the system becomes incapable of dealing with big data in a way that ensures efficient data communication, transmission, and storage. Data compression, considered a feasible solution to these issues, can reduce the volume of data travelling between sensor nodes. In this paper, an optimization algorithm based on spatial and temporal data compression is proposed to cope with these issues in WSNs in the underground tunnel environment. Spatial and temporal correlation functions are introduced for data compression and data recovery. It is verified that the proposed algorithm is applicable to WSNs in the underground tunnel environment.
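
    The abstract names temporal correlation as the basis for compression without giving the scheme. One minimal temporal scheme used in sensor networks is send-on-delta (dead-band) reporting: a node transmits a reading only when it differs from the last transmitted value by more than a threshold, and the sink reconstructs by holding the last value. The threshold and the hold-based recovery below are assumptions for illustration, not the paper's algorithm.

        def compress_send_on_delta(samples, delta):
            """Keep (index, value) pairs only when the value moves more than
            delta away from the last transmitted value."""
            transmitted, last = [], None
            for i, v in enumerate(samples):
                if last is None or abs(v - last) > delta:
                    transmitted.append((i, v))
                    last = v
            return transmitted

        def recover(transmitted, n):
            """Reconstruct the series by holding the last transmitted value."""
            out, j = [], 0
            for i in range(n):
                if j + 1 < len(transmitted) and transmitted[j + 1][0] <= i:
                    j += 1
                out.append(transmitted[j][1])
            return out

        readings = [20.0, 20.1, 20.1, 20.6, 21.4, 21.5, 21.4, 23.0]
        tx = compress_send_on_delta(readings, delta=0.5)
        print(tx, recover(tx, len(readings)))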

  3. F-111C Flight Data Reduction and Analysis Procedures

    Science.gov (United States)

    1990-12-01

    [Extraction residue: the record's excerpt is a flattened table of flight-data signal names with YES/NO selection flags (BPHI, BTHE, BPSI, BH, LVEL, LBET, LALP, LPHI, LTHE, LPSI, LH; inputs Ax, Av, ...) together with fragments of Appendix G, 'A priori Data from Six Degree of Freedom Flight Dynamic Model', which concerns maximum-likelihood estimation of aircraft parameters using a Gauss-Newton computational algorithm.]

  4. Generalized sampling in Julia

    DEFF Research Database (Denmark)

    Jacobsen, Christian Robert Dahl; Nielsen, Morten; Rasmussen, Morten Grud

    2017-01-01

    Generalized sampling is a numerically stable framework for obtaining reconstructions of signals in different bases and frames from their samples. For example, one can use wavelet bases for reconstruction given frequency measurements. In this paper, we will introduce a carefully documented toolbox for performing generalized sampling in Julia. Julia is a new language for technical computing with a focus on performance, which is ideally suited to handle the large problem sizes often encountered in generalized sampling. The toolbox provides specialized solutions for the setup of Fourier bases and wavelets. The performance of the toolbox is compared to existing implementations of generalized sampling in MATLAB.
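
    In its simplest form, generalized sampling recovers the coefficients of a signal in a reconstruction basis from a larger number of samples taken in a different basis by solving a least-squares system. The record's toolbox is in Julia; the NumPy sketch below is a language-neutral illustration of that core computation only, with a random stand-in for the change-of-basis matrix rather than the Fourier-to-wavelet setup the toolbox provides.

        import numpy as np

        def generalized_sampling(samples, A):
            """Recover coefficients x with samples = A x, where A maps
            reconstruction-basis coefficients to sampling-basis samples and
            has more rows (samples) than columns (coefficients)."""
            x, *_ = np.linalg.lstsq(A, samples, rcond=None)
            return x

        # Toy setup: recover 8 coefficients from 32 noisy samples
        rng = np.random.default_rng(0)
        A = rng.normal(size=(32, 8))          # stand-in change-of-basis matrix
        x_true = rng.normal(size=8)
        b = A @ x_true + 1e-8 * rng.normal(size=32)
        print(np.allclose(generalized_sampling(b, A), x_true, atol=1e-5))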

  5. Tools & Resources | Efficient Windows Collaborative

    Science.gov (United States)


  6. A chair designed by a Finn received an award

    Index Scriptorium Estoniae

    2005-01-01

    The Haiku chair, created by Isku OY designer Tapio Anttila, received the Adex Silver award in the furniture category at the international design competition ADEX 2004. Last year the same chair received the Good Design Award from the Chicago architecture and design museum.

  7. Criticism on Environmental Assessment Tools

    NARCIS (Netherlands)

    Abdalla, G.; Maas, G.J.; Huyghe, J.; Oostra, M.; Saji Baby, xx; Bogdan Zygmunt, xx

    2011-01-01

    The use of environmental assessment tools to assess the sustainability of buildings, homes and mixed-use areas is increasing. Environmental tools assign scores to projects on a number of sustainability (sub)aspects according to design and realization documents and evidence. Six European sustainable urban

  8. Transformational Tools and Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Transformational Tools and Technologies (TTT) Project advances state-of-the-art computational and experimental tools and technologies that are vital to aviation...

  9. Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ R.; Gorjian, Varoujan; Rebull, Luisa M.; Masci, Frank J.; Fowler, John W.; Helou, George; Kulkarni, Shrinivas R.; Law, Nicholas M.

    2012-07-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It is a graphical user interface (GUI) designed to allow the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. The finely tuned layout of the GUI, along with judicious use of color-coding and alerting, is intended to give maximal user utility and convenience. Simply mouse-clicking on a source in the displayed image will instantly draw a circular or elliptical aperture and sky annulus around the source and will compute the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs with just the push of a button, including image histogram, x and y aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has many functions for customizing the calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source
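
    As an illustration of the quantity APT computes on a mouse click, source counts inside a circular aperture minus a local background estimated from a sky annulus, here is a minimal NumPy sketch. The radii, the median sky estimator and the rough uncertainty formula are simplifying assumptions, not APT's documented algorithms.

        import numpy as np

        def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
            """Sum counts in a circular aperture and subtract the median sky level
            measured in an annulus; return net counts and a rough uncertainty."""
            y, x = np.indices(image.shape)
            r = np.hypot(x - x0, y - y0)
            in_aperture = r <= r_ap
            in_annulus = (r >= r_in) & (r <= r_out)
            sky = np.median(image[in_annulus])
            n_pix = in_aperture.sum()
            net = image[in_aperture].sum() - sky * n_pix
            # Poisson-ish error: source counts plus sky scatter over the aperture
            err = np.sqrt(max(net, 0.0) + n_pix * image[in_annulus].var())
            return net, err

        img = np.random.default_rng(0).poisson(100, size=(64, 64)).astype(float)
        img[30:34, 30:34] += 500.0            # a fake source
        print(aperture_photometry(img, 31.5, 31.5, r_ap=5, r_in=8, r_out=12))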

  10. APPROACHES AND TOOLS FOR QUALITY EXAMINATION OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Galina P. Lavrentieva

    2011-02-01

    The article highlights scientific and methodological approaches to the quality examination of e-learning tools for general education. The terms of the research are considered, and the essence of the main components and stages of the examination is described. A methodology for elaborating quality-estimation tools is described, which should be based on identifying evaluation criteria and parameters. A complex of psycho-pedagogical and ergonomic requirements that should be used in organizing the expertise is justified, and the most expedient ways of implementing them are examined.

  11. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
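
    The financial computations behind tools like H2FAST rest on standard discounted-cash-flow arithmetic. As a generic illustration only (not NREL's actual model, inputs or outputs), a net-present-value calculation over a station's yearly cash flows might look like:

        def net_present_value(rate, cash_flows):
            """Discount a list of yearly cash flows (year 0 first) at rate."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        # Illustrative only: $1.2M capital cost, then ten years of $180k net revenue
        flows = [-1_200_000] + [180_000] * 10
        print(f"NPV at 7%: ${net_present_value(0.07, flows):,.0f}")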

  12. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2010-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, much research has gone into tooling to create such a common information base in a collaboration tool. However, security is often

  13. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2011-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, much research has gone into tooling to create such a common information base in a collaboration tool. However, security is often

  14. Climate Action Planning Tool | NREL

    Science.gov (United States)

    NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus.

  15. High performance electromagnetic simulation tools

    Science.gov (United States)

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concern, including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm and a parallel planar generalized Yee algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.
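
    The parallel solvers described are built on finite-difference time-domain (FDTD) updates of Maxwell's equations. Stripped of parallelization, materials and absorbing boundaries, the core leapfrog update in one dimension looks roughly like the sketch below; the grid size, normalized units and Gaussian hard source are illustrative assumptions.

        import numpy as np

        # 1-D FDTD in normalized units: E and H staggered in space and time
        n_cells, n_steps = 200, 400
        Ez = np.zeros(n_cells)
        Hy = np.zeros(n_cells - 1)

        for t in range(n_steps):
            Hy += 0.5 * (Ez[1:] - Ez[:-1])        # update H from the curl of E
            Ez[1:-1] += 0.5 * (Hy[1:] - Hy[:-1])  # update E from the curl of H
            Ez[n_cells // 4] += np.exp(-((t - 30) / 10.0) ** 2)  # Gaussian source

        print("peak |Ez| after propagation:", np.abs(Ez).max())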

  16. SETI as an educational tool

    Science.gov (United States)

    Vaile, R. A.

    SETI offers an extraordinary catalyst in our search for a better education. While the glamour of movie images has increased general public awareness of the term "SETI", we are challenged to improve the level of public understanding of the fundamental scientific and technological issues involved in SETI. It is also critical to keep in mind the reality of human existence. No country seems entirely at peace, whether one considers cultural, trade, military, or heritage issues; no country seems content with the breadth and standards of education for following generations. However, SETI requires generations to participate across cultures, and this long-term human involvement must be sustained through both education and communication across many disciplines and different cultures. For both these major roles, SETI appears to offer a tantalising range and depth, both in educational tools and in superb tests of communication skills. This paper considers the educational influence of the roles evoked by SETI issues. We briefly consider the range of expertise needed in SETI, the means of improving public awareness of SETI, and mechanisms through which such education may explore the consequences of any SETI result (whether judged successful or not). Examples of the use of SETI in formal secondary and university education are briefly reviewed.

  17. Improving Tools in Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-01-01

    The historical origin of Artificial Intelligence (AI) is usually placed at the Dartmouth Conference of 1956, but we can find many more arcane origins [1]. We can also consider, in more recent times, very great thinkers such as János Neumann (later John von Neumann, after his arrival in the USA), Norbert Wiener, Alan Mathison Turing, and Lotfi Zadeh [12, 14]. AI frequently requires logic, but the classical version shows too many insufficiencies, so it was necessary to introduce more sophisticated tools, such as fuzzy logic, modal logic, non-monotonic logic and so on [1, 2]. Among the things that AI needs to represent are categories, objects, properties, relations between objects, situations, states, time, events, causes and effects, knowledge about knowledge, and so on. Problems in AI can be classified into two general types [3, 5]: search problems and representation problems. For the latter, there exist different methods of reaching the summit, including logics, rules, frames, associative nets and scripts, often interconnected among them [4]. We attempt, in this paper, a panoramic vision of the scope of application of such representation methods in AI. Perhaps the two most disputed questions of both modern philosophy of mind and AI are the Turing Test and the Chinese Room Argument; to elucidate these very difficult questions, see our final note.

  18. Mathematical tools for physicists

    International Nuclear Information System (INIS)

    Trigg, G.L.

    2005-01-01

    Mathematical Tools for Physicists is a unique collection of 18 review articles, each one written by a renowned expert in its field. Their professional style will be beneficial for advanced students as well as for scientists at work. The former may find a comprehensive introduction, while the latter may use it as a quick reference. Great attention was paid to ensuring fast access to the information, and each carefully reviewed article includes a glossary of terms and a guide to further reading. The contributions range from fundamental methods right up to the latest applications, including: Algebraic Methods; Analytic Methods; Fourier and Other Mathematical Transforms; Fractal Geometry; Geometrical Methods; Green's Functions; Group Theory; Mathematical Modeling; Monte Carlo Methods; Numerical Methods; Perturbation Methods; Quantum Computation; Quantum Logic; Special Functions; Stochastic Processes; Symmetries and Conservation Laws; Topology; Variational Methods. (orig.)

  19. Remediating a design tool

    DEFF Research Database (Denmark)

    Jensen, Mads Møller; Rädle, Roman; Klokmose, Clemens N.

    2018-01-01

    Sticky notes are ubiquitous in design processes because of their tangibility and ease of use. Yet, they have well-known limitations in professional design processes, as documentation and distribution are cumbersome at best. This paper compares the use of sticky notes in ideation with a remediated digital sticky notes setup. The paper contributes with a nuanced understanding of what happens when remediating a physical design tool into digital space, by emphasizing focus shifts and breakdowns caused by the technology, but also benefits and promises inherent in the digital media. Despite users' preference for creating physical notes, handling digital notes on boards was easier, and the potential of proper documentation makes the digital setup a possible alternative. While the analogy in our remediation supported a transfer of learned handling, the users' experiences across technological setups impact ...

  20. Tools for Social Construction

    DEFF Research Database (Denmark)

    Brynskov, Martin

    In this dissertation, I propose a new type of playful media for children. Based on field work, prototypes, and theoretical development, I define, build, and explore a distinct cross-section of existing and new digital media, tools, and communication devices in order to assess the characteristics for social construction, i.e. supporting staging, performing, and sharing of playful activities involving people and artifacts. The method used is a triangulation of the problem domain (pervasive media systems for children) from three perspectives: theory, design, and empirical work ... the potential of pervasive media systems to support the children's social construction.