WorldWideScience

Sample records for performing routine database

  1. Should the lateral chest radiograph be routinely performed?

    International Nuclear Information System (INIS)

    Osman, Fatuma; Williams, Imelda

    2014-01-01

    Background: The chest x-ray is one of the most common plain film radiographic examinations performed. Inclusion of the lateral chest radiograph varies internationally and nationally across radiology departments and states in Australia. Search strategy: A search strategy of the databases Cochrane Library, Ovid Medline/Medline, PubMed, Scopus and Science Direct was conducted. The results were restricted to those published between 1985 and 2013 and those published in English. The following search terms were used: ‘lateral chest’, ‘radiograph’, ‘digital radiography’, ‘chest x-ray’, ‘plain film radiography’, ‘ionising radiation’. The results were restricted to publications with these terms in the title, abstract and/or keywords. Main findings: There are few national or international guidelines pertaining to the inclusion of the lateral chest x-ray as routine. Primary concerns are the increased radiation dose associated with the additional chest view and reducing the cost of medical imaging services. Modern digital imaging systems result in a lower radiation dose. The diagnostic yield of the lateral chest x-ray is highly dependent on the clinical indications of the patient. Further research into the routine inclusion of the lateral chest x-ray is recommended. Conclusion: Review of the literature suggests that the lateral chest radiograph should not be performed routinely unless clinically indicated.

  2. DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs

    Science.gov (United States)

    Wrenn, Gregory A.

    2005-01-01

    This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
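
    The addressing scheme described above is easy to picture in code. The following minimal Python sketch mirrors the relation-name-plus-integer-keys idea; DB90 itself is a Fortran 90/95 library with C file I/O, so every name below is invented for illustration.

```python
# Hypothetical Python sketch of DB90's addressing scheme: each record is
# identified by a relation name plus up to 5 integer keys. DB90 itself is
# Fortran 90/95 with C file I/O; names here are invented for illustration.

class KeyedStore:
    """Toy in-memory analogue of a DB90-style relation store."""

    def __init__(self):
        self._relations = {}  # relation name -> {key tuple -> record}

    def put(self, relation, keys, record):
        if not 1 <= len(keys) <= 5:
            raise ValueError("between 1 and 5 integer keys are allowed")
        self._relations.setdefault(relation, {})[tuple(keys)] = record

    def get(self, relation, keys):
        # Records can be retrieved in any desired order, as the abstract notes.
        return self._relations[relation][tuple(keys)]

store = KeyedStore()
store.put("stress", (1, 3), {"sigma_max": 412.7})  # e.g. load case 1, element 3
print(store.get("stress", (1, 3)))
```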

  3. Routine health insurance data for scientific research: potential and limitations of the Agis Health Database.

    Science.gov (United States)

    Smeets, Hugo M; de Wit, Niek J; Hoes, Arno W

    2011-04-01

    Observational studies performed within routine health care databases have the advantage of their large size and, when the aim is to assess the effect of interventions, can complement randomized controlled trials, which usually draw small samples from experimental settings. Institutional Health Insurance Databases (HIDs) are attractive for research because of their large size, their longitudinal perspective, and their practice-based information. As they are based on financial reimbursement, the information is generally reliable. The database of one of the major insurance companies in the Netherlands, the Agis Health Database (AHD), is described in detail. Whether the AHD data sets meet the specific requirements to conduct several types of clinical studies is discussed according to the classification of the four different types of clinical research; that is, diagnostic, etiologic, prognostic, and intervention research. The potential of the AHD for these various types of research is illustrated using examples of studies recently conducted in the AHD. HIDs such as the AHD offer large potential for several types of clinical research, in particular etiologic and intervention studies, but at present the lack of detailed clinical information is an important limitation. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Identifying complications of interventional procedures from UK routine healthcare databases: a systematic search for methods using clinical codes.

    Science.gov (United States)

    Keltie, Kim; Cole, Helen; Arber, Mick; Patrick, Hannah; Powell, John; Campbell, Bruce; Sims, Andrew

    2014-11-28

    Several authors have developed and applied methods to routine data sets to identify the nature and rate of complications following interventional procedures. To date, however, there has been no systematic search for such methods. The objective of this article was to find, classify and appraise published methods, based on analysis of clinical codes, which used routine healthcare databases in a United Kingdom setting to identify complications resulting from interventional procedures. A literature search strategy was developed to identify published studies that referred, in the title or abstract, to the name or acronym of a known routine healthcare database and to complications from procedures or devices. The following data sources were searched in February and March 2013: Cochrane Methods Register, Conference Proceedings Citation Index - Science, Econlit, EMBASE, Health Management Information Consortium, Health Technology Assessment database, MathSciNet, MEDLINE, MEDLINE in-process, OAIster, OpenGrey, Science Citation Index Expanded and ScienceDirect. Of the eligible papers, those which reported methods using clinical coding were classified and summarised in tabular form using the following headings: routine healthcare database; medical speciality; method for identifying complications; length of follow-up; method of recording comorbidity. The benefits and limitations of each approach were assessed. From 3688 papers identified from the literature search, 44 reported the use of clinical codes to identify complications, from which four distinct methods were identified: 1) searching the index admission for specified clinical codes, 2) searching a sequence of admissions for specified clinical codes, 3) searching for specified clinical codes for complications from procedures and devices within the International Classification of Diseases 10th revision (ICD-10) coding scheme, which is the methodology recommended by the NHS Classification Service, and 4) conducting manual clinical

  5. Database principles programming performance

    CERN Document Server

    O'Neil, Patrick

    2014-01-01

    Database: Principles, Programming, Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance. Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi

  6. Human Performance Event Database

    International Nuclear Information System (INIS)

    Trager, E. A.

    1998-01-01

    The purpose of this paper is to describe several aspects of a Human Performance Event Database (HPED) that is being developed by the Nuclear Regulatory Commission. These include the background, the database structure and basis for the structure, the process for coding and entering event records, the results of preliminary analyses of information in the database, and plans for the future. In 1992, the Office for Analysis and Evaluation of Operational Data (AEOD) within the NRC decided to develop a database for information on human performance during operating events. The database was needed to classify and categorize the information so that operating experience could be fed back to licensees and others. An NRC interoffice working group prepared a list of human performance information that should be reported for events; the list was based on the Human Performance Investigation Process (HPIP) that had been developed by the NRC as an aid in investigating events. The structure of the HPED was based on that list. The HPED currently includes data on events described in augmented inspection team (AIT) and incident investigation team (IIT) reports from 1990 through 1996, AEOD human performance studies from 1990 through 1993, recent NRR special team inspections, and licensee event reports (LERs) that were prepared for the events. (author)

  7. Level-3 Cholesky Factorization Routines Improve Performance of Many Cholesky Algorithms

    DEFF Research Database (Denmark)

    Gustavson, Fred G.; Wasniewski, Jerzy; Dongarra, Jack J.

    2013-01-01

    Four routines called DPOTF3i, i = a,b,c,d, are presented. DPOTF3i are a novel type of level-3 BLAS for use by BPF (Blocked Packed Format) Cholesky factorization and the LAPACK routine DPOTRF. Performance of the DPOTF3i routines is still increasing when the performance of the Level-2 routine DPOTF2 of LAPACK...

  8. DEVELOPING AND INSTRUCTING PRE-PERFORMANCE ROUTINES FOR TENPIN BOWLING COMPETITIONS (1).

    Science.gov (United States)

    Lee, Seungmin; Lee, Keunchul; Kwon, Sungho

    2015-06-01

    This preliminary study developed pre-performance routines for tenpin bowlers and taught the bowlers to use them. To develop the routine, the situations before throwing the ball were divided into four phases; participants were examined through interviews and observations. This study used an A-B design; the A stage included the development of the routines for 3 wk., while the B stage included the instruction and two evaluations of the routine consistency. Practice was implemented for 4 hr. per day for 9 wk. The participants noted that they understood the developed routine easily and, through the routines, experienced an atmosphere similar to that of a competition during training. They found it difficult to practice the relaxation phase, but emphasized that the relaxation phase was helpful. Consistent routines were associated with an improved mental state and performance in a competition. This study suggests that pre-performance routines stabilize the mental state of the athletes, apparently giving them a competitive advantage.

  9. PostgreSQL database performance optimization

    OpenAIRE

    Wang, Qiang

    2011-01-01

    The thesis was requested by Marlevo Software Oy as a general description of the PostgreSQL database and its performance optimization techniques. Its purpose was to help new PostgreSQL users to quickly understand the system and to assist DBAs to improve the database performance. The thesis was divided into two parts. The first part described PostgreSQL database optimization techniques in theory. In addition, popular tools were also introduced. This part was based on PostgreSQL documentation, r...

  10. A performance evaluation of in-memory databases

    Directory of Open Access Journals (Sweden)

    Abdullah Talha Kabakus

    2017-10-01

    The popularity of NoSQL databases has increased due to the need for (1) processing vast amounts of data faster than relational database management systems by taking advantage of a highly scalable architecture, (2) a flexible (schema-free) data structure, and (3) low latency and high performance. Although memory usage is not a major criterion for evaluating the performance of algorithms, since these databases serve data from memory, their memory usage is also measured alongside the time taken to complete each operation, to reveal which one uses memory most efficiently. Currently there exist over 225 NoSQL databases that provide different features and characteristics, so it is necessary to reveal which one provides better performance for different data operations. In this paper, we benchmark widely used in-memory databases to measure their performance in terms of (1) the time taken to complete operations and (2) how efficiently they use memory during operations. As per the results reported in this paper, there is no database that provides the best performance for all data operations. The results also show that even though an RDBMS stores its data in memory, its overall performance is worse than that of NoSQL databases.
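
    As an illustration of the kind of measurement described above, the hedged Python sketch below times a batch of writes against an in-memory store and samples process memory. It assumes the third-party redis and psutil packages and a Redis server on localhost; it is not the paper's actual benchmark harness.

```python
# A hedged sketch of the measurement style described above: time a batch of
# writes against an in-memory store and sample process memory. Assumes the
# third-party `redis` and `psutil` packages and a Redis server on localhost;
# the paper's actual harness and workloads are not reproduced here.
import os
import time

import psutil
import redis

r = redis.Redis(host="localhost", port=6379)
proc = psutil.Process(os.getpid())

start = time.perf_counter()
for i in range(10_000):
    r.set(f"key:{i}", "value")
elapsed = time.perf_counter() - start

print(f"10k SETs in {elapsed:.3f}s")
print(f"client RSS: {proc.memory_info().rss / 2**20:.1f} MiB")
```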

  11. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Science.gov (United States)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of these massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Based on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
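
    The complement-set idea can be made concrete with a toy sketch (not the MUSER implementation): only the absent records are stored, and presence is derived by complement. The (day, frame) key layout below is invented for illustration.

```python
# Toy illustration (not the MUSER code) of the negative-database idea: store
# only the complement -- the records that are absent -- and derive presence.
# The (day, frame) key layout is invented for illustration.
absent = {("2015-03-21", 17), ("2015-03-21", 42)}  # the only missing frames

def frame_exists(day: str, frame: int) -> bool:
    """Derive presence from the stored complement set."""
    return (day, frame) not in absent

print(frame_exists("2015-03-21", 17))  # False: listed as absent
print(frame_exists("2015-03-21", 18))  # True: derived, never stored
```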

  12. Pre-performance routines followed by free throw shooting accuracy in secondary basketball players

    Directory of Open Access Journals (Sweden)

    Phelps Ashley

    2015-12-01

    Study aim: The purpose of the current study was to determine whether existing pre-performance routines had an effect on free throw shooting accuracy in high school pupils as compared to shooting without a pre-performance routine.

  13. Interval of Routine Maintenance and Maintenance Performance: A Literature Review

    Directory of Open Access Journals (Sweden)

    Au-Yong Cheong Peng

    2016-01-01

    In high-rise residential buildings, the quality of facilities management services is significant to the normal operation of the facilities. Unfortunately, a lack of concern towards building maintenance, especially preventive maintenance, is evident in the domestic housing industry in Malaysia. The majority of condominium maintenance operations suffer from a lack of planning, a lack of a proactive maintenance plan, and a lack of proper implementation. Thus, this paper reviews the implementation of preventive maintenance strategy, routine maintenance in particular. An extensive review of literature published from 1987 to 2014 was performed for the purpose of this research. The publications are sourced from journal articles, conference proceedings and books. The literature analysis confirms that the routine maintenance of facilities and building services is vital and can influence maintenance performance. Subsequently, a theoretical framework is developed, which shows the relationship between routine maintenance of building facilities & services and maintenance performance. The building facilities & services are divided into two categories: essential facilities & services, which ensure the safety, health, habitability, and operability of buildings; and value-added facilities & services, which deal with property value, return on investment, and quality living. Based on the findings, future research is proposed to identify the appropriate maintenance routine for the facilities and services in high-rise residential buildings to improve maintenance performance.

  14. Factors for Radical Creativity, Incremental Creativity, and Routine, Noncreative Performance

    Science.gov (United States)

    Madjar, Nora; Greenberg, Ellen; Chen, Zheng

    2011-01-01

    This study extends theory and research by differentiating between routine, noncreative performance and 2 distinct types of creativity: radical and incremental. We also use a sensemaking perspective to examine the interplay of social and personal factors that may influence a person's engagement in a certain level of creative action versus routine,…

  15. Building Micro-Foundations for the Routines, Capabilities, and Performance Links

    DEFF Research Database (Denmark)

    Abell, Peter; Felin, Teppo; Foss, Nicolai Juul

    2007-01-01

    a neglect of micro-foundations - is incomplete. There are no mechanisms that work solely on the macro-level, directly connecting routines and capabilities to firm-level outcomes. While routines and capabilities are useful shorthand for complicated patterns of individual action and interaction, ultimately...... they are best understood at the micro-level. Second, we provide a formal model that shows precisely why macro explanation is incomplete and which exemplifies how explicit micro-foundations may be built for notions of routines and capabilities and for how these impact firm performance....

  16. Reliability database development for use with an object-oriented fault tree evaluation program

    Science.gov (United States)

    Heger, A. Sharif; Harringtton, Robert J.; Koen, Billy V.; Patterson-Hine, F. Ann

    1989-01-01

    A description is given of the development of a fault-tree analysis method using object-oriented programming. In addition, the authors discuss the programs that have been developed or are under development to connect a fault-tree analysis routine to a reliability database. To assess the performance of the routines, a relational database simulating one of the nuclear power industry databases has been constructed. For a realistic assessment of the results of this project, the use of one of the existing nuclear power reliability databases is planned.

  17. Exploring the impact of staff absenteeism on patient satisfaction using routine databases in a university hospital.

    Science.gov (United States)

    Duclay, E; Hardouin, J B; Sébille, V; Anthoine, E; Moret, L

    2015-10-01

    To explore the influence of staff absenteeism on patient satisfaction using the indicators available in management reports. Among the factors explaining patient satisfaction, human resource indicators have been studied widely in terms of burnout or job satisfaction, but few studies have examined absenteeism indicators. A multilevel analysis was conducted using two routinely compiled databases from 2010 in the clinical departments of a university hospital (France). The staff database monitored absenteeism for short-term medical reasons (5 days or less), non-medical reasons and absences starting at the weekend. The patient satisfaction database was established at the time of discharge. Patient satisfaction related to relationships with staff was significantly and negatively correlated with nurse absenteeism for non-medical reasons, with absenteeism starting at weekends and with absenteeism for short-term medical reasons. These findings extend the study of absenteeism and should lead to a better understanding of the impact of human resources on patient satisfaction. To enhance patient satisfaction, managers need to find a way to reduce staff absenteeism, in order to avoid burnout and to improve the atmosphere in the workplace. © 2014 John Wiley & Sons Ltd.

  18. A concept for routine emergency-care data-based syndromic surveillance in Europe.

    Science.gov (United States)

    Ziemann, A; Rosenkötter, N; Garcia-Castrillo Riesgo, L; Schrell, S; Kauhl, B; Vergeiner, G; Fischer, M; Lippert, F K; Krämer, A; Brand, H; Krafft, T

    2014-11-01

    We developed a syndromic surveillance (SyS) concept using emergency dispatch, ambulance and emergency-department data from different European countries. Based on an inventory of sub-national emergency data availability in 12 countries, we propose framework definitions for specific syndromes and a SyS system design. We tested the concept by retrospectively applying cumulative sum and spatio-temporal cluster analyses for the detection of local gastrointestinal outbreaks in four countries and comparing the results with notifiable disease reporting. Routine emergency data was available daily and electronically in 11 regions, following a common structure. We identified two gastrointestinal outbreaks in two countries; one was confirmed as a norovirus outbreak. We detected 1/147 notified outbreaks. Emergency-care data-based SyS can supplement local surveillance with near real-time information on gastrointestinal patients, especially in special circumstances, e.g. foreign tourists. It most likely cannot detect the majority of local gastrointestinal outbreaks with few, mild or dispersed cases.

  19. Benchmarking database performance for genomic data.

    Science.gov (United States)

    Khushi, Matloob

    2015-06-01

    Genomic regions represent features such as gene annotations, transcription factor binding sites and epigenetic modifications. Performing various genomic operations, such as identifying overlapping/non-overlapping regions or nearest gene annotations, is a common research need. The data can be saved in a database system for easy management; however, no comprehensive built-in database algorithm currently exists to identify overlapping regions. Therefore I have developed a novel region-mapping (RegMap) SQL-based algorithm to perform genomic operations and have benchmarked the performance of different databases. Benchmarking identified that PostgreSQL extracts overlapping regions much faster than MySQL. Insertion and data uploads in PostgreSQL were also better, although the general searching capability of both databases was almost equivalent. In addition, using the algorithm pair-wise, overlaps of >1000 datasets of transcription factor binding sites and histone marks, collected from previous publications, were reported, and it was found that HNF4G significantly co-locates with cohesin subunit STAG1 (SA1). © 2015 Wiley Periodicals, Inc.
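
    Region overlap queries of the kind benchmarked above build on a standard interval-overlap predicate. The hedged sketch below issues it from Python with psycopg2 against hypothetical tables; it is not the published RegMap algorithm itself.

```python
# The standard interval-overlap predicate that region queries of this kind
# build on, issued from Python with psycopg2. Table and column names are
# hypothetical; this is not the published RegMap algorithm itself.
import psycopg2

conn = psycopg2.connect("dbname=genomics")
cur = conn.cursor()
# Two regions overlap exactly when each one starts before the other ends.
cur.execute("""
    SELECT a.id, b.id
    FROM tf_sites AS a
    JOIN histone_marks AS b
      ON a.chrom = b.chrom
     AND a.start_pos <= b.end_pos
     AND b.start_pos <= a.end_pos
""")
for tf_id, mark_id in cur.fetchmany(5):
    print(tf_id, mark_id)
```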

  20. Database for Simulation of Electron Spectra for Surface Analysis (SESSA)

    Science.gov (United States)

    SRD 100 Database for Simulation of Electron Spectra for Surface Analysis (SESSA) (PC database for purchase)   This database has been designed to facilitate quantitative interpretation of Auger-electron and X-ray photoelectron spectra and to improve the accuracy of quantitation in routine analysis. The database contains all physical data needed to perform quantitative interpretation of an electron spectrum for a thin-film specimen of given composition. A simulation module provides an estimate of peak intensities as well as the energy and angular distributions of the emitted electron flux.

  1. Performance Enhancements for Advanced Database Management Systems

    OpenAIRE

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  2. Relational database hybrid model, of high performance and storage capacity for nuclear engineering applications

    International Nuclear Information System (INIS)

    Gomes Neto, Jose

    2008-01-01

    The objective of this work is to present the relational database named FALCAO. It was created and implemented to support the storage of the monitored variables in the IEA-R1 research reactor, located at the Instituto de Pesquisas Energeticas e Nucleares, IPEN/CNEN-SP. The logical data model and its direct influence on the integrity of the provided information are carefully considered. The concepts and steps of normalization and denormalization, including the entities and relations involved in the logical model, are presented. The effects of the model rules on the acquisition, loading and availability of the final information are also presented from a performance standpoint, since the acquisition process loads and provides large amounts of information at short intervals. The SACD application, through its functionalities, presents the information stored in the FALCAO database in a practical and optimized form. The implementation of the FALCAO database was successful and its existence leads to a considerably favorable situation. It is now essential to the routine of the researchers involved, not only due to the substantial improvement of the process but also due to the reliability associated with it. (author)

  3. Solid Waste Projection Model: Database User's Guide

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established.

  4. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  5. A trending database for human performance events

    International Nuclear Information System (INIS)

    Harrison, D.

    1993-01-01

    An effective Operations Experience program includes a standardized methodology for the investigation of unplanned events and a tool capable of retaining investigation data for the purpose of trending analysis. A database used in conjunction with a formalized investigation procedure for the purpose of trending unplanned event data is described. The database follows the structure of INPO's Human Performance Enhancement System (HPES) for investigations; the database's on-line screens duplicate the HPES evaluation forms. All information pertaining to investigations is collected, retained and entered into the database using these forms. The database will be used for trending analysis to determine whether any significant patterns exist, for tracking progress over time both within AECL and against industry standards, and for evaluating the success of corrective actions. Trending information will be used to help prevent similar occurrences.

  6. Performance assessment of EMR systems based on post-relational database.

    Science.gov (United States)

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all system users to access data, with a fast response time, anywhere and at any time. Performance tests of databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the identical database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed perfect performance in the real-time EMR system.

  7. Creating Masterpieces: How Course Structures and Routines Enable Student Performance

    Science.gov (United States)

    Dean, Kathy Lund; Fornaciari, Charles J.

    2014-01-01

    Over a five-year period, we made a persistent observation: Course structures and routines, such as assignment parameters, student group process rules, and grading schemes were being consistently ignored. As a result, we got distracted by correcting these structural issues and were spending less time on student assignment performance. In this…

  8. Constructing a population-based research database from routine maternal screening records: a resource for studying alloimmunization in pregnant women.

    Directory of Open Access Journals (Sweden)

    Brian K Lee

    BACKGROUND: Although screening for maternal red blood cell antibodies during pregnancy is a standard procedure, the prevalence and clinical consequences of non-anti-D immunization are poorly understood. The objective was to create a national database of maternal antibody screening results that can be linked with population health registers to create a research resource for investigating these issues. STUDY DESIGN AND METHODS: Each birth in the Swedish Medical Birth Register was uniquely identified and linked to the text stored in routine maternal antibody screening records in the time window from 9 months prior to 2 weeks after the delivery date. These text records were subjected to a computerized search for specific antibodies using regular expressions. To illustrate the research potential of the resulting database, selected antibody prevalence rates are presented as tables and figures, and the complete data (from more than 60 specific antibodies) presented as online moving graphical displays. RESULTS: More than one million (1,191,761) births with valid screening information from 1982-2002 constitute the study population. Computerized coverage of screening increased steadily over time and varied by region as electronic records were adopted. To ensure data quality, we restricted analysis to birth records in areas and years with a sustained coverage of at least 80%, representing 920,903 births from 572,626 mothers in 17 of the 24 counties in Sweden. During the study period, non-anti-D and anti-D antibodies occurred in 76.8/10,000 and 14.1/10,000 pregnancies respectively, with marked differences between specific antibodies over time. CONCLUSION: This work demonstrates the feasibility of creating a nationally representative research database from the routine maternal antibody screening records from an extended calendar period. By linkage with population registers of maternal and child health, such data are a valuable resource for addressing important
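
    As an illustration of the computerized free-text search described above, the short Python sketch below applies a regular expression to a screening-record string. The pattern and the sample record are invented, not the study's actual expressions.

```python
# Illustrative sketch of a computerized search over free-text screening
# records using regular expressions. The pattern and the sample record are
# invented; the study's actual expressions are not reproduced here.
import re

ANTI_K = re.compile(r"\banti[-\s]?K(?:ell)?\b", re.IGNORECASE)

record = "Screening positive: anti-Kell identified at 28 weeks."
if ANTI_K.search(record):
    print("anti-K antibody detected in this record")
```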

  9. Basic database performance tuning - developer's perspective

    CERN Document Server

    Kwiatek, Michal

    2008-01-01

    This lecture discusses selected database performance issues from the developer's point of view: connection overhead, bind variables and SQL injection, making most of the optimizer with up-to-date statistics, reading execution plans. Prior knowledge of SQL is expected.

  10. Absorptive routines and international patent performance

    Directory of Open Access Journals (Sweden)

    Fernando E. García-Muiña

    2017-04-01

    We enrich the treatment of the absorptive capacity phases by including the moderating effects between routines associated with the traditional potential and realized absorptive capacities. Taking into account external knowledge search strategies, the deeper the external relationships, the better the transference and appropriation of specific external knowledge. Nevertheless, when the moderating role of assimilation is included, cooperation agreements appear as the most efficient source of external knowledge. Finally, we show that technological tools let firms store and structure information, making its use for international patenting easier. This positive effect is reinforced in the presence of exploitation routines, since technological knowledge will better fit the industry's key factors of success.

  11. OPERA-a human performance database under simulated emergencies of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea

    2007-01-01

    In complex systems such as the nuclear and chemical industries, the importance of human performance related problems is well recognized. Thus a lot of effort has been spent on this area, and one of the main streams for unraveling human performance related problems is the execution of HRA. Unfortunately, a lack of prerequisite information has been pointed out as the most critical problem in conducting HRA. From this necessity, the OPERA database, which provides operators' performance data obtained under simulated emergencies, has been developed. In this study, typical operators' performance data that are available from the OPERA database are briefly explained. After that, in order to ensure the appropriateness of the OPERA database, operators' performance data from OPERA are compared with those of other studies and real events. As a result, it is believed that the operators' performance data of the OPERA database are fairly comparable to those of other studies and real events. Therefore it is meaningful to expect that the OPERA database can be used as a serviceable data source for scrutinizing human performance related problems, including HRA.

  12. Solid waste projection model: Database user's guide (Version 1.0)

    International Nuclear Information System (INIS)

    Carr, F.; Stiles, D.

    1991-01-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for preparing to use Version 1 of the SWPM database, for entering and maintaining data, and for performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions, and does not provide instructions in the use of Paradox, the database management system in which the SWPM database is established. 3 figs., 1 tab

  13. Solid Waste Projection Model: Database user's guide (Version 1.3)

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1991-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for preparing to use Version 1.3 of the SWPM database, for entering and maintaining data, and for performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established

  14. Improved Information Retrieval Performance on SQL Database Using Data Adapter

    Science.gov (United States)

    Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.

    2018-02-01

    NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications increases. Most systems still use relational databases (RDBs), but as the amount of data grows each year, systems increasingly handle big data with NoSQL databases to analyze and access data more quickly. NoSQL emerged as a result of the exponential growth of the internet and the development of web applications. The query syntax in a NoSQL database differs from that of a SQL database, therefore requiring code changes in the application. Data adapters allow applications to keep their SQL query syntax unchanged by providing methods that synchronize SQL databases with NoSQL databases. In addition, the data adapter provides an interface which applications can access to run SQL queries. Hence, this research applied a data adapter system to synchronize data between a MySQL database and Apache HBase using a direct access query approach, where the system allows the application to accept queries while the synchronization process is in progress. From the tests performed using the data adapter, the results show that the data adapter can synchronize between the SQL database, MySQL, and the NoSQL database, Apache HBase. The system's memory usage ranged from 40% to 60%, and processor utilization from 10% to 90%. In addition, the NoSQL database performed better than the SQL database for information retrieval.
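
    The adapter idea can be sketched briefly: the application keeps issuing unchanged SQL while the adapter is free to mirror writes to a NoSQL store. In this hedged Python sketch the class and method names are hypothetical, and sqlite3 stands in for MySQL so the example stays self-contained.

```python
# Minimal sketch of the data-adapter idea: the application keeps issuing
# unchanged SQL while the adapter can mirror writes to a NoSQL store in the
# background. Class and method names are hypothetical; sqlite3 stands in for
# MySQL so the sketch stays self-contained (the paper targets Apache HBase).
import sqlite3

class DataAdapter:
    def __init__(self, sql_conn, nosql_table=None):
        self.sql = sql_conn
        self.nosql = nosql_table  # e.g. an HBase table in the real system

    def execute(self, query, params=()):
        """Accept SQL from the application, even while syncing."""
        cur = self.sql.execute(query, params)
        self.sql.commit()
        if self.nosql is not None:
            pass  # the real adapter would translate and replay the write here
        return cur

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
adapter = DataAdapter(conn)
adapter.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
print(adapter.execute("SELECT * FROM users").fetchall())
```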

  15. Discrete Optimization of Internal Part Structure via SLM Unit Structure-Performance Database

    Directory of Open Access Journals (Sweden)

    Li Tang

    2018-01-01

    The structural optimization of the internal structure of parts based on three-dimensional (3D) printing has been recognized as being important in the field of mechanical design. The purpose of this paper is to present the creation of a unit structure-performance database based on selective laser melting (SLM), which contains various structural units with different functions and records their structure and performance characteristics so that the internal structure of parts can be optimized directly according to the database. The method of creating the unit structure-performance database is introduced in this paper and several structural units of the database are described. The bow structure unit is used as an example to show how to create the structure-performance database of a unit. Samples of the bow structure unit were designed and manufactured by SLM and tested in a WDW-100 compression testing machine to obtain their performance characteristics. The paper then collected all data regarding unit structure parameters, weight, and performance characteristics, and established a complete set of data from the bow structure unit for the unit structure-performance database. Furthermore, an aircraft part was conveniently reconstructed to be more lightweight according to the unit structure-performance database. Its weight was reduced by 36.8% compared with the original structure, while the strength far exceeded the requirements.

  16. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  17. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure

  18. High performance technique for database applications using a hybrid GPU/CPU platform

    KAUST Repository

    Zidan, Mohammed A.

    2012-07-28

    Many database applications, such as sequence comparison, sequence searching, and sequence matching, process large database sequences. We introduce a novel and efficient technique to improve the performance of database applications by using a hybrid GPU/CPU platform. In particular, our technique solves the problem of the low efficiency resulting from running short-length sequences in a database on a GPU. To verify our technique, we applied it to the widely used Smith-Waterman algorithm. The experimental results show that our hybrid GPU/CPU technique improves the average performance by a factor of 2.2, and improves the peak performance by a factor of 2.8 when compared to earlier implementations. Copyright © 2011 by ASME.
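
    The core of the technique, as the abstract describes it, is routing short sequences away from the GPU. The Python sketch below shows only that routing policy; the cutoff and function names are invented, not the authors' implementation.

```python
# Hedged sketch of the routing policy implied above: short sequences go to
# the CPU, long ones to the GPU. The 200-residue cutoff and function names
# are invented for illustration and are not the authors' implementation.
SHORT_SEQ_THRESHOLD = 200  # residues; illustrative cutoff

def schedule(sequences):
    """Split a workload into CPU and GPU batches by sequence length."""
    cpu_batch = [s for s in sequences if len(s) < SHORT_SEQ_THRESHOLD]
    gpu_batch = [s for s in sequences if len(s) >= SHORT_SEQ_THRESHOLD]
    return cpu_batch, gpu_batch

cpu_batch, gpu_batch = schedule(["MKVL", "M" * 500])
print(len(cpu_batch), "sequence(s) to CPU,", len(gpu_batch), "to GPU")
```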

  19. Performance analysis of different database in new internet mapping system

    Science.gov (United States)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. In order to better deal with a large number of mapping-entry update and query requests, the Mapping System of the New Internet must use a high-performance database. In this paper, we focus on the performance of three typical databases: Redis, SQLite, and MySQL. The results show that a Mapping System based on different databases can adapt to different needs according to the actual situation.

  20. Electrotactile feedback improves performance and facilitates learning in the routine grasping task

    Directory of Open Access Journals (Sweden)

    Milica Isaković

    2016-06-01

    The aim of this study was to investigate the feasibility of electrotactile feedback in closed-loop training of force control during a routine grasping task. The feedback was provided using an array electrode and a simple six-level spatial coding, and the experiment was conducted in three amputee subjects. The psychometric tests confirmed that the subjects could perceive and interpret the electrotactile feedback with a high success rate. The subjects performed the routine grasping task comprising 4 blocks of 60 grasping trials. In each trial, the subjects employed feedforward control to close the hand and produce the desired grasping force (four levels). The first (baseline) and the last (validation) sessions were performed in open loop, while the second and the third sessions (training) included electrotactile feedback. The obtained results confirmed that using the feedback improved the accuracy and precision of the force control. In addition, the subjects performed significantly better in the validation vs. baseline session, suggesting that electrotactile feedback can be used for learning and training of myoelectric control.

  1. Electrotactile Feedback Improves Performance and Facilitates Learning in the Routine Grasping Task.

    Science.gov (United States)

    Isaković, Milica; Belić, Minja; Štrbac, Matija; Popović, Igor; Došen, Strahinja; Farina, Dario; Keller, Thierry

    2016-06-13

    The aim of this study was to investigate the feasibility of electrotactile feedback in closed-loop training of force control during a routine grasping task. The feedback was provided using an array electrode and a simple six-level spatial coding, and the experiment was conducted in three amputee subjects. The psychometric tests confirmed that the subjects could perceive and interpret the electrotactile feedback with a high success rate. The subjects performed the routine grasping task comprising 4 blocks of 60 grasping trials. In each trial, the subjects employed feedforward control to close the hand and produce the desired grasping force (four levels). The first (baseline) and the last (validation) sessions were performed in open loop, while the second and the third sessions (training) included electrotactile feedback. The obtained results confirmed that using the feedback improved the accuracy and precision of the force control. In addition, the subjects performed significantly better in the validation vs. baseline session, suggesting that electrotactile feedback can be used for learning and training of myoelectric control.
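
    A six-level spatial code of the kind used in both studies above can be sketched as a simple quantizer from measured force to an electrode-pad index. The thresholds and pad layout below are invented, not the study's parameters.

```python
# A hedged sketch of six-level spatial coding: the measured grasping force is
# quantized to one of six levels, each mapped to a different electrode pad.
# Thresholds and the pad layout are invented, not the study's parameters.
def force_to_pad(force: float, f_max: float) -> int:
    """Quantize a force in [0, f_max] to an electrode pad index 0..5."""
    level = int(6 * force / f_max)
    return min(5, max(0, level))

for f in (0.5, 2.0, 4.9):
    print(f"{f} N -> pad {force_to_pad(f, f_max=5.0)}")
```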

  2. Managing XML Data to optimize Performance into Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    Iuliana BOTHA

    2011-06-01

    Full Text Available This paper propose some possibilities for manage XML data in order to optimize performance into object-relational databases. It is detailed the possibility of storing XML data into such databases, using for exemplification an Oracle database and there are tested some optimizing techniques of the queries over XMLType tables, like indexing and partitioning tables.

  3. Uses and limitations of registry and academic databases.

    Science.gov (United States)

    Williams, William G

    2010-01-01

    A database is simply a structured collection of information. A clinical database may be a Registry (a limited amount of data for every patient undergoing heart surgery) or Academic (an organized and extensive dataset of an inception cohort of a carefully selected subset of patients). A registry and an academic database have different purposes and costs. The data to be collected for a database are defined by its purpose and the output reports required for achieving that purpose. A Registry's purpose is to ensure quality care; an Academic Database's, to discover new knowledge through research. A database is only as good as the data it contains. Database personnel must be exceptionally committed and supported by clinical faculty. A system to routinely validate and verify data integrity is essential to ensure database utility. Frequent use of the database improves its accuracy. For congenital heart surgeons, routine use of a Registry Database is an essential component of clinical practice. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  4. Generating Novelty Through Interdependent Routines: A Process Model of Routine Work

    NARCIS (Netherlands)

    Deken, F.; Carlile, P.R.; Berends, H.; Lauche, K.

    2016-01-01

    We investigate how multiple actors accomplish interdependent routine performances directed at novel intended outcomes and how this affects routine dynamics over time. We report findings from a longitudinal ethnographic study in an automotive company where actors developed a new business model around

  5. Routine perinatal and paediatric post-mortem radiography: detection rates and implications for practice

    Energy Technology Data Exchange (ETDEWEB)

    Arthurs, Owen J. [NHS Foundation Trust, Department of Radiology Great Ormond Street Hospital for Children, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Calder, Alistair D. [NHS Foundation Trust, Department of Radiology Great Ormond Street Hospital for Children, London (United Kingdom); Kiho, Liina [Camelia Botnar Laboratories Great Ormond Street Hospital for Children, Department of Paediatric Pathology, London (United Kingdom); Taylor, Andrew M. [Great Ormond Street Hospital for Children, Cardiorespiratory Unit, London (United Kingdom); UCL Institute of Cardiovascular Science, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom); Sebire, Neil J. [Camelia Botnar Laboratories Great Ormond Street Hospital for Children, Department of Paediatric Pathology, London (United Kingdom); University College London, Institute of Child Health, London (United Kingdom)

    2014-03-15

    Routine perinatal and paediatric post-mortem plain radiography allows for the diagnosis and assessment of skeletal dysplasias, fractures and other bony abnormalities. The aim of this study was to review the diagnostic yield of this practice. We identified 1,027 cases performed in a single institution over a 2½-year period, including babygrams (whole-body examinations) and full skeletal surveys. Images were reported prior to autopsy in all cases. Radiology findings were cross-referenced with the autopsy findings using an autopsy database. We scored each case from 0 to 4 according to the level of diagnostic usefulness. The overall abnormality rate was 126/1,027 (12.3%). There was a significantly higher rate of abnormality when a skeletal survey was performed (18%) rather than a babygram (10%; P < 0.01); 90% (665/739) of babygrams were normal. Of the 74 abnormal babygrams, we found 33 incidental non-contributory cases, 19 contributory, 20 diagnostic, and 2 false-positive cases. There were only 2 cases out of 739 (0.27%) in whom routine post-mortem imaging identified potentially significant abnormalities that would not have been detected if only selected imaging had been performed. A policy of performing selected, rather than routine, foetal post-mortem radiography could result in a significant cost saving. Routine post-mortem paediatric radiography in foetuses and neonates is neither diagnostically useful nor cost-effective. A more evidence-based, selective protocol should yield significant cost savings. (orig.)

  6. Routine perinatal and paediatric post-mortem radiography: detection rates and implications for practice

    International Nuclear Information System (INIS)

    Arthurs, Owen J.; Calder, Alistair D.; Kiho, Liina; Taylor, Andrew M.; Sebire, Neil J.

    2014-01-01

    Routine perinatal and paediatric post-mortem plain radiography allows for the diagnosis and assessment of skeletal dysplasias, fractures and other bony abnormalities. The aim of this study was to review the diagnostic yield of this practice. We identified 1,027 cases performed in a single institution over a 2½-year period, including babygrams (whole-body examinations) and full skeletal surveys. Images were reported prior to autopsy in all cases. Radiology findings were cross-referenced with the autopsy findings using an autopsy database. We scored each case from 0 to 4 according to the level of diagnostic usefulness. The overall abnormality rate was 126/1,027 (12.3%). There was a significantly higher rate of abnormality when a skeletal survey was performed (18%) rather than a babygram (10%; P < 0.01); 90% (665/739) of babygrams were normal. Of the 74 abnormal babygrams, we found 33 incidental non-contributory cases, 19 contributory, 20 diagnostic, and 2 false-positive cases. There were only 2 cases out of 739 (0.27%) in whom routine post-mortem imaging identified potentially significant abnormalities that would not have been detected if only selected imaging had been performed. A policy of performing selected, rather than routine, foetal post-mortem radiography could result in a significant cost saving. Routine post-mortem paediatric radiography in foetuses and neonates is neither diagnostically useful nor cost-effective. A more evidence-based, selective protocol should yield significant cost savings. (orig.)

  7. Designing a database for performance assessment: Lessons learned from WIPP

    International Nuclear Information System (INIS)

    Martell, M.A.; Schenker, A.

    1997-01-01

    The Waste Isolation Pilot Plant (WIPP) Compliance Certification Application (CCA) Performance Assessment (PA) used a relational database that was originally designed only to supply the input parameters required for implementation of the PA codes. Reviewers used the database as a point of entry to audit quality assurance measures for control, traceability, and retrievability of input information used for analysis, and output/work products. During these audits it became apparent that modifications to the architecture and scope of the database would benefit the EPA regulator and other stakeholders when reviewing the recertification application. This paper contains a discussion of the WIPP PA CCA database and lessons learned for designing a database.

  8. Data Preparation Process for the Buildings Performance Database

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Travis; Dunn, Laurel; Mercado, Andrea; Brown, Richard E.; Mathew, Paul

    2014-06-30

    The Buildings Performance Database (BPD) includes empirically measured data from a variety of data sources with varying degrees of data quality and data availability. The purpose of the data preparation process is to maintain data quality within the database and to ensure that all database entries have sufficient data for meaningful analysis and for the database API. Data preparation is a systematic process of mapping data into the Building Energy Data Exchange Specification (BEDES), cleansing data using a set of criteria and rules of thumb, and deriving values such as energy totals and dominant asset types. Data preparation takes the most effort and time; therefore, most of the cleansing process has been automated. The process also needs to adapt as more data are contributed to the BPD and as building technologies change over time. The data preparation process is an essential step between data contributed by providers and data published to the public in the BPD.

  9. A database for on-line event analysis on a distributed memory machine

    CERN Document Server

    Argante, E; Van der Stok, P D V; Willers, Ian Malcolm

    1995-01-01

    Parallel in-memory databases can enhance the structuring and parallelization of programs used in High Energy Physics (HEP). Efficient database access routines are used as communication primitives which hide the communication topology, in contrast to more explicit communication libraries like PVM or MPI. A parallel in-memory database, called SPIDER, has been implemented on a 32-node Meiko CS-2 distributed memory machine. The SPIDER primitives generate lower overhead than those of PVM or MPI. The event reconstruction program CPREAD of the CPLEAR experiment has been used as a test case. Performance measurements were carried out at the event rate generated by CPLEAR.

  10. Sorption, Diffusion and Solubility Databases for Performance Assessment

    International Nuclear Information System (INIS)

    Garcia Gutierrez, M.

    2000-01-01

    This report presents deterministic and probabilistic databases for application in the Performance Assessment of high-level radioactive waste disposal. This work includes a theoretical description of the sorption, diffusion and solubility phenomena of radionuclides in geological media. The report presents and compares the databases of different nuclear waste management agencies, describes the materials in the Spanish reference system, and gives the results for sorption, diffusion and solubility in this system, with both the deterministic and probabilistic approximations. The probabilistic approximation is presented in the form of probability density functions (pdf). (Author) 52 refs
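    As a brief illustration of the probabilistic approximation described above — representing a parameter such as the sorption distribution coefficient Kd as a probability density function and sampling it for performance-assessment calculations — the following Python sketch uses an assumed lognormal pdf; the median and spread are invented for illustration, not values from the report.

        import numpy as np

        rng = np.random.default_rng(seed=42)

        # Hypothetical Kd (sorption distribution coefficient) for one radionuclide,
        # represented as a lognormal pdf; the parameters are illustrative only.
        median_kd = 0.5   # m^3/kg, assumed median
        gsd = 3.0         # assumed geometric standard deviation

        kd_samples = rng.lognormal(mean=np.log(median_kd), sigma=np.log(gsd), size=10_000)
        print("5th/50th/95th percentiles of Kd:", np.percentile(kd_samples, [5, 50, 95]))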

  11. Development of comprehensive material performance database for nuclear applications

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime

    1993-01-01

    This paper introduces the present status of the comprehensive material performance database for nuclear applications, named the JAERI Material Performance Database (JMPD), and examples of its utilization. The JMPD has been developed since 1986 in JAERI with a view to utilizing various kinds of characteristics data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. In order to improve the user-friendliness of the retrieval system, menu-selection-type procedures have been developed in which knowledge of the system or the data structures is not required of end-users. As to utilization of the JMPD, two types of data analyses are mentioned as follows: (1) A series of statistical analyses was performed in order to estimate the design values of both the yield strength (Sy) and the tensile strength (Su) for aluminum alloys which are widely used as structural materials for research reactors. (2) Statistical analyses were accomplished using the cyclic crack growth rate data for nuclear pressure vessel steels, and comparisons were made on the variability and/or reproducibility of the data obtained by ΔK-increasing and ΔK-constant type tests. (author)

  12. Comparison of Cloud backup performance and costs in Oracle database

    OpenAIRE

    Aljaž Zrnec; Dejan Lavbič

    2011-01-01

    Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Oracle database u...

  13. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.

  14. Performance Evaluation of Cloud Database and Traditional Database in terms of Response Time while Retrieving the Data

    OpenAIRE

    Donkena, Kaushik; Gannamani, Subbarayudu

    2012-01-01

    Context: There has been an exponential growth in the size of databases in recent times and the same amount of growth is expected in the future. There has been a firm drop in the storage cost followed by a rapid increase in the storage capacity. The entry of Cloud in recent times has changed the equations. The performance of the database plays a vital role in the competition. In this research, an attempt has been made to evaluate and compare the performance of the traditional data...

  15. A database for human performance under simulated emergencies of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2005-01-01

    Reliable human performance is a prerequisite in securing the safety of complicated process systems such as nuclear power plants. However, the amount of available knowledge that can explain why operators deviate from an expected performance level is small because of the infrequency of real accidents. Therefore, in this study, a database that contains a set of useful information extracted from simulated emergencies was developed in order to provide important clues for understanding the change of operators' performance under stressful conditions (i.e., real accidents). The database was developed under the Microsoft Windows™ environment using Microsoft Access 97™ and Microsoft Visual Basic 6.0™. In the database, operators' performance data obtained from the analysis of over 100 audio-visual records of simulated emergencies were stored using twenty kinds of distinctive data fields. A total of ten kinds of operators' performance data are available from the developed database. Although it is still difficult to predict operators' performance under stressful conditions based on the results of simulated emergencies, simulation studies remain the most feasible way to scrutinize performance. Accordingly, it is expected that the performance data of this study will provide a concrete foundation for understanding the change of operators' performance in emergency situations

  16. GUC100 multisensor fingerprint database for in-house (semipublic) performance test

    OpenAIRE

    Gafurov D.; Bours P.; Yang B.; Busch C.

    2010-01-01

    For evaluation of the biometric performance of biometric components and systems, the availability of independent databases and desirably independent evaluators is important. Both databases of significant size and independent testing institutions provide the precondition for fair and unbiased benchmarking. In order to show the generalization capabilities of the system under test, it is essential that algorithm developers do not have access to the testing database, and thus the risk of tuned algorithms...

  17. Comparison of Cloud backup performance and costs in Oracle database

    Directory of Open Access Journals (Sweden)

    Aljaž Zrnec

    2011-06-01

    Full Text Available Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Oracle database using Amazon S3 and EC2 cloud services. The costs estimation was performed on the basis of the prices published on Amazon S3 and Amazon EC2 sites.

  18. Virtual microscopy: an evaluation of its validity and diagnostic performance in routine histologic diagnosis of skin tumors

    DEFF Research Database (Denmark)

    Nielsen, Patricia Switten; Lindebjerg, Jan; Rasmussen, Jan

    2010-01-01

    Digitization of histologic slides is associated with many advantages, and its use in routine diagnosis holds great promise. Nevertheless, few articles evaluate virtual microscopy in routine settings. This study is an evaluation of the validity and diagnostic performance of virtual microscopy … in routine histologic diagnosis of skin tumors. Our aim is to investigate whether conventional microscopy of skin tumors can be replaced by virtual microscopy. Ninety-six skin tumors and skin-tumor-like changes were consecutively gathered over a 1-week period. Specimens were routinely processed, and digital … slides were captured on Mirax Scan (Carl Zeiss MicroImaging, Göttingen, Germany). Four pathologists evaluated the 96 virtual slides and the associated 96 conventional slides twice with intermediate time intervals of at least 3 weeks. Virtual slides that caused difficulties were reevaluated to identify …

  19. Performance Evaluation of a Database System in a Multiple Backend Configurations,

    Science.gov (United States)

    1984-10-01

    leaving a system process, the internal performance measurements of MMSD have been carried out. Methodologies for constructing test databases... access directory data via the AT, EDIT, and CDT. In designing the test database, one of the key concepts is the choice of the directory attributes in... internal timing. These requests are selected since they retrieve the smallest portion of the test database and the processing time for each request is

  20. Bedtime routines child wellbeing & development.

    Science.gov (United States)

    Kitsaras, George; Goodwin, Michaela; Allan, Julia; Kelly, Michael P; Pretty, Iain A

    2018-03-21

    Bedtime routines have shown important associations with child wellbeing and development. Research into bedtime routines is limited, with studies mainly focusing on quality of sleep. The objectives of the present study were to examine the relationship between bedtime routines and a variety of factors associated with child wellbeing and to examine possible determinants of bedtime routines. A total of 50 families with children between 3 and 5 years old took part in the study. Data on bedtime routines, parenting styles, school readiness, children's dental health, and executive function were collected. Children in families with optimal bedtime routines showed better performance in terms of executive function, specifically working memory (t(44) = −8.51, p ≤ .001), inhibition and attention (t(48) = −9.70, p ≤ .001) and cognitive flexibility (t(48) = −13.1, p ≤ .001). Also, children in households with optimal bedtime routines scored higher in their readiness for school (t(48) = 6.92, p ≤ .001) and had better dental health (U = 85.5, p = .011). Parents in households with suboptimal bedtime routines showed worse performance on all measures of executive function, including working memory (t(48) = −10.47, p ≤ .001), inhibition-attention (t(48) = −10.50, p ≤ .001) and cognitive flexibility (t(48) = −13.6, p ≤ .001). Finally, parents with optimal bedtime routines for their children deployed a more positive parenting style in general (i.e. authoritative parenting) compared to those with suboptimal bedtime routines (t(48) = −6.45, p ≤ .001). The results of the present study highlight the potentially important role of bedtime routines in a variety of areas associated with child wellbeing and the need for further research.

  1. A high performance, ad-hoc, fuzzy query processing system for relational databases

    Science.gov (United States)

    Mansfield, William H., Jr.; Fleischman, Robert M.

    1992-01-01

    Database queries involving imprecise or fuzzy predicates are currently an evolving area of academic and industrial research. Such queries place severe stress on the indexing and I/O subsystems of conventional database environments since they involve the search of large numbers of records. The Datacycle architecture and research prototype is a database environment that uses filtering technology to perform an efficient, exhaustive search of an entire database. It has recently been modified to include fuzzy predicates in its query processing. The approach obviates the need for complex index structures, provides unlimited query throughput, permits the use of ad-hoc fuzzy membership functions, and provides a deterministic response time largely independent of query complexity and load. This paper describes the Datacycle prototype implementation of fuzzy queries and some recent performance results.
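    As a hedged sketch of the core idea — an ad-hoc fuzzy membership function scored against every record during an exhaustive scan, with no index structure — consider the following Python fragment; the record fields, membership function, and threshold are hypothetical, not the Datacycle implementation.

        def near(value, target, tolerance):
            """Triangular fuzzy membership: 1.0 at target, falling to 0.0 at +/- tolerance."""
            return max(0.0, 1.0 - abs(value - target) / tolerance)

        # Hypothetical records; an exhaustive scan scores every record.
        records = [{"id": 1, "price": 95}, {"id": 2, "price": 140}, {"id": 3, "price": 102}]

        # Fuzzy predicate: price "near 100", accepting membership grades >= 0.5.
        scored = [(r, near(r["price"], 100, 20)) for r in records]
        for record, grade in sorted(scored, key=lambda s: -s[1]):
            if grade >= 0.5:
                print(record["id"], round(grade, 2))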

  2. Understanding, modeling, and improving main-memory database performance

    OpenAIRE

    Manegold, S.

    2002-01-01

    During the last two decades, computer hardware has experienced remarkable developments. Especially CPU (clock-)speed has been following Moore's Law, i.e., doubling every 18 months; and there is no indication that this trend will change in the foreseeable future. Recent research has revealed that database performance, even with main-memory based systems, can hardly benefit from the ever increasing CPU power. The reason for this is that the performance of other hardware components h...

  3. Oracle database 12c release 2 in-memory tips and techniques for maximum performance

    CERN Document Server

    Banerjee, Joyjeet

    2017-01-01

    This Oracle Press guide shows, step-by-step, how to optimize database performance and cut transaction processing time using Oracle Database 12c Release 2 In-Memory. Oracle Database 12c Release 2 In-Memory: Tips and Techniques for Maximum Performance features hands-on instructions, best practices, and expert tips from an Oracle enterprise architect. You will learn how to deploy the software, use In-Memory Advisor, build queries, and interoperate with Oracle RAC and Multitenant. A complete chapter of case studies illustrates real-world applications. • Configure Oracle Database 12c and construct In-Memory enabled databases • Edit and control In-Memory options from the graphical interface • Implement In-Memory with Oracle Real Application Clusters • Use the In-Memory Advisor to determine what objects to keep In-Memory • Optimize In-Memory queries using groups, expressions, and aggregations • Maximize performance using Oracle Exadata Database Machine and In-Memory option • Use Swingbench to create d...

  4. Routines and Organizational Change

    DEFF Research Database (Denmark)

    Yi, Sangyoon; Becker, Markus; Knudsen, Thorbjørn

    2014-01-01

    Routines have been perceived as a source of inertia in the process of organizational change. In this study, we suggest an overlooked, but prevalent, mechanism by which the inertial nature of routines helps, rather than hinders, organizational adaptation. Routine-level inertia plays a hidden role … to cope with its task environment. In our nuanced perspective, inertia is not only a consequence of adaptation but also a source of adaptation. This logic is helpful to understand why reliable but apparently inertial organizations keep surviving and often exhibit outstanding performance. We conclude …

  5. Routine Responses to Disruption of Routines

    Science.gov (United States)

    Guha, Mahua

    2015-01-01

    "Organisational routines" is a widely studied research area. However, there is a dearth of research on disruption of routines. The few studies on disruption of routines discussed problem-solving activities that are carried out in response to disruption. In contrast, this study develops a theory of "solution routines" that are a…

  6. Federated or cached searches: providing expected performance from multiple invasive species databases

    Science.gov (United States)

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-01-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches have been proposed to allow users to search “deep” web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods, and show that federated searches will not provide the performance and flexibility required by users, and that a central cache of the data is required to improve performance.

  7. Application Program Interface for the Orion Aerodynamics Database

    Science.gov (United States)

    Robinson, Philip E.; Thompson, James

    2013-01-01

    The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The

  8. Oblique Chest Views as a Routine Part of Skeletal Surveys Performed for Possible Physical Abuse--Is This Practice Worthwhile?

    Science.gov (United States)

    Hansen, Karen Kirhofer; Prince, Jeffrey S.; Nixon, G. William

    2008-01-01

    Objective: To evaluate the utility of oblique chest views in the diagnosis of rib fractures when used as a routine part of the skeletal survey performed for possible physical abuse. Methods: Oblique chest views have been part of the routine skeletal survey protocol at Primary Children's Medical Center since October 2002. Dictated radiology reports…

  9. Database on Performance of Neutron Irradiated FeCrAl Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Field, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Briggs, Samuel A. [Univ. of Wisconsin, Madison, WI (United States); Littrell, Ken [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Parish, Chad M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Yamamoto, Yukinori [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-08-01

    The present report summarizes and discusses the database on radiation tolerance for Generation I, Generation II, and commercial FeCrAl alloys. This database has been built upon mechanical testing and microstructural characterization on selected alloys irradiated within the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory (ORNL) up to doses of 13.8 dpa at temperatures ranging from 200°C to 550°C. The structure and performance of these irradiated alloys were characterized using advanced microstructural characterization techniques and mechanical testing. The primary objective of developing this database is to enhance the rapid development of a mechanistic understanding on the radiation tolerance of FeCrAl alloys, thereby enabling informed decisions on the optimization of composition and microstructure of FeCrAl alloys for application as an accident tolerant fuel (ATF) cladding. This report is structured to provide a brief summary of critical results related to the database on radiation tolerance of FeCrAl alloys.

  10. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    International Nuclear Information System (INIS)

    Shao, Weber; Kupelian, Patrick A; Wang, Jason; Low, Daniel A; Ruan, Dan

    2014-01-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.

  11. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    Science.gov (United States)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
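    A minimal sketch of what this paradigm makes possible: with contours stored as PostGIS geometries, derived geometric quantities can be computed in SQL without retrieving and re-parsing the full structure set. The table, column names, and connection string below are hypothetical; it assumes PostgreSQL with the PostGIS extension and the psycopg2 driver.

        import psycopg2

        conn = psycopg2.connect("dbname=rt_structures")  # hypothetical database
        with conn, conn.cursor() as cur:
            # Each row is assumed to hold one contour as a polygon geometry.
            cur.execute("""
                SELECT roi_name,
                       SUM(ST_Area(contour_geom)) AS total_area,
                       ST_AsText(ST_Centroid(ST_Collect(contour_geom))) AS centroid
                FROM rt_contours
                WHERE study_id = %s
                GROUP BY roi_name;
            """, ("example-study",))
            for roi_name, area, centroid in cur.fetchall():
                print(roi_name, area, centroid)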

  12. Utility of routine postoperative chest radiography in pediatric tracheostomy.

    Science.gov (United States)

    Genther, Dane J; Thorne, Marc C

    2010-12-01

    Routine chest radiography following pediatric tracheostomy is commonly performed in order to evaluate for air-tracking complications. Routine chest radiography carries the disadvantages of radiation exposure and cost. The primary objective of this study was to determine the utility of routine postoperative chest radiography following tracheostomy in pediatric patients. Secondary objectives were to compare the rates of postoperative complications by various patient and surgeon characteristics. All infants and children 18 years of age or less (n=421) who underwent tracheostomy at a single tertiary-care medical center from January 2000 to April 2009 were included in the study. A combination of data obtained from billing and administrative systems and review of electronic medical records was recorded and compiled in a database for statistical analysis. Three air-tracking complications (2 pneumothoraces and 1 pneumomediastinum) were identified in our population of 421 pediatric patients, for an incidence of 0.71% (95% CI: 0.1-2.0%). No significant relationships were found between the incidence of air-tracking complications and surgical specialty, patient age, or type of procedure (elective, urgent/emergent). Our study identified a low rate of pneumothorax and pneumomediastinum following pediatric tracheostomy. In all three cases, the complication was suspected clinically. This finding suggests that postoperative chest radiography should be reserved for cases where there is suspicion of a complication on the basis of intraoperative findings or clinical parameters. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  13. Technique for comparing automatic quadrature routines

    Energy Technology Data Exchange (ETDEWEB)

    Lyness, J N; Kaganove, J J

    1976-02-01

    The present unconstrained proliferation of automatic quadrature routines is a phenomenon which is wasteful in human time and computing resources. At the root of the problem is an absence of generally acceptable standards or benchmarks for comparing or evaluating such routines. In this paper a general technique, based on the nature of the performance profile, is described which can be used for evaluation of routines.
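    A sketch of the performance-profile idea the paper builds on: drive a routine over a family of integrands at several requested tolerances and record the achieved error against the request. SciPy's quad stands in here as one routine under test; the integrand family and tolerances are illustrative.

        import numpy as np
        from scipy.integrate import quad

        def peaked(x, alpha):
            # Family of increasingly peaked integrands on [0, 1].
            return alpha / (1.0 + (alpha * (x - 0.5)) ** 2)

        for alpha in (1.0, 10.0, 100.0):
            exact = 2.0 * np.arctan(alpha / 2.0)   # analytic value of the integral
            for tol in (1e-3, 1e-6, 1e-9):
                result, _ = quad(peaked, 0.0, 1.0, args=(alpha,),
                                 epsabs=tol, epsrel=tol)
                # Record achieved error vs. requested tolerance.
                print(f"alpha={alpha:>6} tol={tol:.0e} err={abs(result - exact):.2e}")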

  14. TOF plotter - a program to perform routine analysis of time-of-flight mass spectral data

    International Nuclear Information System (INIS)

    Knippel, Brad C.; Padgett, Clifford W.; Marcus, R. Kenneth

    2004-01-01

    The main article discusses the operation and application of the program to mass spectral data files. This laboratory has recently reported the construction and characterization of a linear time-of-flight mass spectrometer (ToF-MS) utilizing a radio frequency glow discharge ionization source. Data acquisition and analysis was performed using a digital oscilloscope and Microsoft Excel, respectively. Presently, no software package is available that is specifically designed for time-of-flight mass spectral analysis that is not instrument dependent. While spreadsheet applications such as Excel offer tremendous utility, they can be cumbersome when repeatedly performing tasks which are too complex or too user intensive for macros to be viable. To address this situation and make data analysis a faster, simpler task, our laboratory has developed a Microsoft Windows-based software program coded in Microsoft Visual Basic. This program enables the user to rapidly perform routine data analysis tasks such as mass calibration, plotting and smoothing on x-y data sets. In addition to a suite of tools for data analysis, a number of calculators are built into the software to simplify routine calculations pertaining to linear ToF-MS. These include mass resolution, ion kinetic energy and single peak identification calculators. A detailed description of the software and its associated functions is presented followed by a characterization of its performance in the analysis of several representative ToF-MS spectra obtained from different GD-ToF-MS systems
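    As a hedged illustration of the mass-calibration step such a program performs: in a linear ToF instrument, m/z is proportional to the square of the flight time, m/z = a(t − t0)², so two known calibrant peaks determine the constants. The calibrant values below are invented for illustration.

        import numpy as np

        # Two hypothetical calibrant peaks: (flight time in microseconds, known m/z).
        calibrants = [(10.2, 63.0), (14.4, 126.0)]

        # Fit m/z = a * (t - t0)^2 by noting sqrt(m/z) = sqrt(a) * (t - t0),
        # which is linear in t.
        t = np.array([c[0] for c in calibrants])
        root_mz = np.sqrt([c[1] for c in calibrants])
        slope, intercept = np.polyfit(t, root_mz, 1)
        a, t0 = slope ** 2, -intercept / slope

        def mass(t_us):
            """Convert a flight time (microseconds) to m/z via the fitted calibration."""
            return a * (t_us - t0) ** 2

        print(round(mass(12.0), 1))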

  15. Compilation and comparison of radionuclide sorption databases used in recent performance assessments

    International Nuclear Information System (INIS)

    McKinley, I.G.; Scholtis, A.

    1992-01-01

    The aim of this paper is to review the radionuclide sorption databases which have been used in performance assessments published within the last decade. It was hoped that such a review would allow areas of consistency to be identified, possibly indicating nuclide/rock/water systems which are now well characterised. Inconsistencies, on the other hand, might indicate areas in which further work is required. This study followed on from a prior review of the various databases which had been used in Swiss performance assessments. The latter was, however, considerably simplified by the fact that the authors had been heavily involved in sorption database definition for these assessments. The first phase of the current study was based entirely on the available literature, and it was quickly evident that the analyses would be much more complex (and time consuming) than initially envisaged. While some assessments clearly list all sorption data used, others depend on secondary literature (which may or may not be clearly referenced) or present sorption data which have been transmogrified into another form (e.g. into a retardation factor; cf. the following section). This study focused on databases used (or intended for use) in performance assessments published within the last 10 years or so. 45 refs., 12 tabs., 1 fig
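    For reference, the transformation alluded to above is, in the standard porous-medium formulation (a textbook relation, not quoted from the paper), the conversion of a distribution coefficient K_d into a retardation factor using the bulk density and porosity of the medium:

        R_f = 1 + \frac{\rho_b}{\theta}\, K_d

    where \rho_b is the bulk density and \theta the porosity. A database that reports only R_f therefore embeds assumed values of \rho_b and \theta, which is one reason such transmogrified data are hard to compare across assessments.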

  16. JAERI Material Performance Database (JMPD); outline of the system

    International Nuclear Information System (INIS)

    Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime.

    1991-01-01

    The JAERI Material Performance Database (JMPD) has been developed since 1986 in JAERI with a view to utilizing the various kinds of characteristic data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. The JMPD is currently serving the following data: (1) Data yielded from the research activities of JAERI, including fatigue crack growth data of LWR pressure vessel materials as well as creep and fatigue data of the alloy developed for the High Temperature Gas-cooled Reactor (HTGR), Hastelloy XR. (2) Data on environmentally assisted cracking of LWR materials arranged by the Electric Power Research Institute (EPRI), including fatigue crack growth data (3000 tests), stress corrosion data (500 tests) and Slow Strain Rate Technique (SSRT) data (1000 tests). In order to improve the user-friendliness of the retrieval system, menu-selection-type procedures have been developed in which knowledge of the system and data structures is not required of end-users. In addition, retrieval via database commands, Structured Query Language (SQL), is supported by the relational database management system. In the JMPD the retrieved data can be processed readily through supporting systems for graphical and statistical analyses. The present report outlines the JMPD and describes procedures for data retrieval and analyses utilizing the JMPD. (author)

  17. Clinical Databases for Chest Physicians.

    Science.gov (United States)

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health conditions or exposures. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture. We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  18. Interlocking Boards and Firm Performance: Evidence from a New Panel Database

    NARCIS (Netherlands)

    M.C. Non (Marielle); Ph.H.B.F. Franses (Philip Hans)

    2007-01-01

    An interlock between two firms occurs if the firms share one or more directors in their boards of directors. We explore the effect of interlocks on firm performance for 101 large Dutch firms using a large and new panel database. We use five different performance measures, and for each

  19. Image quality validation of Sentinel 2 Level-1 products: performance status at the beginning of the constellation routine phase

    Science.gov (United States)

    Francesconi, Benjamin; Neveu-VanMalle, Marion; Espesset, Aude; Alhammoud, Bahjat; Bouzinac, Catherine; Clerc, Sébastien; Gascon, Ferran

    2017-09-01

    Sentinel-2 is an Earth Observation mission developed by the European Space Agency (ESA) in the frame of the Copernicus program of the European Commission. The mission is based on a constellation of two satellites: Sentinel-2A, launched in June 2015, and Sentinel-2B, launched in March 2017. It offers an unprecedented combination of systematic global coverage of land and coastal areas, a high revisit of five days at the equator and two days at mid-latitudes under the same viewing conditions, high spatial resolution, and a wide field of view for multispectral observations from 13 bands in the visible, near infrared and short wave infrared range of the electromagnetic spectrum. Mission performance is routinely and closely monitored by the S2 Mission Performance Centre (MPC), which includes a consortium of Expert Support Laboratories (ESL). This publication focuses on the Sentinel-2 Level-1 product quality validation activities performed by the MPC. It presents an up-to-date status of Level-1 mission performance at the beginning of the constellation routine phase. Level-1 performance validations routinely performed cover Level-1 Radiometric Validation (Equalisation Validation, Absolute Radiometry Vicarious Validation, Absolute Radiometry Cross-Mission Validation, Multi-temporal Relative Radiometry Vicarious Validation and SNR Validation) and Level-1 Geometric Validation (Geolocation Uncertainty Validation, Multi-spectral Registration Uncertainty Validation and Multi-temporal Registration Uncertainty Validation). Overall, the Sentinel-2 mission is proving very successful in terms of product quality, thereby fulfilling the promises of the Copernicus program.

  20. Performance of popular open source databases for HEP related computing problems

    International Nuclear Information System (INIS)

    Kovalskyi, D; Sfiligoi, I; Wuerthwein, F; Yagil, A

    2014-01-01

    Databases are used in many software components of HEP computing, from monitoring and job scheduling to data storage and processing. It is not always clear at the beginning of a project if a problem can be handled by a single server, or if one needs to plan for a multi-server solution. Before a scalable solution is adopted, it helps to know how well it performs in a single server case to avoid situations when a multi-server solution is adopted mostly due to sub-optimal performance per node. This paper presents comparison benchmarks of popular open source database management systems. As a test application we use a user job monitoring system based on the Glidein workflow management system used in the CMS Collaboration.

  1. An Integrated Database of Unit Training Performance: Description an Lessons Learned

    National Research Council Canada - National Science Library

    Leibrecht, Bruce

    1997-01-01

    The Army Research Institute (ARI) has developed a prototype relational database for processing and archiving unit performance data from home station, training area, simulation based, and Combat Training Center training exercises...

  2. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to professionals previously registered. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with MySQL database software and PHP programming language is being used. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  3. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to professionals previously registered. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with MySQL database software and PHP programming language is being used. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  4. A performance study on the synchronisation of heterogeneous Grid databases using CONStanza

    CERN Document Server

    Pucciani, G; Domenici, Andrea; Stockinger, Heinz

    2010-01-01

    In Grid environments, several heterogeneous database management systems are used in various administrative domains. However, data exchange and synchronisation need to be available across different sites and different database systems. In this article we present our data consistency service CONStanza and give details on how we achieve relaxed update synchronisation between different database implementations. The integration in existing Grid environments is one of the major goals of the system. Performance tests have been executed following a factorial approach. Detailed experimental results and a statistical analysis are presented to evaluate the system components and drive future developments. (C) 2010 Elsevier B.V. All rights reserved.

  5. NoSQL database scaling

    OpenAIRE

    Žardin, Norbert

    2017-01-01

    NoSQL database scaling is a decision where system resources or financial expenses are traded for database performance or other benefits. By scaling a database, database performance and resource usage might increase or decrease; such changes might have a negative impact on an application that uses the database. In this work it is analyzed how database scaling affects database resource usage and performance. As a result, calculations are acquired, using which database scaling types and differe...

  6. How Planful Is Routine Behavior? A Selective-Attention Model of Performance in the Tower of Hanoi

    Science.gov (United States)

    Patsenko, Elena G.; Altmann, Erik M.

    2010-01-01

    Routine human behavior has often been attributed to plans--mental representations of sequences of goals and actions--but can also be attributed to more opportunistic interactions of mind and a structured environment. This study asks whether performance on a task traditionally analyzed in terms of plans can be better understood from a "situated" (or…

  7. A high-performance spatial database based approach for pathology imaging algorithm evaluation.

    Science.gov (United States)

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A D; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J; Saltz, Joel H

    2013-01-01

    Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and
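    A minimal sketch of the kind of fixing a spatial normalization tool performs — detecting and repairing an invalid, self-intersecting boundary before it is loaded into the spatial database — using the Shapely library; the coordinates are invented and this is not the PAIS toolkit itself.

        from shapely.geometry import Polygon

        # Hypothetical algorithm output: a self-intersecting ("bowtie") boundary.
        raw = Polygon([(0, 0), (2, 2), (2, 0), (0, 2)])
        print(raw.is_valid)        # False: the ring crosses itself

        # buffer(0) is a common normalization trick that rebuilds a valid geometry.
        fixed = raw.buffer(0)
        print(fixed.is_valid, fixed.area)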

  8. Oracle Database 11gR2 Performance Tuning Cookbook

    CERN Document Server

    Fiorillo, Ciro

    2012-01-01

    In this book you will find both examples and theoretical concepts covered. Every recipe is based on a script/procedure explained step-by-step, with screenshots, while theoretical concepts are explained in the context of the recipe, to explain why a solution performs better than another. This book is aimed at software developers, software and data architects, and DBAs who are using or are planning to use the Oracle Database, who have some experience and want to solve performance problems faster and in a rigorous way. If you are an architect who wants to design better applications, a DBA who is

  9. The performance of disk arrays in shared-memory database machines

    Science.gov (United States)

    Katz, Randy H.; Hong, Wei

    1993-01-01

    In this paper, we examine how disk arrays and shared memory multiprocessors lead to an effective method for constructing database machines for general-purpose complex query processing. We show that disk arrays can lead to cost-effective storage systems if they are configured from suitably small form-factor disk drives. We introduce the storage system metric data temperature as a way to evaluate how well a disk configuration can sustain its workload, and we show that disk arrays can sustain the same data temperature as a more expensive mirrored-disk configuration. We use the metric to evaluate the performance of disk arrays in XPRS, an operational shared-memory multiprocessor database system being developed at the University of California, Berkeley.
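    The data temperature metric is conventionally computed as the sustainable access rate per unit of stored data; a small worked example under assumed numbers (not figures from the paper):

        # Data temperature = sustained accesses per second / gigabytes stored.
        # All numbers below are assumptions for illustration.
        ios_per_disk = 60        # sustained random I/Os per second per drive
        disks = 16               # drives in the array
        capacity_gb = disks * 2.0  # assuming 2 GB small-form-factor drives

        temperature = (ios_per_disk * disks) / capacity_gb
        print(f"{temperature:.1f} accesses/sec per GB")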

  10. The IPE Database: providing information on plant design, core damage frequency and containment performance

    International Nuclear Information System (INIS)

    Lehner, J.R.; Lin, C.C.; Pratt, W.T.; Su, T.; Danziger, L.

    1996-01-01

    A database, called the IPE Database has been developed that stores data obtained from the Individual Plant Examinations (IPEs) which licensees of nuclear power plants have conducted in response to the Nuclear Regulatory Commission's (NRC) Generic Letter GL88-20. The IPE Database is a collection of linked files which store information about plant design, core damage frequency (CDF), and containment performance in a uniform, structured way. The information contained in the various files is based on data contained in the IPE submittals. The information extracted from the submittals and entered into the IPE Database can be manipulated so that queries regarding individual or groups of plants can be answered using the IPE Database

  11. Routine environmental monitoring schedule, calendar year 1997

    Energy Technology Data Exchange (ETDEWEB)

    Markes, B.M., Westinghouse Hanford

    1996-12-10

    This document provides the Environmental Restoration Contractor (ERC) and the Project Hanford Management Contractor (PHMC) a schedule, in accordance with WHC-CM-7-5, Environmental Compliance, and BHI-EE-02, Environmental Requirements, of monitoring and sampling routines for the Near-Field Monitoring (NFM) program during calendar year (CY) 1997. Every attempt will be made to consistently follow this schedule; any deviation from this schedule will be documented by an internal memorandum (DSI) explaining the reason for the deviation. The DSI will be issued by the scheduled performing organization and directed to Near-Field Monitoring. The survey frequencies for particular sites are determined by the technical judgment of Near-Field Monitoring and may depend on the site history, radiological status, use, and general conditions. Additional surveys may be requested at irregular frequencies if conditions warrant. All radioactive waste sites are scheduled to be surveyed at least annually. Any newly discovered waste sites not documented by this schedule will be included in the revised schedule for CY 1998. The outside perimeter road surveys of the 200 East and West Areas and the rail survey from the 300 Area to Columbia Center will be performed in the year 2000 per agreement with the Department of Energy, Richland Field Office. This schedule does not discuss staffing needs, nor does it list the monitoring equipment to be used in completing specific routines. Personnel performing routines to meet this schedule shall communicate any need for assistance in completing these routines to Radiological Control management and Near-Field Monitoring. After each routine survey is completed, a copy of the survey record, maps, and data sheets will be forwarded to Near-Field Monitoring. These routine surveys will not be considered complete until this documentation is received. At the end of each month, the ERC and PHMC radiological control organizations shall forward a copy of the Routine

  12. Frontier: High Performance Database Access Using Standard Web Components in a Scalable Multi-Tier Architecture

    International Nuclear Information System (INIS)

    Kosyakov, S.; Kowalkowski, J.; Litvintsev, D.; Lueking, L.; Paterno, M.; White, S.P.; Autio, Lauri; Blumenfeld, B.; Maksimovic, P.; Mathis, M.

    2004-01-01

    A high performance system has been assembled using standard web components to deliver database information to a large number of broadly distributed clients. The CDF Experiment at Fermilab is establishing processing centers around the world, imposing a high demand on their database repository. For delivering read-only data, such as calibrations, trigger information, and run conditions data, we have abstracted the interface that clients use to retrieve data objects. A middle tier is deployed that translates client requests into database-specific queries and returns the data to the client as XML datagrams. The database connection management, request translation, and data encoding are accomplished in servlets running under Tomcat. Squid proxy caching layers are deployed near the Tomcat servers, as well as close to the clients, to significantly reduce the load on the database and provide a scalable deployment model. Details of the system's construction and use are presented, including its architecture, design, interfaces, administration, performance measurements, and deployment plan
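    A hedged sketch of the client side of such an architecture: an HTTP request routed through a nearby caching proxy, returning an XML datagram that is parsed locally. The endpoint URL, proxy address, and datagram layout below are hypothetical, not the actual Frontier protocol.

        import urllib.request
        import xml.etree.ElementTree as ET

        # Hypothetical servlet endpoint and nearby Squid caching proxy.
        url = "http://frontier.example.org/servlet/Frontier?type=calibration&run=12345"
        proxy = urllib.request.ProxyHandler({"http": "http://squid.example.org:3128"})
        opener = urllib.request.build_opener(proxy)

        with opener.open(url) as response:
            document = ET.fromstring(response.read())

        # Assumed datagram layout: <payload><row .../>...</payload>
        for row in document.iter("row"):
            print(row.attrib)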

  13. The significance of routines in nursing practice.

    Science.gov (United States)

    Rytterström, Patrik; Unosson, Mitra; Arman, Maria

    2011-12-01

    The aim of this study was to illuminate the significance of routines in nursing practice. Clinical nursing is performed under the guidance of routines to varying degrees. In the nursing literature, routine is described as having both negative and positive aspects, but use of the term is inconsistent, and empirical evidence is sparse. In the research on organisational routines, a distinction is made between routine as a rule and routine as action. A qualitative design using a phenomenological-hermeneutic approach. Data collection from three focus groups focused on nurses' experience of routines. Seventeen individual interviews from a previous study focusing on caring culture were also analysed in a secondary qualitative analysis. All participants were employed as 'qualified nursing pool' nurses. Routines are experienced as pragmatic, obstructive and meaningful. The aim of the pragmatic routine was to ensure that daily working life works; this routine is practised more on the basis of rational arguments and obvious intentions. The obstructive routine had negative consequences for nursing practice and was described as nursing losing its humanity and violating the patient's integrity. The meaningful routine involved becoming one with the routine and for the nurses, it felt right and meaningful to adapt to it. Routines become meaningful when the individual action is in harmony with the cultural pattern on which the nursing work is based. Instead of letting contemporary practice passively become routine, routines can be assessed and developed using research and theoretical underpinnings as a starting point for nursing practice. Leaders have a special responsibility to develop and support meaningful routines. One approach could be to let wards examine their routines from a patient perspective on the basis of the themes of pragmatic, meaningful and obstructive routine. © 2010 Blackwell Publishing Ltd.

  14. Downsizing a database platform for increased performance and decreased costs

    Energy Technology Data Exchange (ETDEWEB)

    Miller, M.M.; Tolendino, L.F.

    1993-06-01

    Technological advances in the world of microcomputers have brought forth affordable systems and powerful software that can compete with the more traditional world of minicomputers. This paper describes an effort at Sandia National Laboratories to decrease operational and maintenance costs and increase performance by moving a database system from a minicomputer to a microcomputer.

  15. An intelligent stochastic optimization routine for nuclear fuel cycle design

    International Nuclear Information System (INIS)

    Parks, G.T.

    1990-01-01

    A simulated annealing (Metropolis algorithm) optimization routine named AMETROP, which has been developed for use on realistic nuclear fuel cycle problems, is introduced. Each stage of the algorithm is described and the means by which it overcomes or avoids the difficulties posed to conventional optimization routines by such problems are explained. Special attention is given to innovations that enhance AMETROP's performance both through artificial intelligence features, in which the routine uses the accumulation of data to influence its future actions, and through a family of simple performance aids, which allow the designer to use his heuristic knowledge to guide the routine's essentially random search. Using examples from a typical fuel cycle optimization problem, the performance of the stochastic Metropolis algorithm is compared to that of the only suitable deterministic routine in a standard software library, showing AMETROP to have many advantages
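    The Metropolis acceptance rule at the heart of such a routine fits in a few lines of Python; the objective function, move generator, and cooling schedule below are illustrative placeholders, not AMETROP's.

        import math
        import random

        def anneal(objective, move, x0, t0=1.0, cooling=0.95, steps=2000):
            """Generic simulated annealing with the Metropolis acceptance rule."""
            x, fx, temp = x0, objective(x0), t0
            best, fbest = x, fx
            for _ in range(steps):
                candidate = move(x)
                fc = objective(candidate)
                # Accept downhill moves always; uphill moves with Boltzmann probability.
                if fc <= fx or random.random() < math.exp(-(fc - fx) / temp):
                    x, fx = candidate, fc
                    if fx < fbest:
                        best, fbest = x, fx
                temp *= cooling
            return best, fbest

        # Toy objective standing in for a fuel-cycle cost function.
        best, value = anneal(lambda v: (v - 3.0) ** 2,
                             lambda v: v + random.uniform(-0.5, 0.5), x0=0.0)
        print(round(best, 2), round(value, 4))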

  16. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    Directory of Open Access Journals (Sweden)

    Fusheng Wang

    2013-01-01

    Full Text Available Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The

  17. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, used to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is not easy to realize, nor is its data collection framework easy to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enables estimation of the total number of conductions based on instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimation of the number of task conductions based on the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the framework's applicability, a case study of simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  18. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for quantitative analysis of human performance, used to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is not easy to realize, nor is its data collection framework easy to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enables estimation of the total number of conductions based on instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors made by operators in the power plant should be systematically collected and examined. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimation of the number of task conductions based on the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the framework's applicability, a case study of simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  19. 42 CFR 493.1267 - Standard: Routine chemistry.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 (2010-10-01), Section 493.1267 - CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... Systems § 493.1267 Standard: Routine chemistry. For blood gas analyses, the laboratory must perform the...

  20. Perceptions of the uses of routine general practice data beyond individual care in England: a qualitative study.

    Science.gov (United States)

    Wyatt, David; Cook, Jenny; McKevitt, Christopher

    2018-01-08

    To investigate how different lay and professional groups perceive and understand the use of routinely collected general practice patient data for research, public health, service evaluation and commissioning. We conducted a multimethod, qualitative study. This entailed participant observation of the design and delivery of a series of deliberative engagement events about a local patient database made of routine primary care data. We also completed semistructured interviews with key professionals involved in the database. Qualitative data were thematically analysed. The research took place in an inner city borough in England. Of the community groups who participated in the six engagement events (111 individual citizens), five were health focused. It was difficult to recruit other types of organisations. Participants supported the uses of the database, but it was unclear how well they understood its scope and purpose. They had concerns about transparency, security and the potential misuse of data. Overall, they were more focused on the need for immediate investment in primary care capacity than data infrastructures to improve future health. The 10 interviewed professionals identified the purpose of the database in different ways, according to their interests. They emphasised the promise of the database as a resource in health research in its own right and in linking it to other datasets. Findings demonstrate positivity to the uses of this local database, but a disconnect between the long-term purposes of the database and participants' short-term priorities for healthcare quality. Varying understandings of the database and the potential for it to be used in multiple different ways in the future cement a need for systematic and routine public engagement to develop and maintain public awareness. Problems recruiting community groups signal a need to consider how we engage wider audiences more effectively. © Article author(s) (or their employer(s) unless otherwise stated

  1. Pursuit of a scalable high performance multi-petabyte database

    CERN Document Server

    Hanushevsky, A

    1999-01-01

    When the BaBar experiment at the Stanford Linear Accelerator Center starts in April 1999, it will generate approximately 200 TB/year of data at a rate of 10 MB/sec for 10 years. A mere six years later, CERN, the European Laboratory for Particle Physics, will start an experiment whose data storage requirements are two orders of magnitude larger. In both experiments, all of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). The quantity and rate at which the data is produced requires the use of a high performance hierarchical mass storage system in place of a standard Unix file system. Furthermore, the distributed nature of the experiment, involving scientists from 80 Institutions in 10 countries, also requires an extended security infrastructure not commonly found in standard Unix file systems. The combination of challenges that must be overcome in order to effectively deal with a multi-petabyte object oriented database is substantial. Our particular approach...

  2. Sorption databases for increasing confidence in performance assessment - 16053

    International Nuclear Information System (INIS)

    Richter, Anke; Brendler, Vinzenz; Nebelung, Cordula; Payne, Timothy E.; Brasser, Thomas

    2009-01-01

    requires that all mineral constituents of the solid phase are characterized. Another issue is the large number of required parameters combined with time-consuming iterations. Addressing both approaches, we present two sorption databases, developed mainly by or under participation of the Forschungszentrum Dresden-Rossendorf (FZD). Both databases are implemented as relational databases, assist identification of critical data gaps and the evaluation of existing parameter sets, provide web-based data search and analyses, and permit the comparison of SCM predictions with Kd values. RES3T (Rossendorf Expert System for Surface and Sorption Thermodynamics) is a digitized thermodynamic sorption database (see www.fzd.de/db/RES3T.login) and free of charge. It is mineral-specific and can therefore also be used for additive models of more complex solid phases. ISDA (Integrated Sorption Database System) connects SCM with the Kd concept but focuses on conventional Kd. The integrated datasets are accessible through a unified user interface. An application case, Kd values in Performance Assessment, is given. (authors)

  3. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects–helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design–without changing semantics. You’ll learn how to evolve database schemas in step with source code–and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...

  4. NVST Data Archiving System Based On FastBit NoSQL Database

    Science.gov (United States)

    Liu, Ying-bo; Wang, Feng; Ji, Kai-fan; Deng, Hui; Dai, Wei; Liang, Bo

    2014-06-01

    The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high resolution imaging and spectral observations, including the measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012 and produces a maximum of 120 thousand observational records (files) in a day. Given the large number of files, the effective archiving and retrieval of files becomes a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the FastBit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (i.e., MySQL; My Structured Query Language), the FastBit database manifests distinctive advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the FastBit database is about 15 times faster and fully meets the requirements of the NVST. Our study brings a new idea for massive astronomical data archiving and would contribute to the design of data management systems for other astronomical telescopes.

  5. Collecting Taxes Database

    Data.gov (United States)

    US Agency for International Development — The Collecting Taxes Database contains performance and structural indicators about national tax systems. The database contains quantitative revenue performance...

  6. Intra-disciplinary differences in database coverage and the consequences for bibliometric research

    DEFF Research Database (Denmark)

    Faber Frandsen, Tove; Nicolaisen, Jeppe

    2008-01-01

    Bibliographic databases (including databases based on open access) are routinely used for bibliometric research. The value of a specific database depends to a large extent on the coverage of the discipline(s) under study. A number of studies have determined the coverage of databases in specific d...... and psychology). The point extends to include both the uneven coverage of specialties and research traditions. The implications for bibliometric research are discussed, and precautions which need to be taken are outlined. ...

  7. Comparison of performance of tile drainage routines in SWAT 2009 and 2012 in an extensively tile-drained watershed in the Midwest

    Science.gov (United States)

    Guo, Tian; Gitau, Margaret; Merwade, Venkatesh; Arnold, Jeffrey; Srinivasan, Raghavan; Hirschi, Michael; Engel, Bernard

    2018-01-01

    Subsurface tile drainage systems are widely used in agricultural watersheds in the Midwestern US and enable the Midwest area to become highly productive agricultural lands, but can also create environmental problems, for example nitrate-N contamination associated with drainage waters. The Soil and Water Assessment Tool (SWAT) has been used to model watersheds with tile drainage. SWAT2012 revisions 615 and 645 provide new tile drainage routines. However, few studies have used these revisions to study tile drainage impacts at both field and watershed scales. Moreover, SWAT2012 revision 645 improved the soil moisture based curve number calculation method, which has not been fully tested. This study used long-term (1991-2003) field site and river station data from the Little Vermilion River (LVR) watershed to evaluate performance of tile drainage routines in SWAT2009 revision 528 (the old routine) and SWAT2012 revisions 615 and 645 (the new routine). Both the old and new routines provided reasonable but unsatisfactory (NSE < 0.5) uncalibrated flow and nitrate loss results for a mildly sloped watershed with low runoff. The calibrated monthly tile flow, surface flow, nitrate-N in tile and surface flow, sediment and annual corn and soybean yield results from SWAT with the old and new tile drainage routines were compared with observed values. Generally, the new routine provided acceptable simulated tile flow (NSE = 0.48-0.65) and nitrate in tile flow (NSE = 0.48-0.68) for field sites with random pattern tile and constant tile spacing, while the old routine simulated tile flow and nitrate in tile flow results for the field site with constant tile spacing were unacceptable (NSE = 0.00-0.32 and -0.29-0.06, respectively). The new modified curve number calculation method in revision 645 (NSE = 0.50-0.81) better simulated surface runoff than revision 615 (NSE = -0.11-0.49). The calibration provided reasonable parameter sets for the old and new routines in the LVR watershed, and the validation results showed that the new routine has the potential to accurately
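
    The NSE values quoted here are Nash-Sutcliffe efficiencies. A minimal sketch of the statistic, with invented flow values purely for illustration; NSE = 1 is a perfect fit, and values at or below 0 mean the model is no better than the observed mean.

      def nash_sutcliffe(observed, simulated):
          """Nash-Sutcliffe efficiency of simulated vs. observed series."""
          mean_obs = sum(observed) / len(observed)
          ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
          ss_tot = sum((o - mean_obs) ** 2 for o in observed)
          return 1.0 - ss_res / ss_tot

      obs = [3.1, 4.2, 2.8, 5.0, 3.9]   # e.g. monthly tile flow, observed (made up)
      sim = [2.9, 4.6, 2.5, 4.7, 4.1]   # simulated by the model
      print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")  # > 0.5 often taken as satisfactory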

  8. Predictors of routine episiotomy in primigravida women in Oman.

    Science.gov (United States)

    Al-Ghammari, Khadija; Al-Riyami, Zainab; Al-Moqbali, Moza; Al-Marjabi, Fatma; Al-Mahrouqi, Basma; Al-Khatri, Amal; Al-Khasawneh, Esra M

    2016-02-01

    Episiotomy is still the most common surgical procedure performed on women, despite the evidence against its routine use. This cross-sectional study was conducted to determine the practice and predictors of routine episiotomy on primigravidae in Oman. Demographic data, reasons for and rate of performing routine episiotomies, and perceptions of 269 obstetricians, midwives and nurses from 11 hospitals in Oman regarding the procedure were recorded and analyzed. The rate of episiotomies was 66%. In terms of performing routine episiotomies (p<0.05): non-Omanis were 4.49 times more likely than Omanis; bachelor's degree-holders were 2.26 times more likely than diploma-holders; and regional hospitals were 2.36 times more likely than tertiary hospitals. The majority perceived episiotomies "reduce spontaneous perineal tearing risk", "reduce shoulder dystocia complications", and allow for "easier suturing". The rate of episiotomies was higher than in other similar contexts. An urgent intervention is necessary to curb this excessive practice, and create a culture of evidence-based practice to deal with misleading perceptions. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Providing Availability, Performance, and Scalability By Using Cloud Database

    OpenAIRE

    Prof. Dr. Alaa Hussein Al-Hamami; RafalAdeeb Al-Khashab

    2014-01-01

    With the development of the internet, new technologies and concepts have attracted the attention of internet users, especially in the development of information technology; one such concept is the cloud. Cloud computing includes different components, of which the cloud database has become an important one. A cloud database is a distributed database that delivers computing as a service, or in the form of a virtual machine image instead of a product, via the internet; its advantage is that the database can...

  10. Performance test of dosimetric services in the EU member states and Switzerland for the routine assessment of individual doses (photon, beta and neutron)

    DEFF Research Database (Denmark)

    Bordy, J.M.; Stadtmann, H.; Ambrosi, P.

    2000-01-01

    of the dosimetry of routine services. It was assumed that each service would have already done a type test before performing routine dosimetry: the radiation fields were chosen to simulate, as far as possible, workplace radiation fields by combining energies and incident angles. The results of photon...... for External Radiation. The other two papers are included in this issue of Radiation Protection Dosimetry....

  11. Data-based modelling of the Earth's dynamic magnetosphere: a review

    Directory of Open Access Journals (Sweden)

    N. A. Tsyganenko

    2013-10-01

    Full Text Available This paper reviews the main advances in the area of data-based modelling of the Earth's distant magnetic field achieved during the last two decades. The essence and the principal goal of the approach is to extract maximum information from available data, using physically realistic and flexible mathematical structures, parameterized by the most relevant and routinely accessible observables. Accordingly, the paper concentrates on three aspects of the modelling: (i) mathematical methods to develop a computational "skeleton" of a model, (ii) spacecraft databases, and (iii) parameterization of the magnetospheric models by the solar wind drivers and/or ground-based indices. The review is followed by a discussion of the main issues concerning further progress in the area, in particular, methods to assess the models' performance and the accuracy of the field line mapping. The material presented in the paper is organized along the lines of the author's Julius Bartels Medal Lecture during the General Assembly 2013 of the European Geosciences Union.

  12. Endocarditis : Effects of routine echocardiography during Gram-positive bacteraemia

    NARCIS (Netherlands)

    Vos, F J; Bleeker-Rovers, C P; Sturm, P D; Krabbe, P F M; van Dijk, A P J; Oyen, W J G; Kullberg, B J

    2011-01-01

    BACKGROUND: Despite firm recommendations to perform echocardiography in high-risk patients with Gram-positive bacteraemia, routine echocardiography is not embedded in daily practice in many settings. The aim of this study was to evaluate whether a regime including routine echocardiography results in

  13. Changing of the Guard: How Different School Leaders Change Organizational Routines

    Science.gov (United States)

    Enomoto, Ernestine K.; Conley, Sharon

    2008-01-01

    While providing stability and uniformity, organizational routines can foster continuous change. Using Feldman's (2000) performative model of routinized action theory, coupled with leadership succession research, we examined how three successive administrations in a California high school revised a student attendance (tardy-monitoring) routine over…

  14. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  15. Informational Aspects of Telepathology in Routine Surgical Pathology

    Directory of Open Access Journals (Sweden)

    Peter Gombas

    2000-01-01

    Full Text Available Application of computer and telecommunication technology poses serious challenges in routine diagnostic pathology. Complete data integration, fast access to patients' data, usage of a diagnosis thesaurus labeled with standardized codes and free-text supplements, complex inquiry of the data contents, data exchange via teleconsultation and multilevel data protection are required functions of an integrated information system. The increasing requirement for teleconsultation, transferring a large amount of multimedia data among different pathology information systems, raises new questions in telepathology. Creation of complex telematic systems in pathology requires efficient methods of software engineering and implementation. Information technology of object-oriented modeling, usage of client-server architecture and relational database management systems enables more compatible systems in the field of telepathology. The aim of this paper is to present a practical example of how to unify a text-based database, an image archive and teleconsultation in the frame of an integrated telematic system and to discuss the main conceptual questions of the information technology of telepathology.

  16. Routine Pediatric Enterovirus 71 Vaccination in China: a Cost-Effectiveness Analysis.

    Science.gov (United States)

    Wu, Joseph T; Jit, Mark; Zheng, Yaming; Leung, Kathy; Xing, Weijia; Yang, Juan; Liao, Qiaohong; Cowling, Benjamin J; Yang, Bingyi; Lau, Eric H Y; Takahashi, Saki; Farrar, Jeremy J; Grenfell, Bryan T; Leung, Gabriel M; Yu, Hongjie

    2016-03-01

    China accounted for 87% (9.8 million/11.3 million) of all hand, foot, and mouth disease (HFMD) cases reported to WHO during 2010-2014. Enterovirus 71 (EV71) is responsible for most of the severe HFMD cases. Three EV71 vaccines recently demonstrated good efficacy in children aged 6-71 mo. Here we assessed the cost-effectiveness of routine pediatric EV71 vaccination in China. We characterized the economic and health burden of EV71-associated HFMD (EV71-HFMD) in China using (i) the national surveillance database, (ii) virological surveillance records from all provinces, and (iii) a caregiver survey on the household costs and health utility loss for 1,787 laboratory-confirmed pediatric cases. Using a static model parameterized with these data, we estimated the effective vaccine cost (EVC, defined as cost/efficacy or simply the cost of a 100% efficacious vaccine) below which routine pediatric vaccination would be considered cost-effective. We performed the base-case analysis from the societal perspective with a willingness-to-pay threshold of one times the gross domestic product per capita (GDPpc) and an annual discount rate of 3%. We performed uncertainty analysis by (i) accounting for the uncertainty in the risk of EV71-HFMD due to missing laboratory data in the national database, (ii) excluding productivity loss of parents and caregivers, (iii) increasing the willingness-to-pay threshold to three times GDPpc, (iv) increasing the discount rate to 6%, and (v) accounting for the proportion of EV71-HFMD cases not registered by national surveillance. In each of these scenarios, we performed probabilistic sensitivity analysis to account for parametric uncertainty in our estimates of the risk of EV71-HFMD and the expected costs and health utility loss due to EV71-HFMD. Routine pediatric EV71 vaccination would be cost-saving if the all-inclusive EVC is below US$10.6 (95% CI US$9.7-US$11.5) and would remain cost-effective if EVC is below US$17.9 (95% CI US$16.9-US$18.8) in
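
    The EVC logic described above reduces to a simple division and threshold comparison. A minimal sketch with invented price and efficacy figures; only the US$10.6 and US$17.9 thresholds come from the study.

      def effective_vaccine_cost(price_per_course, efficacy):
          """EVC as defined in the paper: cost of the course divided by its
          efficacy, i.e. the cost of a hypothetical 100%-efficacious vaccine."""
          return price_per_course / efficacy

      # Illustrative numbers only (not from the paper): a $12 course at 90% efficacy.
      evc = effective_vaccine_cost(12.0, 0.90)
      print(f"EVC = US${evc:.1f}")
      print("cost-saving" if evc < 10.6
            else "cost-effective" if evc < 17.9
            else "not cost-effective")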

  17. Routine Pediatric Enterovirus 71 Vaccination in China: a Cost-Effectiveness Analysis

    Science.gov (United States)

    Leung, Kathy; Xing, Weijia; Yang, Juan; Liao, Qiaohong; Cowling, Benjamin J.; Yang, Bingyi; Lau, Eric H. Y.; Takahashi, Saki; Farrar, Jeremy J.; Grenfell, Bryan T.; Leung, Gabriel M.; Yu, Hongjie

    2016-01-01

    Background China accounted for 87% (9.8 million/11.3 million) of all hand, foot, and mouth disease (HFMD) cases reported to WHO during 2010–2014. Enterovirus 71 (EV71) is responsible for most of the severe HFMD cases. Three EV71 vaccines recently demonstrated good efficacy in children aged 6–71 mo. Here we assessed the cost-effectiveness of routine pediatric EV71 vaccination in China. Methods and Findings We characterized the economic and health burden of EV71-associated HFMD (EV71-HFMD) in China using (i) the national surveillance database, (ii) virological surveillance records from all provinces, and (iii) a caregiver survey on the household costs and health utility loss for 1,787 laboratory-confirmed pediatric cases. Using a static model parameterized with these data, we estimated the effective vaccine cost (EVC, defined as cost/efficacy or simply the cost of a 100% efficacious vaccine) below which routine pediatric vaccination would be considered cost-effective. We performed the base-case analysis from the societal perspective with a willingness-to-pay threshold of one times the gross domestic product per capita (GDPpc) and an annual discount rate of 3%. We performed uncertainty analysis by (i) accounting for the uncertainty in the risk of EV71-HFMD due to missing laboratory data in the national database, (ii) excluding productivity loss of parents and caregivers, (iii) increasing the willingness-to-pay threshold to three times GDPpc, (iv) increasing the discount rate to 6%, and (v) accounting for the proportion of EV71-HFMD cases not registered by national surveillance. In each of these scenarios, we performed probabilistic sensitivity analysis to account for parametric uncertainty in our estimates of the risk of EV71-HFMD and the expected costs and health utility loss due to EV71-HFMD. Routine pediatric EV71 vaccination would be cost-saving if the all-inclusive EVC is below US$10.6 (95% CI US$9.7–US$11.5) and would remain cost-effective if EVC is below

  18. Routine Pediatric Enterovirus 71 Vaccination in China: a Cost-Effectiveness Analysis.

    Directory of Open Access Journals (Sweden)

    Joseph T Wu

    2016-03-01

    Full Text Available China accounted for 87% (9.8 million/11.3 million) of all hand, foot, and mouth disease (HFMD) cases reported to WHO during 2010-2014. Enterovirus 71 (EV71) is responsible for most of the severe HFMD cases. Three EV71 vaccines recently demonstrated good efficacy in children aged 6-71 mo. Here we assessed the cost-effectiveness of routine pediatric EV71 vaccination in China. We characterized the economic and health burden of EV71-associated HFMD (EV71-HFMD) in China using (i) the national surveillance database, (ii) virological surveillance records from all provinces, and (iii) a caregiver survey on the household costs and health utility loss for 1,787 laboratory-confirmed pediatric cases. Using a static model parameterized with these data, we estimated the effective vaccine cost (EVC, defined as cost/efficacy or simply the cost of a 100% efficacious vaccine) below which routine pediatric vaccination would be considered cost-effective. We performed the base-case analysis from the societal perspective with a willingness-to-pay threshold of one times the gross domestic product per capita (GDPpc) and an annual discount rate of 3%. We performed uncertainty analysis by (i) accounting for the uncertainty in the risk of EV71-HFMD due to missing laboratory data in the national database, (ii) excluding productivity loss of parents and caregivers, (iii) increasing the willingness-to-pay threshold to three times GDPpc, (iv) increasing the discount rate to 6%, and (v) accounting for the proportion of EV71-HFMD cases not registered by national surveillance. In each of these scenarios, we performed probabilistic sensitivity analysis to account for parametric uncertainty in our estimates of the risk of EV71-HFMD and the expected costs and health utility loss due to EV71-HFMD. Routine pediatric EV71 vaccination would be cost-saving if the all-inclusive EVC is below US$10.6 (95% CI US$9.7-US$11.5) and would remain cost-effective if EVC is below US$17.9 (95% CI US$16.9-US$18.8) in

  19. The Impact of Data-Based Science Instruction on Standardized Test Performance

    Science.gov (United States)

    Herrington, Tia W.

    Increased teacher accountability efforts have resulted in the use of data to improve student achievement. This study addressed teachers' inconsistent use of data-driven instruction in middle school science. Evidence of the impact of data-based instruction on student achievement and school and district practices has been well documented by researchers. In science, less information has been available on teachers' use of data for classroom instruction. Drawing on data-driven decision making theory, the purpose of this study was to examine whether data-based instruction impacted performance on the science Criterion Referenced Competency Test (CRCT) and to explore the factors that impeded its use by a purposeful sample of 12 science teachers at a data-driven school. The research questions addressed in this study included understanding: (a) the association between student performance on the science portion of the CRCT and data-driven instruction professional development, (b) middle school science teachers' perception of the usefulness of data, and (c) the factors that hindered the use of data for science instruction. This study employed a mixed methods sequential explanatory design. Data collected included 8th grade CRCT data, survey responses, and individual teacher interviews. A chi-square test revealed no improvement in the CRCT scores following the implementation of professional development on data-driven instruction (χ²(1) = 0.183, p = .67). Results from surveys and interviews revealed that teachers used data to inform their instruction, indicating time as the major hindrance to their use. Implications for social change include the development of lesson plans that will empower science teachers to deliver data-based instruction and students to achieve identified academic goals.

  20. Development of a database system for the management of non-treated radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso, E-mail: ajp@cdtn.br, E-mail: cbf@cdtn.br, E-mail: vc@cdtn.br, E-mail: pos@cdtn.br, E-mail: seless@cdtn.br, E-mail: hauczmj@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. The information about the waste is registered, and the data about the waste must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, to achieve this goal, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of solid and liquid radioactive wastes stored in CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate retrieval of the waste history and localization and the contents of the waste packages. The database will also be useful for grouping the waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to change of operators will be avoided. (author)

  1. Development of a database system for the management of non-treated radioactive waste

    International Nuclear Information System (INIS)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso

    2017-01-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. The information about the waste is registered, and the data about the waste must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, to achieve this goal, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of solid and liquid radioactive wastes stored in CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate retrieval of the waste history and localization and the contents of the waste packages. The database will also be useful for grouping the waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to change of operators will be avoided. (author)
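
    As an illustration of the kind of relational layout such a system needs, here is a hypothetical two-table sketch in SQLite; the CDTN system is built on Access, and its actual table names and columns are not given in the abstract, so everything below is an assumption.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE waste_package (
          package_id   INTEGER PRIMARY KEY,
          waste_form   TEXT NOT NULL,       -- 'solid' or 'liquid'
          location     TEXT NOT NULL,       -- storage position in the DRNT
          received_on  DATE NOT NULL
      );
      CREATE TABLE radionuclide_content (
          package_id   INTEGER REFERENCES waste_package(package_id),
          nuclide      TEXT NOT NULL,       -- e.g. 'Cs-137'
          activity_bq  REAL NOT NULL,
          measured_on  DATE NOT NULL
      );
      """)
      conn.execute("INSERT INTO waste_package VALUES (1, 'solid', 'rack A-03', '2017-03-21')")
      # Grouping wastes with similar characteristics to pick a treatment route:
      rows = conn.execute(
          "SELECT waste_form, COUNT(*) FROM waste_package GROUP BY waste_form"
      ).fetchall()
      print(rows)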

  2. Comparison of Cloud vs. Tape Backup Performance and Costs with Oracle Database

    OpenAIRE

    Zrnec, Aljaž; Lavbič, Dejan

    2011-01-01

    Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Ora...

  3. Benchmarking routine psychological services: a discussion of challenges and methods.

    Science.gov (United States)

    Delgadillo, Jaime; McMillan, Dean; Leach, Chris; Lucock, Mike; Gilbody, Simon; Wood, Nick

    2014-01-01

    Policy developments in recent years have led to important changes in the level of access to evidence-based psychological treatments. Several methods have been used to investigate the effectiveness of these treatments in routine care, with different approaches to outcome definition and data analysis. To present a review of challenges and methods for the evaluation of evidence-based treatments delivered in routine mental healthcare. This is followed by a case example of a benchmarking method applied in primary care. High, average and poor performance benchmarks were calculated through a meta-analysis of published data from services working under the Improving Access to Psychological Therapies (IAPT) Programme in England. Pre-post treatment effect sizes (ES) and confidence intervals were estimated to illustrate a benchmarking method enabling services to evaluate routine clinical outcomes. High, average and poor performance ES for routine IAPT services were estimated to be 0.91, 0.73 and 0.46 for depression (using PHQ-9) and 1.02, 0.78 and 0.52 for anxiety (using GAD-7). Data from one specific IAPT service exemplify how to evaluate and contextualize routine clinical performance against these benchmarks. The main contribution of this report is to summarize key recommendations for the selection of an adequate set of psychometric measures, the operational definition of outcomes, and the statistical evaluation of clinical performance. A benchmarking method is also presented, which may enable a robust evaluation of clinical performance against national benchmarks. Some limitations concerned significant heterogeneity among data sources, and wide variations in ES and data completeness.
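
    A sketch of how a service might compute its own pre-post effect size and place it against these benchmarks follows; the estimator below (mean change divided by the pre-treatment standard deviation) is one common convention and may differ from the paper's exact method, and the scores are invented.

      import statistics

      def pre_post_effect_size(pre_scores, post_scores):
          """Uncontrolled pre-post ES: mean change over SD of pre-treatment scores."""
          mean_change = statistics.mean(p - q for p, q in zip(pre_scores, post_scores))
          return mean_change / statistics.stdev(pre_scores)

      pre = [18, 15, 21, 12, 17, 19]    # e.g. PHQ-9 at intake (illustrative)
      post = [9, 11, 14, 6, 10, 12]     # PHQ-9 after treatment
      es = pre_post_effect_size(pre, post)
      band = ("high" if es >= 0.91 else "average" if es >= 0.73
              else "below average" if es >= 0.46 else "poor")
      print(f"ES = {es:.2f} -> {band} against the IAPT depression benchmarks")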

  4. Intelligent stochastic optimization routine for in-core fuel cycle design

    International Nuclear Information System (INIS)

    Parks, G.T.

    1988-01-01

    Any reactor fuel management strategy must specify the fuel design, batch sizes, loading configurations, and operational procedures for each cycle. To permit detailed design studies, the complex core characteristics must necessarily be computer modeled. Thus, the identification of an optimal fuel cycle design represents an optimization problem with a nonlinear objective function (OF), nonlinear safety constraints, many control variables, and no direct derivative information. Most available library routines cannot tackle such problems; this paper introduces an intelligent stochastic optimization routine that can. There has been considerable interest recently in the application of stochastic methods to difficult optimization problems, based on the statistical mechanics algorithms originally attributed to Metropolis. Previous work showed that, in optimizing the performance of a British advanced gas-cooled reactor fuel stringer, a rudimentary version of the Metropolis algorithm performed as efficiently as the only suitable routine in the Numerical Algorithms Group library. Since then the performance of the Metropolis algorithm has been considerably enhanced by the introduction of self-tuning capabilities by which the routine adjusts its control parameters and search pattern as it progresses. Both features can be viewed as examples of artificial intelligence, in which the routine uses the accumulation of data, or experience, to guide its future actions

  5. "Mr. Database" : Jim Gray and the History of Database Technologies.

    Science.gov (United States)

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g. leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  6. NREL: U.S. Life Cycle Inventory Database - About the LCI Database Project

    Science.gov (United States)

    About the LCI Database Project The U.S. Life Cycle Inventory (LCI) Database is a publicly available database that allows users to objectively review and compare analysis results that are based on similar data sources. NREL maintains it as a source of critically reviewed LCI data through its LCI Database Project. NREL's High-Performance

  7. Effect of Uncertainties in CO2 Property Databases on the S-CO2 Compressor Performance

    International Nuclear Information System (INIS)

    Lee, Je Kyoung; Lee, Jeong Ik; Ahn, Yoonhan; Kim, Seong Gu; Cha, Je Eun

    2013-01-01

    Various S-CO2 Brayton cycle experiment facilities are under construction or in operation for demonstration of the technology. However, during data analysis, S-CO2 property databases are widely used to predict the performance and characteristics of the S-CO2 Brayton cycle. Thus, a reliable property database is very important before any experiment data analysis or calculation. In this paper, the deviation between two different property databases widely used for data analysis is identified using three selected properties for comparison: Cp, density and enthalpy. Furthermore, the effect of the above-mentioned deviation on the analysis of test data is briefly discussed. This deviation can introduce critical error into the results of the test data analysis. As S-CO2 Brayton cycle researchers know, CO2 near the critical point shows dramatic changes in thermodynamic properties. Thus, it is true that a potential error source of property prediction exists in CO2 properties near the critical point. During experiment data analysis with an S-CO2 Brayton cycle experiment facility, thermodynamic properties are always involved in predicting the component performance and characteristics. Thus, construction or definition of a precise CO2 property database should be carried out to develop Korean S-CO2 Brayton cycle technology
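
    The record's point about property sensitivity near the critical point is easy to reproduce with an open property library. The sketch below uses CoolProp (an assumption for illustration; it is not necessarily one of the databases compared in the paper) to sweep CO2 density and specific heat across the critical region (roughly 304.13 K, 7.377 MPa) at a fixed 7.5 MPa.

      # Requires: pip install CoolProp
      from CoolProp.CoolProp import PropsSI

      p = 7.5e6  # Pa, just above the critical pressure of CO2
      for t in (300.0, 304.0, 305.0, 306.0, 310.0):
          rho = PropsSI("D", "T", t, "P", p, "CO2")        # density, kg/m3
          cp = PropsSI("CPMASS", "T", t, "P", p, "CO2")    # specific heat, J/(kg K)
          print(f"T = {t:6.1f} K: rho = {rho:8.2f} kg/m3, cp = {cp:10.1f} J/(kg K)")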

  8. Reliability database development and plant performance improvement effort at Korea Hydro and Nuclear Power Co

    International Nuclear Information System (INIS)

    Oh, S. J.; Hwang, S. W.; Na, J. H.; Lim, H. S.

    2008-01-01

    Nuclear utilities in recent years have focused on improved plant performance and equipment reliability. In the U.S., there is a movement toward process integration; examples are the INPO AP-913 equipment reliability program and the standard nuclear performance model developed by NEI. The synergistic effect of an integrated approach can be far greater than the individual effects of each program. In Korea, PSA for all Korean NPPs (Nuclear Power Plants) has been completed. Plant performance monitoring and improvement is an important goal for KHNP (Korea Hydro and Nuclear Power Company), and a risk monitoring system called RIMS has been developed for all nuclear plants. KHNP is in the process of voluntarily implementing a maintenance rule program similar to that in the U.S. In the future, KHNP would like to expand the effort to an equipment reliability program and to achieve the highest equipment reliability and improved plant performance. For improving equipment reliability, the current trend is moving from corrective maintenance toward preventive/predictive maintenance. With the emphasis on preventive maintenance, the failure cause and the operation history and environment are important. Hence, the development of an accurate reliability database is necessary. Furthermore, the database should be updated regularly and maintained as a living program to reflect the current status of equipment reliability. This paper examines the development of the reliability database system and its application to maintenance optimization or Risk Informed Applications (RIA). (authors)

  9. Eosinophilia in routine blood samples as a biomarker for solid tumor development

    DEFF Research Database (Denmark)

    Andersen, Christen Bertel L; Siersma, V.D.; Hasselbalch, H.C.

    2014-01-01

    eosinophilia in routine blood samples as a potential biomarker of solid tumor development in a prospective design. MATERIAL AND METHODS: From the Copenhagen Primary Care Differential Count (CopDiff) Database, we identified 356 196 individuals with at least one differential cell count (DIFF) encompassing...... was increased with mild eosinophilia [OR 1.93 (CI 1.29-2.89), p = 0.0013]. No associations with eosinophilia were observed for the remaining solid cancers. CONCLUSION: We demonstrate that eosinophilia in routine blood samples associates with an increased risk of bladder cancer. Our data emphasize...... that additional preclinical studies are needed in order to shed further light on the role of eosinophils in carcinogenesis, where it is still unknown whether the cells contribute to tumor immune surveillance or neoplastic evolution....

  10. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases
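
    To make the division of labour concrete, a toy stand-in for a CREATE-SCHEMA-style generator is sketched below; the real tool also emits FORTRAN declarations and precompiled SQL calls, which this hypothetical Python fragment does not attempt.

      def create_schema_sql(table, columns):
          """Tiny stand-in for a schema generator: emit the SQL that creates a
          table from a simple (name, type) column specification."""
          cols = ",\n    ".join(f"{name} {sqltype}" for name, sqltype in columns)
          return f"CREATE TABLE {table} (\n    {cols}\n);"

      print(create_schema_sql("sample", [
          ("sample_id", "INTEGER PRIMARY KEY"),
          ("received", "DATE NOT NULL"),
          ("analyst", "VARCHAR(40)"),
      ]))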

  11. 75 FR 40014 - Privacy Act of 1974, as Amended; Proposed System of Records and Routine Use Disclosures

    Science.gov (United States)

    2010-07-13

    SOCIAL SECURITY ADMINISTRATION Privacy Act of 1974, as Amended; Proposed System of Records and Routine Use Disclosures. AGENCY: Social Security Administration (SSA). ACTION: Proposed System of Records... SYSTEM NAME: Economic Recovery List (ERL) Database, Social Security Administration. SYSTEM CLASSIFICATION: None. SYSTEM...

  12. Comparison of performance of tile drainage routines in SWAT 2009 and 2012 in an extensively tile-drained watershed in the Midwest

    Directory of Open Access Journals (Sweden)

    T. Guo

    2018-01-01

    Full Text Available Subsurface tile drainage systems are widely used in agricultural watersheds in the Midwestern US and enable the Midwest area to become highly productive agricultural lands, but can also create environmental problems, for example nitrate-N contamination associated with drainage waters. The Soil and Water Assessment Tool (SWAT) has been used to model watersheds with tile drainage. SWAT2012 revisions 615 and 645 provide new tile drainage routines. However, few studies have used these revisions to study tile drainage impacts at both field and watershed scales. Moreover, SWAT2012 revision 645 improved the soil moisture based curve number calculation method, which has not been fully tested. This study used long-term (1991-2003) field site and river station data from the Little Vermilion River (LVR) watershed to evaluate performance of tile drainage routines in SWAT2009 revision 528 (the old routine) and SWAT2012 revisions 615 and 645 (the new routine). Both the old and new routines provided reasonable but unsatisfactory (NSE < 0.5) uncalibrated flow and nitrate loss results for a mildly sloped watershed with low runoff. The calibrated monthly tile flow, surface flow, nitrate-N in tile and surface flow, sediment and annual corn and soybean yield results from SWAT with the old and new tile drainage routines were compared with observed values. Generally, the new routine provided acceptable simulated tile flow (NSE = 0.48-0.65) and nitrate in tile flow (NSE = 0.48-0.68) for field sites with random pattern tile and constant tile spacing, while the old routine simulated tile flow and nitrate in tile flow results for the field site with constant tile spacing were unacceptable (NSE = 0.00-0.32 and -0.29-0.06, respectively). The new modified curve number calculation method in revision 645 (NSE = 0.50-0.81) better simulated surface runoff than revision 615 (NSE = -0.11-0.49). The calibration

  13. Evaluation of the Performance of Routine Information System Management (PRISM framework: evidence from Uganda

    Directory of Open Access Journals (Sweden)

    Aqil Anwer

    2010-07-01

    Full Text Available Abstract. Background: Sound policy, resource allocation and day-to-day management decisions in the health sector require timely information from routine health information systems (RHIS). In most low- and middle-income countries, the RHIS is viewed as being inadequate in providing quality data and continuous information that can be used to help improve health system performance. In addition, there is limited evidence on the effectiveness of RHIS strengthening interventions in improving data quality and use. The purpose of this study is to evaluate the usefulness of the newly developed Performance of Routine Information System Management (PRISM) framework, which consists of a conceptual framework and associated data collection and analysis tools to assess, design, strengthen and evaluate RHIS. The specific objectives of the study are: (a) to assess the reliability and validity of the PRISM instruments and (b) to assess the validity of the PRISM conceptual framework. Methods: Facility- and worker-level data were collected from 110 health care facilities in twelve districts in Uganda in 2004 and 2007 using records reviews, structured interviews and self-administered questionnaires. The analysis procedures include Cronbach's alpha to assess internal consistency of selected instruments, test-retest analysis to assess the reliability and sensitivity of the instruments, and bivariate and multivariate statistical techniques to assess validity of the PRISM instruments and conceptual framework. Results: Cronbach's alpha analysis suggests high reliability (0.7 or greater) for the indices measuring a promotion of a culture of information, RHIS tasks self-efficacy and motivation. The study results also suggest that a promotion of a culture of information influences RHIS tasks self-efficacy, RHIS tasks competence and motivation, and that self-efficacy and the presence of RHIS staff have a direct influence on the use of RHIS information, a key aspect of RHIS performance
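
    For reference, the Cronbach's alpha used here is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with made-up item scores, not the study's data:

      import statistics

      def cronbach_alpha(items):
          """items: one inner list per questionnaire item, same respondents in each."""
          k = len(items)
          item_vars = sum(statistics.variance(item) for item in items)
          totals = [sum(scores) for scores in zip(*items)]
          return (k / (k - 1)) * (1 - item_vars / statistics.variance(totals))

      # Illustrative 3-item index scored by 5 respondents.
      items = [[4, 5, 3, 4, 5],
               [3, 5, 2, 4, 4],
               [4, 4, 3, 5, 5]]
      print(f"alpha = {cronbach_alpha(items):.2f}")  # 0.7+ is the usual reliability bar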

  14. Med-records: an ADD database of AAEC medical records since 1966

    International Nuclear Information System (INIS)

    Barry, J.M.; Pollard, J.P.; Tucker, A.D.

    1986-08-01

    Since its inception in 1958 most of the staff of the AAEC Research Establishment at Lucas Heights have had annual medical examinations. Medical information accrued since 1966 has been collected as an ADD database to allow ad hoc enquiries to be made against the data. Details are given of the database schema and numerous support routines ranging from the integrity checking of input data to analysis and plotting of the summary results

  15. The SACADA database for human reliability and human performance

    International Nuclear Information System (INIS)

    James Chang, Y.; Bley, Dennis; Criscione, Lawrence; Kirwan, Barry; Mosleh, Ali; Madary, Todd; Nowell, Rodney; Richards, Robert; Roth, Emilie M.; Sieben, Scott; Zoulis, Antonios

    2014-01-01

    Lack of appropriate and sufficient human performance data has been identified as a key factor affecting human reliability analysis (HRA) quality, especially in the estimation of human error probability (HEP). The Scenario Authoring, Characterization, and Debriefing Application (SACADA) database was developed by the U.S. Nuclear Regulatory Commission (NRC) to address this data need. An agreement between NRC and the South Texas Project Nuclear Operating Company (STPNOC) was established to support the SACADA development, with the aim of making the SACADA tool suitable for implementation in nuclear power plants' operator training programs to collect operator performance information. The collected data would support the STPNOC's operator training program and be shared with the NRC for improving HRA quality. This paper discusses the SACADA data taxonomy, the theoretical foundation, the prospective data to be generated from the SACADA raw data to inform human reliability and human performance, and the considerations on the use of simulator data for HRA. Each SACADA data point consists of two information segments: context and performance results. Context is a characterization of the performance challenges to task success. The performance results are the results of performing the task. The data taxonomy uses a macrocognitive functions model for the framework. At a high level, information is classified according to the macrocognitive functions of detecting the plant abnormality, understanding the abnormality, deciding the response plan, executing the response plan, and team-related aspects (i.e., communication, teamwork, and supervision). The data are expected to be useful for analyzing the relations between context, error modes and error causes in human performance

  16. [Use of PubMed to improve evidence-based medicine in routine urological practice].

    Science.gov (United States)

    Rink, M; Kluth, L A; Shariat, S F; Chun, F K; Fisch, M; Dahm, P

    2013-03-01

    Applying evidence-based medicine in daily clinical practice is the basis of patient-centered medicine and knowledge of accurate literature acquisition skills is necessary for informed clinical decision-making. PubMed is an easy accessible, free bibliographic database comprising over 21 million citations from the medical field, life-science journals and online books. The article summarizes the effective use of PubMed in routine urological clinical practice based on a common case scenario. This article explains the simple use of PubMed to obtain the best search results with the highest evidence. Accurate knowledge about the use of PubMed in routine clinical practice can improve evidence-based medicine and also patient treatment.
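
    The article covers interactive PubMed use; for completeness, the same searches can also be scripted against NCBI's public E-utilities interface. A minimal sketch follows (the query term is an arbitrary example, not taken from the article):

      import json
      import urllib.parse
      import urllib.request

      term = ('"evidence-based medicine"[MeSH] AND urology '
              'AND randomized controlled trial[pt]')
      url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
             + urllib.parse.urlencode({"db": "pubmed", "term": term,
                                       "retmax": 5, "retmode": "json"}))
      with urllib.request.urlopen(url) as resp:
          result = json.load(resp)["esearchresult"]
      print(result["count"], result["idlist"])  # hit count and the first PMIDs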

  17. The high-performance database archiver for the LHC experiments

    CERN Document Server

    González-Berges, M

    2007-01-01

    Each of the Large Hadron Collider (LHC) experiments will be controlled by a large distributed system built with the Supervisory Control and Data Acquisition (SCADA) tool Prozeßvisualisierungs- und Steuerungsystem (PVSS). There will be in the order of 150 computers and one million input/output parameters per experiment. The values read from the hardware, the alarms generated and the user actions will be archived for the later physics analysis, the operation and the debugging of the control system itself. Although the original PVSS implementation of a database archiver was appropriate for standard industrial use, the performance was not sufficient for the experiments. A collaboration was setup between CERN and ETM, the company that develops PVSS. Changes in the architecture and several optimizations were made and tested in a system of a comparable size to the final ones. As a result, we have been able to improve the performance by more than one order of magnitude, and what is more important, we now have a scal...

  18. Performance of a real-time PCR assay in routine bovine mastitis diagnostics compared with in-depth conventional culture.

    Science.gov (United States)

    Hiitiö, Heidi; Riva, Rauna; Autio, Tiina; Pohjanvirta, Tarja; Holopainen, Jani; Pyörälä, Satu; Pelkonen, Sinikka

    2015-05-01

    Reliable identification of the aetiological agent is crucial in mastitis diagnostics. Real-time PCR is a fast, automated tool for detecting the most common udder pathogens directly from milk. In this study, aseptically taken quarter milk samples were analysed with a real-time PCR assay (Thermo Scientific PathoProof Mastitis Complete-12 Kit, Thermo Fisher Scientific Ltd.) and by semi-quantitative, in-depth bacteriological culture (BC). The aim of the study was to evaluate the diagnostic performance of the real-time PCR assay in routine use. A total of 294 quarter milk samples from routine mastitis cases were cultured in the national reference laboratory of Finland and examined with real-time PCR. With BC, 251 out of 294 (85.7%) of the milk samples had at least one colony on the plate, and 38 samples were considered contaminated. In the PCR mastitis assay, DNA of target species was amplified in 244 samples out of 294 (83.0%). The most common bacterial species detected in the samples, irrespective of the diagnostic method, was the coagulase-negative staphylococci (CNS) group (later referred to as Staphylococcus spp.), followed by Staphylococcus aureus. Sensitivity (Se) and specificity (Sp) for the PCR assay to provide a positive Staph. aureus result were 97.0 and 95.8% compared with BC. For Staphylococcus spp., the corresponding figures were 86.7 and 75.4%. Our results imply that PCR performed well as a diagnostic tool to detect Staph. aureus but may be too nonspecific for Staphylococcus spp. in routine use with the current cut-off Ct value (37.0). Using PCR as the only microbiological method for mastitis diagnostics, the clinical relevance of the results should be carefully considered before further decisions, for instance antimicrobial treatment, especially when minor pathogens with a low amount of DNA have been detected. Introducing the concept of contaminated samples should also be considered.
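
    The quoted Se/Sp figures follow from a standard 2x2 comparison against culture as the reference method. A minimal sketch; the counts below are invented purely to roughly reproduce the Staph. aureus percentages above, since the abstract does not give the raw table.

      def sensitivity_specificity(tp, fp, fn, tn):
          """Diagnostic performance of an index test against a reference method."""
          return tp / (tp + fn), tn / (tn + fp)

      # Hypothetical 2x2 counts (PCR vs. culture), chosen for illustration only.
      se, sp = sensitivity_specificity(tp=64, fp=9, fn=2, tn=204)
      print(f"Se = {100 * se:.1f}%, Sp = {100 * sp:.1f}%")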

  19. Database usage and performance for the Fermilab Run II experiments

    International Nuclear Information System (INIS)

    Bonham, D.; Box, D.; Gallas, E.; Guo, Y.; Jetton, R.; Kovich, S.; Kowalkowski, J.; Kumar, A.; Litvintsev, D.; Lueking, L.; Stanfield, N.; Trumbo, J.; Vittone-Wiersma, M.; White, S.P.; Wicklund, E.; Yasuda, T.; Maksimovic, P.

    2004-01-01

    The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has presented major challenges to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab, and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open-source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described.

  20. An interferon-gamma release assay test performs well in routine screening for tuberculosis

    DEFF Research Database (Denmark)

    Vestergaard Danielsen, Allan; Fløe, Andreas; Lillebæk, Troels

    2014-01-01

    Introduction: A positive interferon-gamma release assay (IGRA) is regarded as proof of latent Mycobacterium tuberculosis infection. We conducted an evaluation of the IGRA test “T-SPOT.TB” to test its performance during clinical routine use by analysing the positivity rate and odds, effect of season...... and sensitivity. Material and methods: Data from T-SPOT.TB testing together with age and test indications (anti-tumour necrosis factor alpha (TNFα) candidate, contact investigation or suspicion of tuberculosis (TB)) were combined with mycobacteria culture results. Results: A total of 1,809 patients were tested....... Conclusive results were achieved for 1,780 patients (98.4%). Among these, 4.6% of anti-TNFα candidates, 19.3% of contacts and 24.4% of TB suspects tested positive. Compared with anti-TNFα candidates, the odds for a positive result were significantly higher for contact investigations (odds ratio (OR), mean...

  1. YUCSA: A CLIPS expert database system to monitor academic performance

    Science.gov (United States)

    Toptsis, Anestis A.; Ho, Frankie; Leindekar, Milton; Foon, Debra Low; Carbonaro, Mike

    1991-01-01

    The York University CLIPS Student Administrator (YUCSA), an expert database system implemented in C Language Integrated Processing System (CLIPS), for monitoring the academic performance of undergraduate students at York University, is discussed. The expert system component in the system has already been implemented for two major departments, and it is under testing and enhancement for more departments. Also, more elaborate user interfaces are under development. We describe the design and implementation of the system, problems encountered, and immediate future plans. The system has excellent maintainability and it is very efficient, taking less than one minute to complete an assessment of one student.

  2. A Case for Database Filesystems

    Energy Technology Data Exchange (ETDEWEB)

    Adams, P A; Hax, J C

    2009-05-13

    Data intensive science is offering new challenges and opportunities for Information Technology and traditional relational databases in particular. Database filesystems offer the potential to store Level Zero data and analyze Level 1 and Level 3 data within the same database system [2]. Scientific data is typically composed of both unstructured files and scalar data. Oracle SecureFiles is a new database filesystem feature in Oracle Database 11g that is specifically engineered to deliver high performance and scalability for storing unstructured or file data inside the Oracle database. SecureFiles presents the best of both the filesystem and the database worlds for unstructured content. Data stored inside SecureFiles can be queried or written at performance levels comparable to that of traditional filesystems while retaining the advantages of the Oracle database.

  3. Differentiation of several interstitial lung disease patterns in HRCT images using support vector machine: role of databases on performance

    Science.gov (United States)

    Kale, Mandar; Mukhopadhyay, Sudipta; Dash, Jatindra K.; Garg, Mandeep; Khandelwal, Niranjan

    2016-03-01

    Interstitial lung disease (ILD) is a complicated group of pulmonary disorders. High-resolution computed tomography (HRCT) is considered the best imaging technique for the analysis of different pulmonary disorders. HRCT findings can be categorised into several patterns, viz. consolidation, emphysema, ground-glass opacity, nodular, normal, etc., based on their texture-like appearance. Clinicians often find it difficult to diagnose these patterns because of their complex nature. In such a scenario, a computer-aided diagnosis system could help clinicians to identify the patterns. Several approaches have been proposed for the classification of ILD patterns, including the computation of textural features and the training/testing of classifiers such as artificial neural networks (ANN) and support vector machines (SVM). In this paper, wavelet features are calculated from two different ILD databases, the publicly available MedGIFT ILD database and a private ILD database, followed by a performance evaluation of ANN and SVM classifiers in terms of average accuracy. It is found that the average classification accuracy of SVM is greater than that of ANN when trained and tested on the same database. The investigation continued by testing the variation in classifier accuracy when training and testing are performed on alternate databases, and when training and testing use a database formed by merging samples of the same class from the two individual databases. The average classification accuracy drops when two independent databases are used for training and testing, respectively, and improves significantly when the classifiers are trained and tested on the merged database. This indicates the dependency of classification accuracy on the training data. It is observed that SVM outperforms ANN when the same database is used for training and testing.
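
    As a rough sketch of this cross-database protocol (with synthetic stand-ins for the wavelet features and pattern labels, since the MedGIFT and private data are not reproduced here), the comparison could be set up with scikit-learn as follows:

        # Cross-database comparison of SVM and ANN classifiers.
        # X_a/y_a and X_b/y_b are placeholders for wavelet-feature matrices and
        # ILD pattern labels (5 classes) from two different databases.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        X_a, y_a = rng.normal(size=(200, 32)), rng.integers(0, 5, 200)
        X_b, y_b = rng.normal(size=(200, 32)), rng.integers(0, 5, 200)

        for name, clf in [("SVM", SVC(kernel="rbf")),
                          ("ANN", MLPClassifier(max_iter=500))]:
            clf.fit(X_a, y_a)                              # train on database A
            same = accuracy_score(y_a, clf.predict(X_a))   # same-database check
            cross = accuracy_score(y_b, clf.predict(X_b))  # alternate-database test
            print(f"{name}: same-db {same:.2f}, cross-db {cross:.2f}")

    Training on the merged database, as the study also does, amounts to fitting on np.vstack([X_a, X_b]) with the concatenated labels.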

  4. Who should be performing routine abdominal ultrasound? A prospective double-blind study comparing the accuracy of radiologist and radiographer

    International Nuclear Information System (INIS)

    Leslie, A.; Lockyer, H.; Virjee, J.P.

    2000-01-01

    AIM: To compare the accuracy of radiographers and radiologists in routine abdominal ultrasound. MATERIALS AND METHODS: One hundred consecutive patients attending for routine abdominal ultrasound were included. Each patient was examined by both a radiographer and a radiologist. Both operators noted their findings and wrote a concluding report without conferring. Reports were compared. Where there was disagreement the patient was either re-examined by another radiologist or had further investigation. RESULTS: Of 100 patients, 52 were men and 48 were women. The age range was 19-88 years (median 52 years). Thirty-seven patients had renal tract ultrasound, one had an aortic ultrasound and 62 had general upper abdominal ultrasound. In 44 cases both operators reported the examination as normal. In 49 cases both operators reported the examinations as abnormal and there was complete agreement between the operators. In seven cases there was not complete agreement between operators. Three of these disagreements were considered minor and four major. In three of the seven cases the radiographer was correct, and in four the radiologist was correct. CONCLUSION: Experienced radiographers and radiologists are highly accurate in performing and interpreting routine abdominal sonography. Both operators missed a small minority of abnormalities. There was no statistically significant difference in the accuracy of radiographers and radiologists.

  5. Comparison of different references for brain perfusion SPECT quantification in clinical routine

    International Nuclear Information System (INIS)

    Olivera J, P.; Acton, P.; Costa, D.

    1997-01-01

    Full text: We used 40 brain perfusion SPECT studies from the INM, UCL database to investigate the performance of several references (denominators) in the calculation of perfusion ratios with single photon emission tomography (SPET) within a routine clinical service. According to clinical diagnosis and previous SPECT findings, 4 groups were identified, composed of: 10 controls (C, 23 to 84 y old); 10 myalgic-encephalomyelitis / chronic fatigue syndrome (ME/CFS, 22 to 61 y old); 10 major depression (MD, 24 to 68 y old); and 10 temporal lobe epilepsy (TLE, 19 to 39 y old). Routine protocols for processing were used and the analysis was blind to group classification. Brain perfusion ratios were calculated using 7 different references: the hemicerebellum with the higher counts (Cer), total counts in a 4-pixel slice through the basal ganglia (BG), average counts per pixel in the visual cortex (VC), average counts per pixel in the white matter (WM), total acquired counts (TAC), total reconstructed counts (TRC) and maximum counts per pixel in the entire study (MAXX). Unpaired tests to compare the different diagnostic groups and the coefficient of variation (CV) to assess the reliability of each reference, followed by ANOVA, were the statistical tests used. The lowest mean CVs were found with VC (4.8%) and TRC (5.1%), with all the others significantly higher (p<0.0001). The range of CVs for Cer was the lowest (3.7% to 5.9%). Consistent differentiation between diagnostic groups and controls was only obtained with Cer. In conclusion, it appears that for routine clinical services Cer is the most reliable reference, except for diseases affecting the cerebellum; in these cases TRC or VC should be preferred. (authors)

  6. Performance of Point and Range Queries for In-memory Databases using Radix Trees on GPUs

    Energy Technology Data Exchange (ETDEWEB)

    Alam, Maksudul [ORNL; Yoginath, Srikanth B [ORNL; Perumalla, Kalyan S [ORNL

    2016-01-01

    In in-memory database systems augmented by hardware accelerators, accelerating the index searching operations can greatly increase the runtime performance of database queries. Recently, adaptive radix trees (ART) have been shown to provide very fast index search implementation on the CPU. Here, we focus on an accelerator-based implementation of ART. We present a detailed performance study of our GPU-based adaptive radix tree (GRT) implementation over a variety of key distributions, synthetic benchmarks, and actual keys from music and book data sets. The performance is also compared with other index-searching schemes on the GPU. GRT on modern GPUs achieves some of the highest rates of index searches reported in the literature. For point queries, a throughput of up to 106 million and 130 million lookups per second is achieved for sparse and dense keys, respectively. For range queries, GRT yields 600 million and 1000 million lookups per second for sparse and dense keys, respectively, on a large dataset of 64 million 32-bit keys.
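
    The core idea of a radix-tree index, descending one key byte per level instead of comparing whole keys, can be shown in toy form; this plain-dict Python sketch only illustrates the point-query path, without the adaptive node sizes or GPU-parallel traversal of ART/GRT:

        # Toy byte-wise radix index over 32-bit keys: one dict level per byte.
        class RadixIndex:
            def __init__(self):
                self.root = {}

            def insert(self, key: int, value):
                node = self.root
                for byte in key.to_bytes(4, "big"):  # descend one byte per level
                    node = node.setdefault(byte, {})
                node["value"] = value

            def lookup(self, key: int):
                node = self.root
                for byte in key.to_bytes(4, "big"):
                    node = node.get(byte)
                    if node is None:
                        return None                  # key absent
                return node.get("value")

        idx = RadixIndex()
        idx.insert(42, "row-42")
        print(idx.lookup(42))  # -> "row-42"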

  7. External Agents' Effect on Routine Dynamics:Lack of Compliance Resulting in Routine Breakdown

    OpenAIRE

    Busse Hansen, Nicolai

    2014-01-01

    Prior investigations on organizational routines have called for research to enlighten our understanding of how social actors establish and maintain routines, as well as the causes of their disruption. The present paper contributes to this call by conducting systematic microethnographic analyses of naturally occurring interactional routine data in the form of recordings of job interviews in an international oil contractor company. The term interactional routine is used to describe recu...

  8. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: Service-level requirements and risk management Building and evolving an architecture for operational visibility ...

  9. A database for CO2 Separation Performances of MOFs based on Computational Materials Screening.

    Science.gov (United States)

    Altintas, Cigdem; Avci, Gokay; Daglar, Hilal; Nemati Vesali Azar, Ayda; Velioglu, Sadiye; Erucar, Ilknur; Keskin, Seda

    2018-05-03

    Metal organic frameworks (MOFs) have been considered as great candidates for CO2 capture. Considering the very large number of available MOFs, high-throughput computational screening plays a critical role in identifying the top performing materials for target applications in a time-effective manner. In this work, we used molecular simulations to screen the most recent and complete MOF database for identifying the most promising materials for CO2 separation from flue gas (CO2/N2) and landfill gas (CO2/CH4) under realistic operating conditions. We first validated our approach by comparing the results of our molecular simulations for the CO2 uptakes, CO2/N2 and CO2/CH4 selectivities of various types of MOFs with the available experimental data. We then computed binary CO2/N2 and CO2/CH4 mixture adsorption data for the entire MOF database and used these results to calculate several adsorbent selection metrics such as selectivity, working capacity, adsorbent performance score, regenerability, and separation potential. MOFs were ranked based on the combination of these metrics and the top performing MOF adsorbents that can achieve CO2/N2 and CO2/CH4 separations with high performance were identified. Molecular simulations for the adsorption of a ternary CO2/N2/CH4 mixture were performed for these top materials in order to provide a more realistic performance assessment of MOF adsorbents. Structure-performance analysis showed that MOFs with ΔQ>30 kJ/mol, 3.8 Å≤PLD≤5 Å, 5 Å≤LCD≤7.5 Å, 0.5≤ϕ≤0.75, SA≤1,000 m2/g, ρ>1 g/cm3 are the best candidates for selective separation of CO2 from flue gas and landfill gas. This information will be very useful to design novel MOFs with the desired structural features that can lead to high CO2 separation potentials. Finally, an online, freely accessible database https://cosmoserc.ku.edu.tr was established, for the first time in the literature, which reports all computed adsorbent metrics of 3,816 MOFs for CO2/N2, CO2/CH4
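
    The adsorbent selection metrics named above have conventional closed-form definitions once the binary mixture uptakes are known; the sketch below uses those common forms with invented uptake values, and the paper's exact definitions may differ in detail:

        # Common adsorbent selection metrics from binary-mixture uptakes (mol/kg).
        # All numbers are invented for illustration.
        y_co2, y_n2 = 0.15, 0.85            # flue-gas composition
        q_co2_ads, q_n2_ads = 3.2, 0.4      # uptakes at adsorption pressure
        q_co2_des = 0.9                     # CO2 uptake at desorption pressure

        selectivity = (q_co2_ads / q_n2_ads) * (y_n2 / y_co2)
        working_capacity = q_co2_ads - q_co2_des             # deliverable CO2
        aps = selectivity * working_capacity                 # adsorbent performance score
        regenerability = working_capacity / q_co2_ads * 100  # % recovered per cycle

        print(f"S = {selectivity:.0f}, dN = {working_capacity:.2f} mol/kg, "
              f"APS = {aps:.0f}, R = {regenerability:.0f}%")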

  10. DANBIO-powerful research database and electronic patient record

    DEFF Research Database (Denmark)

    Hetland, Merete Lund

    2011-01-01

    an overview of the research outcome and presents the cohorts of RA patients. The registry, which is approved as a national quality registry, includes patients with RA, PsA and AS, who are followed longitudinally. Data are captured electronically from the source (patients and health personnel). The IT platform...... as an electronic patient 'chronicle' in routine care, and at the same time provides a powerful research database....

  11. Specialist Bibliographic Databases

    OpenAIRE

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A.; Trukhachev, Vladimir I.; Kostyukova, Elena I.; Gerasimov, Alexey N.; Kitas, George D.

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and d...

  12. The Utrecht Health Project: Optimization of routine healthcare data for research

    International Nuclear Information System (INIS)

    Grobbee, Diederick E.; Hoes, Arno W.; Verheij, Theo J. M.; Schrijvers, Augustinus J. P.; Ameijden, Erik J. C. van; Numans, Mattijs E.

    2005-01-01

    Background. Research on the impact of changes in healthcare policy, developments in community and public health and determinants of health and disease during lifetime may effectively make use of routine healthcare data. These data, however, need to meet minimal criteria for quality and completeness. Research opportunities are further improved when routine data are supplemented with a standardized 'baseline' assessment of the full population. This formed the basis for a new study initiated in a newly developed large residential area in Leidsche Rijn, part of the city of Utrecht, the Netherlands. Methods. All new inhabitants are invited by their general practitioner to participate in the Utrecht Health Project (UHP). Informed consent is obtained and an individual health profile (IHP) is made by dedicated research nurses. The IHP is the starting point for the UHP research database as well as for the primary care electronic medical records. Follow-up data are collected through continuous linkage with the computerized medical files recorded by the general practitioners. UHP staff in each practice takes care of quality management of registration as well as data handling. Results. Currently, over 60% of invited new residents in the area have given informed consent, with participation steadily increasing. Discussion. The Utrecht Health Project combines key elements of traditional epidemiologic cohort studies with the current power of routine electronic medical record keeping in primary care. The research approach optimizes routine health care data for use in scientific research

  13. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working, as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed, when data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and

  14. An integrated data-analysis and database system for AMS 14C

    International Nuclear Information System (INIS)

    Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan

    2010-01-01

    AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  15. The embeddedness of selfish Routines

    DEFF Research Database (Denmark)

    Andersen, Poul Houman

    2001-01-01

    Routines have traditionally been seen as an organisational feature. However, like genes, routines may be carriers and initiators of organisations as well...

  16. The influence of the negative-positive ratio and screening database size on the performance of machine learning-based virtual screening.

    Science.gov (United States)

    Kurczab, Rafał; Bojarski, Andrzej J

    2017-01-01

    The machine learning-based virtual screening of molecular databases is a commonly used approach to identify hits. However, many aspects associated with training predictive models can influence the final performance and, consequently, the number of hits found. Thus, we performed a systematic study of the simultaneous influence of the proportion of negatives to positives in the testing set, the size of screening databases and the type of molecular representations on the effectiveness of classification. The results obtained for eight protein targets, five machine learning algorithms (SMO, Naïve Bayes, IBk, J48 and Random Forest), two types of molecular fingerprints (MACCS and CDK FP) and eight screening databases with different numbers of molecules confirmed our previous findings that increases in the ratio of negative to positive training instances greatly influenced most of the investigated parameters of the ML methods in simulated virtual screening experiments. However, the performance of screening was shown to also be highly dependent on the size of the molecular library. Generally, with the increasing size of the screened database, the optimal training ratio also increased, and this ratio can be rationalized using the proposed cost-effectiveness threshold approach. To increase the performance of machine learning-based virtual screening, the training set should be constructed in a way that considers the size of the screening database.
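
    A minimal version of the ratio experiment can be sketched as follows, with synthetic fingerprints standing in for actives and decoys (nothing here reproduces the paper's datasets or its Weka classifiers):

        # Vary the negative:positive training ratio and score a fixed library.
        # Synthetic vectors stand in for MACCS-sized (166-bit) fingerprints.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import precision_score

        rng = np.random.default_rng(1)
        pos = rng.normal(1.0, 1.0, size=(100, 166))   # "actives"
        neg = rng.normal(0.0, 1.0, size=(5000, 166))  # "decoys"
        lib_X = np.vstack([pos[50:], neg[2000:]])     # held-out screening library
        lib_y = np.array([1] * 50 + [0] * 3000)

        for ratio in (1, 5, 10, 20):                  # negatives per positive
            X = np.vstack([pos[:50], neg[:50 * ratio]])
            y = np.array([1] * 50 + [0] * 50 * ratio)
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X, y)
            hits = clf.predict(lib_X)
            print(f"ratio 1:{ratio}  precision = {precision_score(lib_y, hits):.2f}")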

  17. ISSUES IN MOBILE DISTRIBUTED REAL TIME DATABASES: PERFORMANCE AND REVIEW

    OpenAIRE

    VISHNU SWAROOP,; Gyanendra Kumar Gupta,; UDAI SHANKER

    2011-01-01

    The increase in small, handy electronic devices in computing fields has made computing more popular and useful in business. Tremendous advances in wireless networks and portable computing devices have led to the development of mobile computing. Support for real-time database systems with timing constraints, the availability of distributed databases, and ubiquitous computing have pulled the mobile database concept into a new form of technology: mobile distributed ...

  18. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, at present the open community version of MySQL and a single-instance Oracle database server. This article describes a technology approach to facing this challenge, the service-level agreement (SLA) that the project provides, and possible evolution scenarios.

  19. Attitudes, subjective norms, and intention to perform routine oral examination for oropharyngeal candidiasis as perceived by primary health-care providers in Nairobi Province

    NARCIS (Netherlands)

    Koyio, L.N.; Kikwilu, E.N.; Mulder, J.; Frencken, J.E.F.M.

    2013-01-01

    Objectives: To assess attitudes, subjective norms, and intentions of primary health-care (PHC) providers in performing routine oral examination for oropharyngeal candidiasis (OPC) during outpatient consultations. Methods: A 47-item Theory of Planned Behaviour-based questionnaire was developed and

  20. Accelerating the energy retrofit of commercial buildings using a database of energy efficiency performance

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Hong, Tianzhen; Piette, Mary Ann; Sawaya, Geof; Chen, Yixing; Taylor-Lange, Sarah C.

    2015-01-01

    Small and medium-sized commercial buildings can be retrofitted to significantly reduce their energy use; however, this is a huge challenge, as owners usually lack the expertise and resources to conduct the detailed on-site energy audits needed to identify and evaluate cost-effective energy technologies. This study presents DEEP (a database of energy efficiency performance) that provides a direct resource for quick retrofit analysis of commercial buildings. DEEP, compiled from the results of about ten million EnergyPlus simulations, enables an easy screening of ECMs (energy conservation measures) and retrofit analysis. The simulations utilize prototype models representative of small and mid-size offices and retail buildings in California climates. In the formulation of DEEP, large-scale EnergyPlus simulations were conducted on high performance computing clusters to evaluate hundreds of individual and packaged ECMs covering envelope, lighting, heating, ventilation, air-conditioning, plug-loads, and service hot water. The architecture and simulation environment to create DEEP is flexible and can expand to cover additional building types, additional climates, and new ECMs. In this study DEEP is integrated into a web-based retrofit toolkit, the Commercial Building Energy Saver, which provides a platform for energy retrofit decision making by querying DEEP and unearthing recommended ECMs, their estimated energy savings and financial payback. - Highlights: • A DEEP (database of energy efficiency performance) supports building retrofit. • DEEP is an SQL database with pre-simulated results from 10 million EnergyPlus runs. • DEEP covers 7 building types, 6 vintages, 16 climates, and 100 energy measures. • DEEP accelerates retrofit of small commercial buildings to save energy use and cost. • DEEP can be expanded and integrated with third-party energy software tools.

  1. Nuclear materials thermo-physical property database and property analysis using the database

    International Nuclear Information System (INIS)

    Jeong, Yeong Seok

    2002-02-01

    Knowledge and understanding of the thermo-physical properties of nuclear materials are necessary for the evaluation and analysis of steady-state and accident conditions of commercial and research reactors. In this study, a nuclear materials thermo-physical property database and home page were developed. As an application of this database, the thermal conductivity, heat capacity, enthalpy, and linear thermal expansion of fuel and cladding materials were analyzed, and the thermo-property models used in nuclear fuel performance evaluation codes were compared with the experimental data in the database. The comparison of the thermo-property models for UO2 fuel and cladding in major performance evaluation codes with the experimental data showed that both are similar

  2. A database of chlorophyll a in Australian waters

    Science.gov (United States)

    Davies, Claire H.; Ajani, Penelope; Armbrecht, Linda; Atkins, Natalia; Baird, Mark E.; Beard, Jason; Bonham, Pru; Burford, Michele; Clementson, Lesley; Coad, Peter; Crawford, Christine; Dela-Cruz, Jocelyn; Doblin, Martina A.; Edgar, Steven; Eriksen, Ruth; Everett, Jason D.; Furnas, Miles; Harrison, Daniel P.; Hassler, Christel; Henschke, Natasha; Hoenner, Xavier; Ingleton, Tim; Jameson, Ian; Keesing, John; Leterme, Sophie C.; James McLaughlin, M.; Miller, Margaret; Moffatt, David; Moss, Andrew; Nayar, Sasi; Patten, Nicole L.; Patten, Renee; Pausina, Sarah A.; Proctor, Roger; Raes, Eric; Robb, Malcolm; Rothlisberg, Peter; Saeck, Emily A.; Scanes, Peter; Suthers, Iain M.; Swadling, Kerrie M.; Talbot, Samantha; Thompson, Peter; Thomson, Paul G.; Uribe-Palomino, Julian; van Ruth, Paul; Waite, Anya M.; Wright, Simon; Richardson, Anthony J.

    2018-02-01

    Chlorophyll a is the most commonly used indicator of phytoplankton biomass in the marine environment. It is relatively simple and cost effective to measure when compared to phytoplankton abundance and is thus routinely included in many surveys. Here we collate 173,333 records of chlorophyll a collected since 1965 from Australian waters gathered from researchers on regular coastal monitoring surveys and ocean voyages into a single repository. This dataset includes the chlorophyll a values as measured from samples analysed using spectrophotometry, fluorometry and high performance liquid chromatography (HPLC). The Australian Chlorophyll a database is freely available through the Australian Ocean Data Network portal (https://portal.aodn.org.au/). These data can be used in isolation as an index of phytoplankton biomass or in combination with other data to provide insight into water quality, ecosystem state, and relationships with other trophic levels such as zooplankton or fish.

  3. Daily life activity routine discovery in hemiparetic rehabilitation patients using topic models.

    Science.gov (United States)

    Seiter, J; Derungs, A; Schuster-Amft, C; Amft, O; Tröster, G

    2015-01-01

    Monitoring natural behavior and activity routines of hemiparetic rehabilitation patients across the day can provide valuable progress information for therapists and patients and contribute to an optimized rehabilitation process. In particular, continuous patient monitoring could add type, frequency and duration of daily life activity routines and hence complement standard clinical scores that are assessed for particular tasks only. Machine learning methods have been applied to infer activity routines from sensor data. However, supervised methods require activity annotations to build recognition models and thus require extensive patient supervision. Discovery methods, including topic models, could provide patient routine information and deal with variability in activity and movement performance across patients. Topic models have been used to discover characteristic activity routine patterns of healthy individuals using activity primitives recognized from supervised sensor data. Yet, the applicability of topic models for hemiparetic rehabilitation patients and techniques to derive activity primitives without supervision need to be addressed. We investigate 1) whether a topic model-based activity routine discovery framework can infer activity routines of rehabilitation patients from wearable motion sensor data. 2) We compare the performance of our topic model-based activity routine discovery using rule-based and clustering-based activity vocabulary. We analyze the activity routine discovery in a dataset recorded with 11 hemiparetic rehabilitation patients during up to ten full recording days per individual in an ambulatory daycare rehabilitation center using wearable motion sensors attached to both wrists and the non-affected thigh. We introduce and compare rule-based and clustering-based activity vocabulary to process statistical and frequency acceleration features to activity words. Activity words were used for activity routine pattern discovery using topic models
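
    As a minimal sketch of this pipeline (invented activity words, and scikit-learn's LDA standing in for the paper's topic model), each recording day becomes a document of activity words and the fitted topics are read off as candidate routines:

        # Routine discovery with a topic model: days are "documents" of
        # activity words; LDA topics correspond to candidate routines.
        # The day strings below are invented placeholders.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        days = [
            "walk sit eat sit walk rest",       # one recording day per entry
            "therapy walk therapy rest sit",
            "eat sit rest sit eat walk",
            "therapy walk walk therapy rest",
        ]

        vectorizer = CountVectorizer().fit(days)
        X = vectorizer.transform(days)

        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
        vocab = vectorizer.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = [vocab[i] for i in topic.argsort()[-3:]]
            print(f"routine {k}: {top}")        # most probable activity words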

  4. Prevalence of incidental or unexpected findings on low-dose CT performed during routine SPECT/CT nuclear medicine studies

    International Nuclear Information System (INIS)

    Yap, Kelvin Kwok-Ho; Sutherland, Tom; Shafik-Eid, Raymond; Taubman, Kim; Schlicht, Stephen; Ramaseshan, Ganeshan

    2015-01-01

    In nuclear medicine, single-photon-emission computed tomography (SPECT) is often combined with ‘simultaneous’ low-dose CT (LDCT) to provide complementary anatomical and functional correlation. As a consequence, numerous incidental and unexpected findings may be detected on LDCT. Recognition of these findings and appropriate determination of their relevance can add to the utility of SPECT/CT. We aimed to evaluate the prevalence and categorise the relevance of incidental and unexpected findings on LDCT scans performed as part of routine SPECT/CT studies. All available LDCT scans performed as part of SPECT/CT studies at St. Vincent's Hospital Melbourne in the year 2013 were retrospectively reviewed. Two qualified radiologists independently reviewed the studies and any previous available imaging and categorised any detected incidental findings. A total of 2447 LDCT studies were reviewed. The relevance of the findings was classified according to a modified version of a scale used in the Colonography Reporting and Data System: E1 = normal or normal variant (28.0%); E2 = clinically unimportant (63.5%); E3 = likely unimportant or incompletely characterised (6.2%); E4 = potentially important (2.5%). Imaging specialists need to be cognisant of incidental and unexpected findings present on LDCT studies performed as part of SPECT/CT. Appropriate categorisation of findings and communication of potentially important findings to referring clinicians should form part of routine practice. The overall prevalence of potentially significant incidental and unexpected findings in our series was 8.7% (E3, 6.2%; E4, 2.5%) and was comparable to rates in other published imaging series.

  5. Routine chest X-ray in the allergy clinic

    International Nuclear Information System (INIS)

    Garcia-Barredo, M.R.; Usamentiaga, E.; Fidalgo, I.

    1997-01-01

    To determine whether routine chest X-ray is indicated in allergy patients when there is no evidence of cardiopulmonary involvement. A retrospective study was performed to analyze the indications and radiologic findings in 515 consecutive patients who underwent chest X-ray. Positive findings were considered to be any radiological sign that led to the performance of additional diagnostic measures or a change in the therapeutic management of the patient. Positive radiologic findings were observed in 39 cases (7.59%). Only two patients (0.38%) were diagnosed as having diseases that were susceptible to proper treatment. In one of them (0.19%), the failure to perform chest X-ray would have impeded the introduction of proper treatment. We do not recommend carrying out routine chest X-ray in this patient population. (Author) 7 refs

  6. The shortest path algorithm performance comparison in graph and relational database on a transportation network

    Directory of Open Access Journals (Sweden)

    Mario Miler

    2014-02-01

    Full Text Available In the field of geoinformation and transportation science, the shortest path is calculated on graph data mostly found in road and transportation networks. This data is often stored in various database systems. Many applications dealing with transportation networks require calculation of the shortest path. The objective of this research is to compare the performance of Dijkstra shortest-path calculation in PostgreSQL (with pgRouting) and in the Neo4j graph database, for the purpose of determining whether there is any difference in the speed of the calculation. Benchmarking was done on commodity hardware using the OpenStreetMap road network. The first assumption was that the Neo4j graph database would be well suited for shortest-path calculation on transportation networks, but this does not come without some cost. Memory proved to be an issue in the Neo4j setup when dealing with larger transportation networks.
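
    In practice the two systems are queried quite differently. A rough timing harness might look like the following, where the connection settings, the 'ways' table, the :Node label and the use of the APOC Dijkstra procedure are all assumptions to be adapted to the actual schema:

        # Time one Dijkstra query in PostgreSQL/pgRouting and in Neo4j.
        # Connection strings, table/label names and the APOC procedure are
        # assumed; adapt them to the actual OSM import.
        import time
        import psycopg2
        from neo4j import GraphDatabase

        SRC, DST = 1001, 2002  # arbitrary vertex ids

        pg = psycopg2.connect("dbname=osm user=router")
        t0 = time.perf_counter()
        with pg.cursor() as cur:
            cur.execute(
                "SELECT * FROM pgr_dijkstra("
                "'SELECT gid AS id, source, target, cost FROM ways', %s, %s)",
                (SRC, DST))
            cur.fetchall()
        print("pgRouting:", time.perf_counter() - t0, "s")

        driver = GraphDatabase.driver("bolt://localhost:7687",
                                      auth=("neo4j", "secret"))
        t0 = time.perf_counter()
        with driver.session() as session:
            session.run(
                "MATCH (a:Node {id: $src}), (b:Node {id: $dst}) "
                "CALL apoc.algo.dijkstra(a, b, 'ROAD', 'cost') "
                "YIELD path, weight RETURN weight",
                src=SRC, dst=DST).consume()
        print("Neo4j:", time.perf_counter() - t0, "s")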

  7. An integrated data-analysis and database system for AMS {sup 14}C

    Energy Technology Data Exchange (ETDEWEB)

    Kjeldsen, Henrik, E-mail: kjeldsen@phys.au.d [AMS 14C Dating Centre, Department of Physics and Astronomy, Aarhus University, Aarhus (Denmark); Olsen, Jesper [Department of Earth Sciences, Aarhus University, Aarhus (Denmark); Heinemeier, Jan [AMS 14C Dating Centre, Department of Physics and Astronomy, Aarhus University, Aarhus (Denmark)

    2010-04-15

    AMSdata is the name of a combined database and data-analysis system for AMS {sup 14}C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS {sup 14}C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  8. The Danish fetal medicine database

    DEFF Research Database (Denmark)

    Ekelund, Charlotte Kvist; Kopp, Tine Iskov; Tabor, Ann

    2016-01-01

    trimester ultrasound scan performed at all public hospitals in Denmark are registered in the database. Main variables/descriptive data: Data on maternal characteristics, ultrasonic, and biochemical variables are continuously sent from the fetal medicine units’Astraia databases to the central database via...... analyses are sent to the database. Conclusion: It has been possible to establish a fetal medicine database, which monitors first-trimester screening for chromosomal abnormalities and second-trimester screening for major fetal malformations with the input from already collected data. The database...

  9. A multicenter study of routine versus selective intraoperative leak testing for sleeve gastrectomy.

    Science.gov (United States)

    Bingham, Jason; Kaufman, Jedediah; Hata, Kai; Dickerson, James; Beekley, Alec; Wisbach, Gordon; Swann, Jacob; Ahnfeldt, Eric; Hawkins, Devon; Choi, Yong; Lim, Robert; Martin, Matthew

    2017-09-01

    Staple line leaks after sleeve gastrectomy are dreaded complications. Many surgeons routinely perform an intraoperative leak test (IOLT) despite little evidence to validate the reliability, clinical benefit, and safety of this procedure. To determine the efficacy of IOLT and whether routine use has any benefit over selective use. Eight teaching hospitals, including private, university, and military facilities. A multicenter, retrospective analysis over a 5-year period. The efficacy of the IOLT for identifying unsuspected staple line defects and for predicting postoperative leaks was evaluated. An anonymous survey was also collected reflecting surgeons' practices and beliefs regarding IOLT. From January 2010 through December 2014, 4284 patients underwent sleeve gastrectomy. Of these, 37 patients (0.9%) developed a postoperative leak, and 2376 patients (55%) received an IOLT. Only 2 patients (0.08%) had a positive finding. Subsequently, 21 patients with a negative IOLT developed a leak. IOLT demonstrated a sensitivity of only 8.7%. There was a nonsignificant trend toward increased leak rates when an IOLT was performed versus when IOLT was not performed. Leak rates were not statistically different between centers that routinely perform IOLT versus those that selectively perform IOLT. Routine IOLT had very poor sensitivity and was negative in 91% of patients who later developed postoperative leaks. The use of IOLT was not associated with a decrease in the incidence of postoperative leaks, and routine IOLT had no benefit over selective leak testing. IOLT should not be used as a quality indicator or "best practice" for bariatric surgery. Published by Elsevier Inc.

  10. The database for accelerator control in the CERN PS Complex

    International Nuclear Information System (INIS)

    Cuperus, J.H.

    1987-01-01

    The use of a database started 7 years ago and is an effort to separate logic from data so that programs and routines can do a larger number of operations on data structures without knowing a priori the contents of these structures. It is of great help in coping with the complexities of a system controlling many linked accelerators and storage rings

  11. Computer-aided diagnosis in routine mammography

    International Nuclear Information System (INIS)

    Sittek, H.; Perlet, C.; Helmberger, R.; Linsmeier, E.; Kessler, M.; Reiser, M.

    1998-01-01

    Purpose: Computer-aided diagnosis in mammography is a topic many study groups have been concerned with since the first presentation of a system for computer-aided interpretation in 1967. Currently, there is only one system available for clinical use in mammography, the CAD-System Image Checker (R2 Technology). The purpose of our prospective study was to evaluate whether the integration of the CAD-system into the routine of a radiological breast diagnosis unit is feasible. Results: After the installation of the CAD-system, 300 patients with 1110 mammograms were included for evaluation in the present study. In 54 of these cases histological examination was indicated due to suspicious criteria on conventional mammography. In 39 of 54 cases (72.2%) malignancy could be proven histologically. The CAD-system correctly marked 82.1% of the histologically verified carcinomas. 94.3% of all 1797 marks made by the CAD-system indicated normal or benign structures. Routinely performed CAD analysis prolonged patients' waiting time by about 15 min because the marks of the CAD system had to be interpreted in addition to the routine diagnostic investigations. Conclusion: Our experience with the use of the CAD-system in daily routine showed that CAD analysis can easily be integrated into a preexisting mammography unit. However, the diagnostic benefit is not yet clearly established. Since the rate of false negative marks by the CAD-system Image Checker is still high, the results of CAD analysis must be checked and corrected by an observer well experienced in mammography reading. (orig.) [de

  12. Database characterisation of HEP applications

    International Nuclear Information System (INIS)

    Piorkowski, Mariusz; Grancher, Eric; Topurov, Anton

    2012-01-01

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to the overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications; these characterisations were used to deliver improved predictability and scalability, as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  13. ZeBase: an open-source relational database for zebrafish laboratories.

    Science.gov (United States)

    Hensley, Monica R; Hassenplug, Eric; McPhail, Rodney; Leung, Yuk Fai

    2012-03-01

    ZeBase is an open-source relational database for zebrafish inventory. It is designed for the recording of genetic, breeding, and survival information of fish lines maintained in a single- or multi-laboratory environment. Users can easily access ZeBase through standard web-browsers anywhere on a network. Convenient search and reporting functions are available to facilitate routine inventory work; such functions can also be automated by simple scripting. Optional barcode generation and scanning are also built in for easy access to the information related to any fish. Further information about the database and an example implementation can be found at http://zebase.bio.purdue.edu.

  14. Applicability of thermodynamic database of radioactive elements developed for the Japanese performance assessment of HLW repository

    International Nuclear Information System (INIS)

    Yui, Mikazu; Shibata, Masahiro; Rai, Dhanpat; Ochs, Michael

    2003-01-01

    In 1999 Japan Nuclear Cycle Development Institute (JNC) published a second progress report (also known as H12 report) on high-level radioactive waste (HLW) disposal in Japan (JNC 1999). This report helped to develop confidence in the selected HLW disposal system and to establish the implementation body in 2000 for the disposal of HLW. JNC developed an in-house thermodynamic database for radioactive elements for performance analysis of the engineered barrier system (EBS) and the geosphere for H12 report. This paper briefly presents the status of the JNC's thermodynamic database and its applicability to perform realistic analyses of the solubilities of radioactive elements, evolution of solubility-limiting solid phases, predictions of the redox state of Pu in the neutral pH range under reducing conditions, and to estimate solubilities of radioactive elements in cementitious conditions. (author)

  15. Automated Oracle database testing

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Ensuring database stability and steady performance in the modern world of agile computing is a major challenge. Various changes happening at any level of the computing infrastructure: OS parameters & packages, kernel versions, database parameters & patches, or even schema changes, all can potentially harm production services. This presentation shows how an automatic and regular testing of Oracle databases can be achieved in such agile environment.

  16. Individual values, learning routines and academic procrastination.

    Science.gov (United States)

    Dietz, Franziska; Hofer, Manfred; Fries, Stefan

    2007-12-01

    Academic procrastination, the tendency to postpone learning activities, is regarded as a consequence of postmodern values that are prominent in post-industrialized societies. When students strive for leisure goals and have no structured routines for academic tasks, delaying strenuous learning activities becomes probable. The model tested in this study posits that postmodern value orientations are positively related to procrastination and to a lack of daily routines concerning the performance of academic activities. In contrast, modern values are negatively related to procrastination and positively to learning routines. Academic procrastination, in-turn, should be associated with the tendency to prefer leisure activities to schoolwork in case of conflicts between these two life domains. Seven hundred and four students from 6th and 8th grade with a mean age of 13.5 years participated in the study. The sample included students from all tracks of the German educational system. Students completed a questionnaire containing two value prototypes as well as scales on learning routines and procrastination. Decisions in motivational conflicts were measured using two vignettes. Results from structural equation modelling supported the proposed model for the whole sample as well as for each school track. A planned course of the day can prevent procrastination and foster decisions for academic tasks in case of conflicts. Students' learning takes place within a societal context and reflects the values held in the respective culture.

  17. Learning and remembering strategies of novice and advanced jazz dancers for skill level appropriate dance routines.

    Science.gov (United States)

    Poon, P P; Rodgers, W M

    2000-06-01

    This study examined the influence of the challenge level of the to-be-learned stimulus on learning strategies in novice and advanced dancers. In Study 1, skill-level-appropriate dance routines were developed for novice and advanced jazz dancers. In Study 2, 8 novice and 9 advanced female jazz dancers attempted to learn and remember the two routines in a mixed-model factorial design, with one between-participants factor: skill level (novice or advanced), and two within-participants factors: routine (easy or difficult) and performance (immediate or delayed). Participants were interviewed regarding the strategies used to learn and remember the routines. Results indicated that advanced performers used atypical learning strategies for insufficiently challenging stimuli, which may reflect characteristics of the stimuli rather than the performer. The qualitative data indicate a clear preference of novice and advanced performers for spatial compatibility of stimuli and response.

  18. A Systematic Review of Coding Systems Used in Pharmacoepidemiology and Database Research.

    Science.gov (United States)

    Chen, Yong; Zivkovic, Marko; Wang, Tongtong; Su, Su; Lee, Jianyi; Bortnichak, Edward A

    2018-02-01

    systems were used in Europe (59%) and North America (57%). 34% of the reviewed coding systems were utilized in at least 1 of the 16 pharmacoepidemiology databases of interest evaluated. 21% of coding systems had studies evaluating the validity and consistency of their use in research within pharmacoepidemiology databases of interest. The most prevalent validation method was comparison with a review of patient charts, case notes or medical records (64% of reviewed validation studies). The reported performance measures in the reviewed studies varied across a large range of values (PPV 0-100%, NPV 6-100%, sensitivity 0-100%, specificity 23-100% and accuracy 16-100%) and were dependent on many factors including coding system(s), therapeutic area, pharmacoepidemiology database, and outcome. Coding systems vary by type of information captured, clinical setting, and pharmacoepidemiology database and region of use. Of the 57 reviewed coding systems, few are routinely and widely applied in pharmacoepidemiology database research. Indication and outcome dependent heterogeneity in coding system performance suggest that accurate definitions and algorithms for capturing specific exposures and outcomes within large healthcare datasets should be developed on a case-by-case basis and in consultation with clinical experts. Schattauer GmbH.

  19. Operation and Performance of the ATLAS Muon Spectrometer Databases during 2011-12 Data Taking

    CERN Document Server

    Verducci, Monica

    2014-01-01

    The size and complexity of the ATLAS experiment at the Large Hadron Collider, including its Muon Spectrometer, raise unprecedented challenges in terms of operation, software model and data management. One of the challenging tasks is the storage of non-event data produced by the calibration and alignment stream processes and by online and offline monitoring frameworks, which can unveil problems in the detector hardware and in the data processing chain. During 2011 and 2012 data taking, the software model and data processing enabled high quality track resolution as a better understanding of the detector performance was developed using the most reliable detector simulation and reconstruction. This work summarises the various aspects of the Muon Spectrometer Databases, with particular emphasis given to the Conditions Databases and their usage in the data analysis.

  20. Fire test database

    International Nuclear Information System (INIS)

    Lee, J.A.

    1989-01-01

    This paper describes a project recently completed for EPRI by Impell. The purpose of the project was to develop a reference database of fire tests performed on non-typical fire-rated assemblies. The database is designed for use by utility fire protection engineers to locate test reports for power plant fire-rated assemblies. As utilities prepare to respond to Information Notice 88-04, the database will identify utilities, vendors or manufacturers who have specific fire test data. The database contains fire test report summaries for 729 tested configurations. For each summary, a contact is identified from whom a copy of the complete fire test report can be obtained. Five types of configurations are included: doors, dampers, seals, wraps and walls. The database is computerized, with one version for IBM and one for Mac. Each database is accessed through user-friendly software which allows adding, deleting, browsing, etc. through the database. There are five major database files, one for each of the five types of tested configurations. The contents of each provide significant information regarding the test method and the physical attributes of the tested configuration. 3 figs

  1. Sodium Chloride Supplementation Is Not Routinely Performed in the Majority of German and Austrian Infants with Classic Salt-Wasting Congenital Adrenal Hyperplasia and Has No Effect on Linear Growth and Hydrocortisone or Fludrocortisone Dose.

    Science.gov (United States)

    Bonfig, Walter; Roehl, Friedhelm; Riedl, Stefan; Brämswig, Jürgen; Richter-Unruh, Annette; Fricke-Otto, Susanne; Hübner, Angela; Bettendorf, Markus; Schönau, Eckhard; Dörr, Helmut; Holl, Reinhard W; Mohnike, Klaus

    2018-01-01

    Sodium chloride supplementation in salt-wasting congenital adrenal hyperplasia (CAH) is generally recommended in infants, but its implementation in routine care is very heterogeneous. To evaluate oral sodium chloride supplementation, growth, and hydrocortisone and fludrocortisone dose in 311 infants with salt-wasting CAH due to 21-hydroxylase deficiency from the AQUAPE CAH database. Of 358 patients with classic CAH born between 1999 and 2015, 311 patients had salt-wasting CAH (133 females, 178 males). Of these, 86 patients (27.7%) received oral sodium chloride supplementation in a mean dose of 0.9 ± 1.4 mmol/kg/day (excluding nutritional sodium content) during the first year of life. 225 patients (72.3%) were not treated with sodium chloride. The percentage of sodium chloride-supplemented patients rose from 15.2% in children born 1999-2004 to 37.5% in children born 2011-2015. Sodium chloride-supplemented and -unsupplemented infants did not differ significantly in hydrocortisone and fludrocortisone dose, target height-corrected height-SDS, and BMI-SDS during the first 2 years of life. In the AQUAPE CAH database, approximately one-third of infants with salt-wasting CAH receive sodium chloride supplementation. Sodium chloride supplementation has been performed more frequently in recent years. However, salt supplementation had no influence on growth, daily fludrocortisone and hydrocortisone dose, or frequency of adrenal crises. © 2017 S. Karger AG, Basel.

  2. Data format translation routines

    International Nuclear Information System (INIS)

    Burris, R.D.

    1981-02-01

    To enable the effective connection of several dissimilar computers into a network, modification of the data being passed from one computer to another may become necessary. This document describes a package of routines which permit the translation of data in PDP-8 formats to PDP-11 or DECsystem-10 formats or from PDP-11 format to DECsystem-10 format. Additional routines are described which permit the effective use of the translation routines in the environment of the Fusion Energy Division (FED) network and the Elmo Bumpy Torus (EBT) data base
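
    As an illustration of the kind of low-level conversion such routines perform (a sketch only; the actual package handled richer formats), sign-extending a 12-bit PDP-8 word into a 16-bit integer looks like this:

        # Interpret a 12-bit two's-complement PDP-8 word as a 16-bit integer.
        def pdp8_to_pdp11(word12: int) -> int:
            word12 &= 0o7777             # keep 12 bits (octal mask, PDP style)
            if word12 & 0o4000:          # sign bit set -> negative value
                return word12 - 0o10000  # subtract 2**12 to sign-extend
            return word12

        assert pdp8_to_pdp11(0o0005) == 5
        assert pdp8_to_pdp11(0o7777) == -1  # all-ones is -1 in two's complement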

  3. Searching mixed DNA profiles directly against profile databases.

    Science.gov (United States)

    Bright, Jo-Anne; Taylor, Duncan; Curran, James; Buckleton, John

    2014-03-01

    DNA databases have revolutionised forensic science. They are a powerful investigative tool as they have the potential to identify persons of interest in criminal investigations. Routinely, a DNA profile generated from a crime sample could only be searched for in a database of individuals if the stain was from a single contributor (single source) or if a contributor could unambiguously be determined from a mixed DNA profile. This meant that a significant number of samples were unsuitable for database searching. The advent of continuous methods for the interpretation of DNA profiles offers an advanced way to draw inferential power from the considerable investment made in DNA databases. Using these methods, each profile on the database may be considered a possible contributor to a mixture and a likelihood ratio (LR) can be formed. Those profiles which produce a sufficiently large LR can serve as an investigative lead. In this paper, empirical studies are described to determine what constitutes a large LR. We investigate the effect on a database search of complex mixed DNA profiles with contributors in equal proportions with dropout as a consideration, and also the effect of an incorrect assignment of the number of contributors to a profile. In addition, we give, as a demonstration of the method, the results using two crime samples that were previously unsuitable for database comparison. We show that effective management of the selection of samples for searching and the interpretation of the output can be highly informative. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
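
    Schematically, the search strategy amounts to scoring every database profile as a hypothetical contributor and keeping those whose LR clears an empirically chosen threshold. In this sketch, compute_lr is a placeholder for a continuous-model calculation (e.g. probabilistic genotyping software) and the threshold is illustrative:

        # Screen a profile database against a mixed crime-scene profile.
        import math

        LR_THRESHOLD = 1e4  # illustrative cut-off; set from empirical studies

        def screen(mixture, database, compute_lr):
            """Return (profile_id, log10 LR) for profiles worth pursuing."""
            leads = []
            for profile_id, genotype in database.items():
                lr = compute_lr(mixture, genotype)  # Hp: profile contributed
                if lr >= LR_THRESHOLD:
                    leads.append((profile_id, math.log10(lr)))
            return sorted(leads, key=lambda t: -t[1])  # strongest leads first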

  4. Quality standards for DNA sequence variation databases to improve clinical management under development in Australia

    Directory of Open Access Journals (Sweden)

    B. Bennetts

    2014-09-01

    Full Text Available Despite the routine nature of comparing sequence variations identified during clinical testing to database records, few databases meet quality requirements for clinical diagnostics. To address this issue, The Royal College of Pathologists of Australasia (RCPA), in collaboration with the Human Genetics Society of Australasia (HGSA) and the Human Variome Project (HVP), is developing standards for DNA sequence variation databases intended for use in the Australian clinical environment. The outputs of this project will be promoted to other health systems and accreditation bodies by the Human Variome Project to support the development of similar frameworks in other jurisdictions.

  5. Development of environment radiation database management system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun [Daeduk College, Taejon (Korea, Republic of)

    1999-03-15

    In this development, we constructed a database for efficient processing and operation of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on WWW service policies on the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  6. Development of environment radiation database management system

    International Nuclear Information System (INIS)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun

    1999-03-01

    In this development, we constructed a database for efficient processing and operation of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on WWW service policies on the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  7. Supply Chain Initiatives Database

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-11-01

    The Supply Chain Initiatives Database (SCID) presents innovative approaches to engaging industrial suppliers in efforts to save energy, increase productivity and improve environmental performance. This comprehensive and freely accessible database was developed by the Institute for Industrial Productivity (IIP). IIP acknowledges Ecofys for their valuable contributions. The database contains case studies searchable according to the types of activities buyers are undertaking to motivate suppliers, target sector, organization leading the initiative, and program or partnership linkages.

  8. Performance Evaluation of an Automated ELISA System for Alzheimer's Disease Detection in Clinical Routine.

    Science.gov (United States)

    Chiasserini, Davide; Biscetti, Leonardo; Farotti, Lucia; Eusebi, Paolo; Salvadori, Nicola; Lisetti, Viviana; Baschieri, Francesca; Chipi, Elena; Frattini, Giulia; Stoops, Erik; Vanderstichele, Hugo; Calabresi, Paolo; Parnetti, Lucilla

    2016-07-22

    The variability of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers undermines their full-fledged introduction into routine diagnostics and clinical trials. Automation may help to increase precision and decrease operator errors, eventually improving the diagnostic performance. Here we evaluated three new CSF immunoassays, EUROIMMUN™ amyloid-β 1-40 (Aβ1-40), amyloid-β 1-42 (Aβ1-42), and total tau (t-tau), in combination with automated analysis of the samples. The CSF biomarkers were measured in a cohort consisting of AD patients (n = 28), mild cognitive impairment (MCI, n = 77), and neurological controls (OND, n = 35). MCI patients were evaluated yearly and cognitive functions were assessed by Mini-Mental State Examination. The patients clinically diagnosed with AD and MCI were classified according to the CSF biomarker profile following NIA-AA criteria and the Erlangen score. Technical evaluation of the immunoassays was performed together with the calculation of their diagnostic performance. Furthermore, the results for EUROIMMUN Aβ1-42 and t-tau were compared to standard immunoassay methods (INNOTEST™). EUROIMMUN assays for Aβ1-42 and t-tau correlated with INNOTEST (r = 0.83). The Aβ1-42/Aβ1-40 ratio measured with EUROIMMUN was the best parameter for AD detection and improved the diagnostic accuracy of Aβ1-42 (area under the curve = 0.93). In MCI patients, the Aβ1-42/Aβ1-40 ratio was associated with cognitive decline and clinical progression to AD. The diagnostic performance of the EUROIMMUN assays with automation is comparable to other currently used methods. The variability of the method and the value of the Aβ1-42/Aβ1-40 ratio in AD diagnosis need to be validated in large multi-center studies.

  9. The Balancing Act: Student Classroom Placement Routines and the Uses of Data in Elementary Schools

    Science.gov (United States)

    Park, Vicki; St. John, Elise; Datnow, Amanda; Choi, Bailey

    2017-01-01

    Purpose: The purpose of this paper is to examine how data are used in classroom placement routines. The authors explore educators' assumptions about the purposes of the classroom placement routine, detailing the ostensive (i.e. structure and template) and performative aspects of the routine itself, and the implications of data use for equity and…

  10. MTF Database: A Repository of Students' Academic Performance Measurements for the Development of Techniques for Evaluating Team Functioning

    Science.gov (United States)

    Hsiung, Chin-Min; Zheng, Xiang-Xiang

    2015-01-01

    The Measurements for Team Functioning (MTF) database contains a series of student academic performance measurements obtained at a national university in Taiwan. The measurements are acquired from unit tests and homework tests performed during a core mechanical engineering course, and provide an objective means of assessing the functioning of…

  11. Pre-procedural scout radiographs are unnecessary for routine pediatric fluoroscopic examinations

    Energy Technology Data Exchange (ETDEWEB)

    Creeden, Sean G.; Rao, Anil G.; Eklund, Meryle J.; Hill, Jeanne G.; Thacker, Paul G. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States)

    2017-03-15

    Although practice patterns vary, scout radiographs are often routinely performed with pediatric fluoroscopic studies. However, few studies have evaluated their utility in routine pediatric fluoroscopy. To evaluate the value of scout abdomen radiographs in routine barium or water-soluble enema, upper gastrointestinal (GI) series, and voiding cystourethrogram pediatric fluoroscopic procedures, we retrospectively evaluated 723 such fluoroscopic studies (368 males and 355 females) performed at our institution. We assessed patient history and demographics, clinical indication for the examination, prior imaging findings and impressions, scout radiograph findings, additional findings provided by the scout radiograph that were previously unknown, and whether the scout radiograph contributed any findings that significantly changed management. Of the 723 studies, 700 (96.8%) had a preliminary scout radiograph and 23 (3.2%) had a same-day radiograph substituted as a scout radiograph. Preliminary scout abdomen radiographs/same-day radiographs showed no new significant findings in 719 (99.4%) studies. New but clinically insignificant findings were seen in 4 (0.6%) studies and included umbilical hernia, inguinal hernia and hip dysplasia. No findings on the scout radiographs would either have altered the examination performed or changed management. Pre-procedural scout abdomen radiographs are unnecessary in routine barium and water-soluble enema, upper GI series, and voiding cystourethrogram pediatric fluoroscopic procedures and can be substituted with a spot fluoroscopic last-image hold. (orig.)

  12. Specialist Bibliographic Databases.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Trukhachev, Vladimir I; Kostyukova, Elena I; Gerasimov, Alexey N; Kitas, George D

    2016-05-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarity with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is essential for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.

  13. Specialist Bibliographic Databases

    Science.gov (United States)

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarity with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is essential for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls. PMID:27134485

  14. [Routine fluoroscopic investigations after primary bariatric surgery].

    Science.gov (United States)

    Gärtner, D; Ernst, A; Fedtke, K; Jenkner, J; Schöttler, A; Reimer, P; Blüher, M; Schön, M R

    2016-03-01

    Staple line and anastomotic leakages are life-threatening complications after bariatric surgery. Upper gastrointestinal (GI) tract X-ray examination with oral administration of a water-soluble contrast agent can be used to detect leaks. The aim of this study was to evaluate the impact of routine upper GI tract fluoroscopy after primary bariatric surgery. Between January 2009 and December 2014, a total of 658 bariatric interventions were carried out, of which 442 were primary bariatric operations. Included in this single-center study were 307 sleeve gastrectomies and 135 Roux-en-Y gastric bypasses. Up to December 2012, upper GI tract fluoroscopy was performed routinely between the first and third postoperative days and the detection of leakages was evaluated. In the investigation period, 8 leakages (2.6%) occurred after sleeve gastrectomy, along with 1 anastomotic leakage in gastrojejunostomy and 1 in jejunojejunostomy after Roux-en-Y gastric bypass. All patients developed clinical symptoms, such as abdominal pain, tachycardia or fever. In one case the leakage was detected by upper GI fluoroscopy; in nine cases radiological findings were unremarkable. No leakages were detected in asymptomatic patients. Routine upper GI fluoroscopy is not recommended for uneventful postoperative courses after primary bariatric surgery.

  15. Database Replication Prototype

    OpenAIRE

    Vandewall, R.

    2000-01-01

    This report describes the design of a Replication Framework that facilitates the implementation and comparison of database replication techniques. Furthermore, it discusses the implementation of a Database Replication Prototype and compares the performance measurements of two replication techniques based on the Atomic Broadcast communication primitive: pessimistic active replication and optimistic active replication. The main contributions of this report can be split into four parts…

  16. Developing Routines in Large Inter-organisational Projects: A Case Study of an Infrastructure Megaproject

    Directory of Open Access Journals (Sweden)

    Therese Eriksson

    2015-08-01

    Full Text Available General management research has increasingly recognised the significance of routines in organisational performance. Among organisational tasks, megaprojects depend more on routines selected and created within the project than standard, small-scale projects do, owing largely to their size, duration, and uniqueness. Within this context, the present paper investigates how project routines were established and developed during the early design phase of an inter-organisational megaproject. A case study of a large public infrastructure project was conducted, in which data were collected through observations, semi-structured interviews, and project document studies over the course of three years. The analysis revealed that the client exerted the greatest impact on the choice of routines and that the temporary nature of tasks limited efforts to fine-tune them. Changes in routines were primarily reactive to new knowledge concerning project needs. The findings suggest that meta-routines to consciously review routines should be used to a greater extent and designed to capture supplier experiences as well.

  17. Diagnostic accuracy of routine blood examinations and CSF lactate level for post-neurosurgical bacterial meningitis.

    Science.gov (United States)

    Zhang, Yang; Xiao, Xiong; Zhang, Junting; Gao, Zhixian; Ji, Nan; Zhang, Liwei

    2017-06-01

    To evaluate the diagnostic accuracy of routine blood examinations and Cerebrospinal Fluid (CSF) lactate level for Post-neurosurgical Bacterial Meningitis (PBM) in a large sample of post-neurosurgical patients. The diagnostic accuracies of routine blood examinations and CSF lactate level to distinguish between post-neurosurgical aseptic meningitis (PAM) and PBM were evaluated with the values of the Area Under the Curve of the Receiver Operating Characteristic (AUC-ROC) by retrospectively analyzing the datasets of post-neurosurgical patients in the clinical information databases. The diagnostic accuracy of routine blood examinations was relatively low, whereas CSF lactate level achieved rather high diagnostic accuracy (AUC-ROC = 0.891; 95% CI, 0.852-0.922). The variables of patient age, operation duration, surgical diagnosis and postoperative days (the interval in days between the neurosurgery and the examinations) were shown to affect the diagnostic accuracy of these examinations. These variables were integrated with routine blood examinations and CSF lactate level by Fisher discriminant analysis to improve their diagnostic accuracy. As a result, the diagnostic accuracy of blood examinations and CSF lactate level was significantly improved, with AUC-ROC values of 0.760 (95% CI, 0.737-0.782) and 0.921 (95% CI, 0.887-0.948), respectively. The PBM diagnostic accuracy of routine blood examinations was relatively low, whereas the accuracy of CSF lactate level was high. Some variables that are involved in the incidence of PBM can also affect the diagnostic accuracy for PBM. Taking the effects of these variables into account significantly improves the diagnostic accuracies of routine blood examinations and CSF lactate level. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
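
    As a worked illustration of the AUC-ROC figures quoted above (not the study's code or data), the following computes an AUC from binary diagnoses and a marker value such as CSF lactate using scikit-learn; all numbers are invented.

```python
from sklearn.metrics import roc_auc_score

labels = [1, 1, 0, 0, 1, 0, 0, 1]   # hypothetical: 1 = PBM, 0 = aseptic (PAM)
csf_lactate = [6.1, 5.4, 2.3, 3.0, 7.2, 2.8, 3.5, 4.9]  # mmol/L, hypothetical

# AUC-ROC: probability that a randomly chosen PBM case has a higher
# lactate value than a randomly chosen aseptic case.
print(roc_auc_score(labels, csf_lactate))
```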

  18. Developing a virtual reality application for training Nuclear Power Plant operators: Setting up a database containing dose rates in the refuelling plant

    International Nuclear Information System (INIS)

    Rodenas, J.; Zarza, I.; Burgos, M. C.; Felipe, A.; Sanchez-Mayoral, M. L.

    2004-01-01

    Operators in Nuclear Power Plants can receive high doses during refuelling operations. A training programme for simulating refuelling operations will be useful in reducing the doses received by workers as well as minimising operation time. With this goal in mind, a virtual reality application is developed within the framework of the CIPRES project. The application requires doses, both instantaneous and accumulated, to be displayed at all times during operator training. Therefore, it is necessary to set up a database containing dose rates at every point in the refuelling plant. This database is based on radiological protection surveillance data measured in the plant during refuelling operations. Some interpolation routines have been used to estimate doses throughout the refuelling plant. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to set up the dose database for the virtual reality application are presented and analysed. (authors)
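
    The paper does not state which interpolation routines were used, so the following is only a plausible sketch: an inverse-distance-weighted estimate of the dose rate at an arbitrary plant location from surveyed measurement points. Function and variable names are illustrative.

```python
import numpy as np

def idw_dose_rate(point, survey_points, survey_doses, power=2.0):
    """Estimate the dose rate at `point` from surveyed (x, y, z) locations.

    survey_points: array of shape (n, 3); survey_doses: n measured dose rates.
    """
    point = np.asarray(point, dtype=float)
    survey_points = np.asarray(survey_points, dtype=float)
    survey_doses = np.asarray(survey_doses, dtype=float)
    distances = np.linalg.norm(survey_points - point, axis=1)
    if np.any(distances == 0.0):            # query coincides with a measurement
        return float(survey_doses[np.argmin(distances)])
    weights = 1.0 / distances**power        # closer measurements weigh more
    return float(np.dot(weights, survey_doses) / weights.sum())
```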

  19. Summary of Data and Steps for Processing the 1997-2001 SRS Meteorological Database

    International Nuclear Information System (INIS)

    Weber, A.H.

    2003-01-01

    Every five years since the mid-1970s, DOE has requested an update on the meteorological conditions at SRS in order to provide dose calculations for accident or routine release scenarios for onsite and offsite populations. The meteorological database includes wind speed, wind direction, temperature, dew point, and horizontal and vertical turbulence intensities. The two most recent databases prior to the current one were completed for the time period 1992-96 (Weber, 1998) and for 1987-91 (Parker et al., 1992). The current database covers the period 1997-2001. The advantage of updating the database every five years is that meteorological observations are steadily growing more complete and less subject to errors with the implementation of better electronic data archiving software and hardware, and improved data quality assurance procedures. Also, changes in the region's climate may be manifest.

  20. Treatment performances of French constructed wetlands: results from a database collected over the last 30 years.

    Science.gov (United States)

    Morvannou, A; Forquet, N; Michel, S; Troesch, S; Molle, P

    2015-01-01

    Approximately 3,500 constructed wetlands (CWs) provide raw wastewater treatment in France for small communities. Built during the past 30 years, most consist of two vertical flow constructed wetlands (VFCWs) in series (stages). Many configurations exist, with systems associated with horizontal flow filters or waste stabilization ponds, vertical flow with recirculation, partially saturated systems, etc. A database analyzed 10 years earlier on the classical French system summarized the global performance data. This paper provides a similar analysis of performance data from 415 full-scale two-stage VFCWs from an improved database expanded with monitoring data available from Irstea and the French technical department. Trends presented in the first study are confirmed, exhibiting high chemical oxygen demand (COD), total suspended solids (TSS) and total Kjeldahl nitrogen (TKN) removal rates (87%, 93% and 84%, respectively). Typical concentrations at the second-stage outlet are 74 mgCOD L(-1), 17 mgTSS L(-1) and 11 mgTKN L(-1). Pollutant removal performances are summarized in relation to the loads applied at the first treatment stage. While COD and TSS removal rates remain stable over the range of applied loads, the spread of TKN removal rates increases as applied loads increase.

  1. OCA Oracle Database 11g database administration I : a real-world certification guide

    CERN Document Server

    Ries, Steve

    2013-01-01

    Developed as a practical book, "Oracle Database 11g Administration I Certification Guide" will show you all you need to know to effectively excel at being an Oracle DBA, for both examinations and the real world. This book is for anyone who needs the essential skills to become an Oracle DBA, pass the Oracle Database Administration I exam, and use those skills in the real world to manage secure, high-performance, and highly available Oracle databases.

  2. On the use of databases about research performance

    NARCIS (Netherlands)

    Rodela, Romina

    2016-01-01

    The accuracy of interdisciplinarity measurements depends on how well the data is used for this purpose and whether it can meaningfully inform about work that crosses disciplinary domains. At present, there are no ad hoc databases compiling information only and exclusively about interdisciplinary…

  3. The potential of high resolution melting analysis (HRMA) to streamline, facilitate and enrich routine diagnostics in medical microbiology.

    Science.gov (United States)

    Ruskova, Lenka; Raclavsky, Vladislav

    2011-09-01

    Routine medical microbiology diagnostics relies on conventional cultivation followed by phenotypic techniques for identification of pathogenic bacteria and fungi. This is not only due to tradition and economy but also because it provides the pure culture needed for antibiotic susceptibility testing. This review focuses on the potential of High Resolution Melting Analysis (HRMA) of double-stranded DNA for future routine medical microbiology. We searched the MEDLINE database for publications showing the advantages of HRMA in routine medical microbiology, in particular for identification, strain typing and further characterization of pathogenic bacteria and fungi. The results show increasing numbers of newly developed and more tailor-made assays in this field. For microbiologists unfamiliar with technical aspects of HRMA, we also provide insight into the technique from the perspective of microbial characterization. We anticipate that the routine availability of HRMA in medical microbiology laboratories will provide a strong stimulus to this field, as already envisioned by the growing number of medical microbiology applications published recently. The speed, power, convenience and cost effectiveness of this technology virtually predestine it to advance genetic characterization of microbes and to streamline, facilitate and enrich diagnostics in routine medical microbiology without interfering with the proven advantages of conventional cultivation.

  4. Increased Exposure to Rigid Routines Can Lead to Increased Challenging Behavior Following Changes to Those Routines

    Science.gov (United States)

    Bull, Leah E.; Oliver, Chris; Callaghan, Eleanor; Woodcock, Kate A.

    2015-01-01

    Several neurodevelopmental disorders are associated with preference for routine and challenging behavior following changes to routines. We examine individuals with Prader-Willi syndrome, who show elevated levels of this behavior, to better understand how previous experience of a routine can affect challenging behavior elicited by disruption to…

  5. Conceptual considerations for CBM databases

    Energy Technology Data Exchange (ETDEWEB)

    Akishina, E. P.; Aleksandrov, E. I.; Aleksandrov, I. N.; Filozova, I. A.; Ivanov, V. V.; Zrelov, P. V. [Lab. of Information Technologies, JINR, Dubna (Russian Federation); Friese, V.; Mueller, W. [GSI, Darmstadt (Germany)

    2014-07-01

    We consider a concept of databases for the CBM experiment. For this purpose, an analysis of the databases for large experiments at the LHC at CERN has been performed. Special features of various DBMS utilized in physics experiments, including relational and object-oriented DBMS as the most applicable ones for the tasks of these experiments, were analyzed. A set of databases for the CBM experiment, DBMS for their development, as well as use cases for the considered databases are suggested.

  6. Conceptual considerations for CBM databases

    International Nuclear Information System (INIS)

    Akishina, E.P.; Aleksandrov, E.I.; Aleksandrov, I.N.; Filozova, I.A.; Ivanov, V.V.; Zrelov, P.V.; Friese, V.; Mueller, W.

    2014-01-01

    We consider a concept of databases for the CBM experiment. For this purpose, an analysis of the databases for large experiments at the LHC at CERN has been performed. Special features of various DBMS utilized in physics experiments, including relational and object-oriented DBMS as the most applicable ones for the tasks of these experiments, were analyzed. A set of databases for the CBM experiment, DBMS for their development, as well as use cases for the considered databases are suggested.

  7. Performance analysis of a real-time database with optimistic concurrency control

    NARCIS (Netherlands)

    Sassen, S.A.E.; Wal, van der J.

    1997-01-01

    For a real-time shared-memory database with Optimistic Concurrency Control (OCC), an approximation for the transaction response-time distribution, and thus for the deadline miss probability, is obtained. Transactions arrive at the database according to a Poisson process. There is a limited number of…
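
    As background for the abstract above (a sketch of the general technique, not the paper's queueing model), this is the core of optimistic concurrency control: transactions execute without locks and are validated at commit time against the versions of the items they read. All names are illustrative.

```python
class OCCStore:
    """Toy versioned key-value store with optimistic commit validation."""

    def __init__(self):
        self.data = {}                  # key -> (value, version)

    def read(self, key):
        return self.data.get(key, (None, 0))

    def commit(self, read_set, write_set):
        """read_set: {key: version seen}; write_set: {key: new value}.

        Returns False (abort, caller retries) on any read-write conflict.
        """
        for key, seen_version in read_set.items():
            _, current_version = self.read(key)
            if current_version != seen_version:
                return False            # someone committed over our read
        for key, value in write_set.items():
            _, version = self.read(key)
            self.data[key] = (value, version + 1)
        return True
```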

  8. Resolving Questioned Paternity Issues Using a Philippine Genetic Database

    Directory of Open Access Journals (Sweden)

    Maria Corazon De Ungria

    2002-06-01

    Full Text Available The utility of a Philippine genetic database consisting of seven Short Tandem Repeat (STR) markers for testing of ten questioned paternity cases was investigated. The markers used were HUMvWA, HUMTH01, HUMCSF1PO, HUMFOLP23, D8S306, HUMFES/FPS, and HUMF13A01. These markers had a combined Power of Paternity Exclusion of 99.17%. Due to the gravity of some cases handled in the laboratory, routine procedures must be assessed to determine the capacity of the analysis to exclude a non-father or predict paternity. Clients showed a preference for testing only father and child to lower costs and reduce conflicts, particularly when the mother objects to the conduct of DNA tests, or when she is deceased or cannot be located. The Probability of Paternity was calculated with and without the mother's profile in each of the cases. In all instances, results were more informative when the mother's DNA profile was included. Moreover, variations in the allelic distribution of five STR markers among eight Caucasian, one African-American, and two Amerindian (Argentina) populations resulted in significant differences in Probability of Paternity estimates compared to those calculated using the Philippine database. Based on the results of the present study, it is recommended that tests on alleged father-child samples be performed to screen for at least two mismatches. In the absence of these mismatches, further analysis that includes the mother's DNA profile is recommended. Moreover, it is recommended that a Philippine genetic database be used for DNA-based paternity testing in the Philippines.
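
    The combined Power of Paternity Exclusion quoted above follows the standard combination formula CPE = 1 − Π(1 − PE_i) over the per-locus powers of exclusion. A minimal sketch with placeholder per-locus values (not the study's figures) that happen to land near the reported 99% level:

```python
def combined_power_of_exclusion(per_locus_pe):
    """CPE = 1 - product over loci of (1 - PE_i)."""
    remaining = 1.0
    for pe in per_locus_pe:
        remaining *= (1.0 - pe)     # chance this locus fails to exclude
    return 1.0 - remaining

# Seven hypothetical per-locus powers of exclusion for an STR panel.
print(combined_power_of_exclusion([0.6, 0.55, 0.5, 0.45, 0.5, 0.4, 0.35]))
# -> ~0.990, i.e. roughly the 99%-level combined power reported above
```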

  9. Database system for management of health physics and industrial hygiene records

    International Nuclear Information System (INIS)

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene records. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection information.

  10. The barriers and facilitators to routine outcome measurement by allied health professionals in practice: a systematic review

    Directory of Open Access Journals (Sweden)

    Duncan Edward AS

    2012-05-01

    Full Text Available Abstract Background Allied Health Professionals today are required, more than ever before, to demonstrate their impact. However, despite at least 20 years of expectation, many services fail to deliver routine outcome measurement in practice. This systematic review investigates what helps and hinders routine outcome measurement of allied health professionals' practice. Methods A systematic review protocol was developed comprising: a defined search strategy for PsycINFO, MEDLINE and CINAHL databases, and inclusion criteria and systematic procedures for data extraction and quality appraisal. Studies were included if they were published in English and investigated facilitators and/or barriers to routine outcome measurement by allied health professionals. No restrictions were placed on publication type, design, country, or year of publication. Reference lists of included publications were searched to identify additional papers. Descriptive methods were used to synthesise the findings. Results 960 papers were retrieved; 15 met the inclusion criteria. Professional groups represented were Physiotherapy, Occupational Therapy, and Speech and Language Therapy. The included literature varied in quality and design. Facilitators and barriers to routine outcome measurement exist at individual, managerial and organisational levels. Key factors affecting professionals' use of routine outcome measurement include: professionals' level of knowledge and confidence about using outcome measures, and the degree of organisational and peer support professionals received with a view to promoting their work in practice. Conclusions Whilst the importance of routinely measuring outcomes within the allied health professions is well recognised, it has largely failed to be delivered in practice. Factors that influence clinicians' ability and desire to undertake routine outcome measurement are bi-directional: they can act as either facilitators or barriers. Routine outcome…

  11. Danish Urogynaecological Database

    DEFF Research Database (Denmark)

    Hansen, Ulla Darling; Gradel, Kim Oren; Larsen, Michael Due

    2016-01-01

    The Danish Urogynaecological Database is established in order to ensure high quality of treatment for patients undergoing urogynecological surgery. The database contains details of all women in Denmark undergoing incontinence surgery or pelvic organ prolapse surgery, amounting to ~5,200 procedures, including complications if relevant, implants used if relevant, and 3-6-month postoperative recording of symptoms, if any. A set of clinical quality indicators is maintained by the steering committee for the database and is published in an annual report, which also contains extensive descriptive statistics. The database has a completeness of over 90% of all urogynecological surgeries performed in Denmark. Some of the main variables have been validated using medical records as gold standard; the positive predictive value was above 90%. The data are used as a quality monitoring tool by the hospitals and in a number…

  12. Audit of the autoantibody test, EarlyCDT®-lung, in 1600 patients: an evaluation of its performance in routine clinical practice.

    Science.gov (United States)

    Jett, James R; Peek, Laura J; Fredericks, Lynn; Jewell, William; Pingleton, William W; Robertson, John F R

    2014-01-01

    EarlyCDT®-Lung may enhance detection of early stage lung cancer by aiding physicians in assessing high-risk patients through measurement of biological markers (i.e., autoantibodies). The test's performance characteristics in routine clinical practice were evaluated by auditing clinical outcomes of 1613 US patients deemed at high risk for lung cancer by their physician, who ordered the EarlyCDT-Lung test for their patient. Clinical outcomes for all 1613 patients who provided HIPAA authorization are reported. Clinical data were collected from each patient's treating physician. Pathology reports, when available, were reviewed for diagnostic classification. Staging was assessed on histology, otherwise on imaging. Six-month follow-up for the positives/negatives was 99%/93%. Sixty-one patients (4%) were identified with lung cancer, 25 of whom tested positive by EarlyCDT-Lung (sensitivity = 41%). A positive EarlyCDT-Lung test on the current panel was associated with a 5.4-fold increase in lung cancer incidence versus a negative test. Importantly, 57% (8/14) of non-small cell lung cancers detected as positive (where stage was known) were stage I or II. EarlyCDT-Lung has been extensively tested and validated in case-control settings and has now been shown in this audit to perform in routine clinical practice as predicted. EarlyCDT-Lung may be a complementary tool to CT for detection of early lung cancer. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Psychophysical studies of the performance of an image database retrieval system

    Science.gov (United States)

    Papathomas, Thomas V.; Conway, Tiffany E.; Cox, Ingemar J.; Ghosn, Joumana; Miller, Matt L.; Minka, Thomas P.; Yianilos, Peter N.

    1998-07-01

    We describe psychophysical experiments conducted to study PicHunter, a content-based image retrieval (CBIR) system. Experiment 1 studies the importance of using (1) semantic information, (2) memory of earlier input, and (3) relative, rather than absolute, judgements of image similarity. The target testing paradigm is used, in which a user must search for an image identical to a target. We find that the best performance comes from a version of PicHunter that uses only semantic cues, with memory and relative similarity judgements. Second best is the use of both pictorial and semantic cues, with memory and relative similarity judgements. Most reports of CBIR systems provide only qualitative measures of performance based on how similar retrieved images are to a target. Experiment 2 puts PicHunter into this context with a more rigorous test. We first establish a baseline for our database by measuring the time required to find an image that is similar to a target when the images are presented in random order. Although PicHunter's performance is measurably better than this, the test is weak because even random presentation of images yields reasonably short search times. This casts doubt on the strength of results given in other reports where no baseline is established.

  14. The Danish Fetal Medicine Database

    Directory of Open Access Journals (Sweden)

    Ekelund CK

    2016-10-01

    Full Text Available Charlotte Kvist Ekelund,1 Tine Iskov Kopp,2 Ann Tabor,1 Olav Bjørn Petersen3 1Department of Obstetrics, Center of Fetal Medicine, Rigshospitalet, University of Copenhagen, Copenhagen, Denmark; 2Registry Support Centre (East) – Epidemiology and Biostatistics, Research Centre for Prevention and Health, Glostrup, Denmark; 3Fetal Medicine Unit, Aarhus University Hospital, Aarhus Nord, Denmark. Aim: The aim of this study is to set up a database in order to monitor the detection rates and false-positive rates of first-trimester screening for chromosomal abnormalities and prenatal detection rates of fetal malformations in Denmark. Study population: Pregnant women with a first or second trimester ultrasound scan performed at all public hospitals in Denmark are registered in the database. Main variables/descriptive data: Data on maternal characteristics and ultrasonic and biochemical variables are continuously sent from the fetal medicine units' Astraia databases to the central database via a web service. Information about outcome of pregnancy (miscarriage, termination, live birth, or stillbirth) is received from the National Patient Register and National Birth Register and linked via the Danish unique personal registration number. Furthermore, results of all pre- and postnatal chromosome analyses are sent to the database. Conclusion: It has been possible to establish a fetal medicine database which monitors first-trimester screening for chromosomal abnormalities and second-trimester screening for major fetal malformations with the input from already collected data. The database is valuable to assess the performance at a regional level and to compare Danish performance with international results at a national level. Keywords: prenatal screening, nuchal translucency, fetal malformations, chromosomal abnormalities

  15. Medical databases in studies of drug teratogenicity: methodological issues

    Directory of Open Access Journals (Sweden)

    Vera Ehrenstein

    2010-03-01

    Full Text Available Vera Ehrenstein,1 Henrik T Sørensen,1 Leiv S Bakketeig,1,2 Lars Pedersen1 1Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, Denmark; 2Norwegian Institute of Public Health, Oslo, Norway. Abstract: More than half of all pregnant women take prescription medications, raising concerns about fetal safety. Medical databases routinely collecting data from large populations are potentially valuable resources for cohort studies addressing teratogenicity of drugs. These include electronic medical records, administrative databases, population health registries, and teratogenicity information services. Medical databases allow estimation of prevalences of birth defects with enhanced precision, but systematic error remains a potentially serious problem. In this review, we first provide a brief description of the types of North American and European medical databases suitable for studying teratogenicity of drugs and then discuss the manifestation of systematic errors in teratogenicity studies based on such databases. Selection bias stems primarily from the inability to ascertain all reproductive outcomes. Information bias (misclassification) may be caused by paucity of recorded clinical details or incomplete documentation of medication use. Confounding, particularly confounding by indication, can rarely be ruled out. Bias that either masks teratogenicity or creates a false appearance thereof may have adverse consequences for the health of the child and the mother. Biases should be quantified and their potential impact on the study results should be assessed. Both theory and software are available for such estimation. Provided that methodological problems are understood and effectively handled, computerized medical databases are a valuable source of data for studies of teratogenicity of drugs. Keywords: databases, birth defects, epidemiologic methods, pharmacoepidemiology

  16. LHCb distributed conditions database

    International Nuclear Information System (INIS)

    Clemencic, M

    2008-01-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which are running on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored on LFC (LCG File Catalog) and managed with the interface provided by the LCG developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications are using the Conditions Database framework on a production basis since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions handling functionalities) and the distribution framework itself. Stress tests on the CNAF hosted replica of the Conditions Database have been performed and the results will be summarized here

  17. Routine intraoperative leak testing for sleeve gastrectomy: is the leak test full of hot air?

    Science.gov (United States)

    Bingham, Jason; Lallemand, Michael; Barron, Morgan; Kuckelman, John; Carter, Preston; Blair, Kelly; Martin, Matthew

    2016-05-01

    Staple line leak after sleeve gastrectomy (SG) is a rare but dreaded complication, with a reported incidence of 0% to 8%. Many surgeons routinely test the staple line with an intraoperative leak test (IOLT), but there is little evidence to validate this practice. In fact, there is a theoretical concern that the leak test may weaken the staple line and increase the risk of a postoperative leak. A retrospective review of all SGs performed over a 7-year period was conducted. Cases were grouped by whether an IOLT was performed and compared for the incidence of postoperative staple line leaks. The ability of the IOLT to identify a staple line defect and to predict a postoperative leak was analyzed. Five hundred forty-two SGs were performed between 2007 and 2014. Thirteen patients (2.4%) developed a postoperative staple line leak. The majority of patients (n = 494, 91%) received an IOLT, including all 13 patients (100%) who developed a subsequent clinical leak. There were no (0%) positive IOLTs, and no additional interventions were performed based on the IOLT. The IOLT sensitivity and positive predictive value were both 0%. There was a trend, although not significant, toward increased leak rates when a routine IOLT was performed vs no routine IOLT (2.6% vs 0%, P = .6). The performance of routine IOLT after SG provided no actionable information and was negative in all patients who developed a postoperative leak. The routine use of an IOLT did not reduce the incidence of postoperative leak, and in fact was associated with a higher leak rate after SG. Published by Elsevier Inc.

  18. Development of a personalized training system using the Lung Image Database Consortium and Image Database Resource Initiative Database.

    Science.gov (United States)

    Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong

    2014-12-01

    The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability of dynamically selecting suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database, which provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool to enable trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful for enhancing the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
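
    A minimal sketch of the collaborative-filtering half of a CBCF-style difficulty predictor, assuming a ratings table of trainee-by-case difficulty scores; the content-boosting step (filling sparse entries from case features) is omitted, and all names here are illustrative rather than the authors' implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two sparse rating dicts (case_id -> score)."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[k] * b[k] for k in shared)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def predict_difficulty(trainee, case_id, ratings):
    """Similarity-weighted average of other trainees' scores on this case.

    ratings: {trainee_id: {case_id: difficulty score}}.
    """
    target = ratings[trainee]
    num = den = 0.0
    for other, other_ratings in ratings.items():
        if other == trainee or case_id not in other_ratings:
            continue
        sim = cosine_similarity(target, other_ratings)
        num += sim * other_ratings[case_id]
        den += abs(sim)
    return num / den if den else None   # None: no informative neighbours
```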

  19. Database Description - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata]

    Lifescience Database Archive (English)

    Full Text Available Database name: Trypanosomes Database. Maintained by the National Institute of Genetics, Research Organization of Information and Systems, Yata 1111, Mishima, Shizuoka 411-8540, Japan. Taxonomy: Trypanosoma (Taxonomy ID: 5690); Homo sapiens (Taxonomy ID: 9606). External links: PDB (Protein Data Bank), KEGG PATHWAY Database, DrugPort. Entry list: available. Query search: available.

  20. Database Constraints Applied to Metabolic Pathway Reconstruction Tools

    Directory of Open Access Journals (Sweden)

    Jordi Vilaplana

    2014-01-01

    Full Text Available Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
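
    As an illustration of the kind of server-parameter tuning the abstract mentions (not the authors' actual settings), the following inspects and raises one MySQL parameter that strongly affects read-heavy workloads, using the mysql-connector-python package; the connection details, database name, and chosen value are placeholders.

```python
import mysql.connector  # mysql-connector-python

# Placeholder credentials; the database name echoes the shared organisms DB.
conn = mysql.connector.connect(host="localhost", user="metres",
                               password="***", database="organisms")
cur = conn.cursor()

# Inspect the InnoDB buffer pool size, a key knob for read-heavy access.
cur.execute("SHOW VARIABLES LIKE 'innodb_buffer_pool_size'")
print(cur.fetchone())

# Raise it to 4 GiB (dynamic in MySQL >= 5.7.5; requires privileges).
cur.execute("SET GLOBAL innodb_buffer_pool_size = 4294967296")
conn.close()
```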

  1. Database constraints applied to metabolic pathway reconstruction tools.

    Science.gov (United States)

    Vilaplana, Jordi; Solsona, Francesc; Teixido, Ivan; Usié, Anabel; Karathia, Hiren; Alves, Rui; Mateo, Jordi

    2014-01-01

    Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.

  2. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Non-relational "NoSQL" databases such as Cassandra and CouchDB are best known for their ability to scale to large numbers of clients spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects, is based on traditional SQL databases but also has the same high scalability and wide-area distributability for an important subset of applications. This paper compares the architectures, behavior, performance, and maintainability of the two different approaches and identifies the criteria for choosing which approach to prefer over the other.

  3. The MAR databases: development and implementation of databases specific for marine metagenomics.

    Science.gov (United States)

    Klemetsen, Terje; Raknes, Inge A; Fu, Juan; Agafonov, Alexander; Balasundaram, Sudhagar V; Tartari, Giacomo; Robertsen, Espen; Willassen, Nils P

    2018-01-04

    We introduce the marine databases MarRef, MarDB and MarCat (https://mmp.sfb.uit.no/databases/), which are publicly available resources that promote marine research and innovation. These data resources, which have been implemented in the Marine Metagenomics Portal (MMP) (https://mmp.sfb.uit.no/), are collections of richly annotated and manually curated contextual (metadata) and sequence databases representing three tiers of accuracy. While MarRef is a database of completely sequenced marine prokaryotic genomes, which represents a marine prokaryote reference genome database, MarDB includes all incompletely sequenced marine prokaryotic genomes regardless of their level of completeness. The last database, MarCat, represents a gene (protein) catalog of uncultivable (and cultivable) marine genes and proteins derived from marine metagenomics samples. The first versions of MarRef and MarDB contain 612 and 3726 records, respectively. Each record is built up of 106 metadata fields, including attributes for sampling, sequencing, assembly and annotation, in addition to organism and taxonomic information. Currently, MarCat contains 1227 records with 55 metadata fields. Ontologies and controlled vocabularies are used in the contextual databases to enhance consistency. The user-friendly web interface lets visitors browse, filter and search the contextual databases and perform BLAST searches against the corresponding sequence databases. All contextual and sequence databases are freely accessible and downloadable from https://s1.sfb.uit.no/public/mar/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. The Danish Testicular Cancer database

    DEFF Research Database (Denmark)

    Daugaard, Gedske; Kier, Maria Gry Gundgaard; Bandak, Mikkel

    2016-01-01

    AIM: The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. STUDY POPULATION: All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data collection has been performed from 1984 to 2007 and from 2013 onward, respectively. MAIN VARIABLES AND DESCRIPTIVE DATA: The retrospective DaTeCa database contains detailed information, with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function…

  5. Performance of an open-source heart sound segmentation algorithm on eight independent databases.

    Science.gov (United States)

    Liu, Chengyu; Springer, David; Clifford, Gari D

    2017-08-01

    Heart sound segmentation is a prerequisite step for the automatic analysis of heart sound signals, facilitating the subsequent identification and classification of pathological events. Recently, hidden Markov model-based algorithms have received increased interest due to their robustness in processing noisy recordings. In this study we aim to evaluate the performance of the recently published logistic regression based hidden semi-Markov model (HSMM) heart sound segmentation method, by using a wider variety of independently acquired data of varying quality. Firstly, we constructed a systematic evaluation scheme based on a new collection of heart sound databases, which we assembled for the PhysioNet/CinC Challenge 2016. This collection includes a total of more than 120 000 s of heart sounds recorded from 1297 subjects (including both healthy subjects and cardiovascular patients) and comprises eight independent heart sound databases sourced from multiple independent research groups around the world. Then, the HSMM-based segmentation method was evaluated using the assembled eight databases. The common evaluation metrics of sensitivity, specificity and accuracy, as well as the F1 measure, were used. In addition, the effect of varying the tolerance window for determining a correct segmentation was evaluated. The results confirm the high accuracy of the HSMM-based algorithm on a separate test dataset comprised of 102 306 heart sounds. An average F1 score of 98.5% for segmenting S1 and systole intervals and 97.2% for segmenting S2 and diastole intervals were observed. The F1 score was shown to increase with an increase in the tolerance window size, as expected. The high segmentation accuracy of the HSMM-based algorithm on a large database confirmed the algorithm's effectiveness. The described evaluation framework, combined with the largest collection of open access heart sound data, provides essential resources for…
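
    A minimal sketch of tolerance-window scoring as described above (our reading of the metric, not the authors' released code): a detected onset counts as a true positive if it lies within ±tol seconds of an as-yet-unmatched reference onset, and F1 combines the resulting precision and recall.

```python
def f1_with_tolerance(reference, detected, tol=0.1):
    """F1 score for event onsets (in seconds) with a +/- tol matching window."""
    reference, detected = sorted(reference), sorted(detected)
    matched = set()
    tp = 0
    for r in reference:
        for i, d in enumerate(detected):
            if i not in matched and abs(d - r) <= tol:
                matched.add(i)          # each detection matches at most once
                tp += 1
                break
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(reference) if reference else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f1_with_tolerance([0.50, 1.20, 1.90], [0.48, 1.31, 1.92], tol=0.1))
```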

  6. Motion database of disguised and non-disguised team handball penalty throws by novice and expert performers

    Directory of Open Access Journals (Sweden)

    Fabian Helm

    2017-12-01

    Full Text Available This article describes the motion database for a large sample (n = 2400) of 7-m penalty throws in team handball that includes 1600 disguised throws. Throws were performed by both novice (n = 5) and expert (n = 5) penalty takers. The article reports the methods and materials used to capture the motion data. The database itself is accessible for download via the JLU Web Server and provides all raw files in a three-dimensional motion data format (.c3d). Additional information is given on the marker placement of the penalty taker, goalkeeper, and ball, together with details on the skill level and/or playing history of the expert group. The database was first used by Helm et al. (2017) [1] to investigate the kinematic patterns of disguised movements. Results of this analysis are reported and discussed in their article “Kinematic patterns underlying disguised movements: Spatial and temporal dissimilarity compared to genuine movement patterns” (doi:10.1016/j.humov.2017.05.010) [1]. Keywords: Motion capture data, Disguise, Expertise
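
    A short sketch of loading one of the distributed .c3d files with the community c3d Python package; the package choice, the frame-iteration API as used here, and the file name are assumptions, since the article only specifies the file format.

```python
import c3d  # community C3D reader (pip install c3d) -- an assumed tool choice

# File name is a placeholder for one of the database's throw recordings.
with open('penalty_throw_001.c3d', 'rb') as handle:
    reader = c3d.Reader(handle)
    for frame_no, points, analog in reader.read_frames():
        # points: one row per marker -> x, y, z, residual, camera count
        print(frame_no, points.shape)
        break  # inspect just the first frame
```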

  7. 42 CFR 493.931 - Routine chemistry.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Routine chemistry. 493.931 Section 493.931 Public... Proficiency Testing Programs by Specialty and Subspecialty § 493.931 Routine chemistry. (a) Program content and frequency of challenge. To be approved for proficiency testing for routine chemistry, a program...

  8. BDVC (Bimodal Database of Violent Content): A database of violent audio and video

    Science.gov (United States)

    Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro

    2017-09-01

    Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization and retrieval applications that handle a single type of content, such as text, voice or images; bimodal databases, by contrast, make it possible to semantically associate two different types of content, such as audio-video or image-text, among others. Generating a bimodal audio-video database implies creating a connection between the multimedia contents through the semantic relation that associates the actions of both types of information. This paper describes in detail the characteristics and the methodology used to create the bimodal database of violent content; the semantic relationship is established through the proposed concepts that describe the audiovisual information. Using bimodal databases in applications related to audiovisual content processing increases semantic performance if and only if those applications process both types of content. The bimodal database contains 580 annotated audiovisual segments, with a total duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool for building applications for the semantic web.

  9. International Nuclear Safety Center (INSC) database

    International Nuclear Information System (INIS)

    Sofu, T.; Ley, H.; Turski, R.B.

    1997-01-01

    As an integral part of DOE's International Nuclear Safety Center (INSC) at Argonne National Laboratory, the INSC Database has been established to provide an interactively accessible information resource for the world's nuclear facilities and to promote free and open exchange of nuclear safety information among nations. The INSC Database is a comprehensive resource database aimed at a scope and level of detail suitable for safety analysis and risk evaluation for the world's nuclear power plants and facilities. It also provides an electronic forum for international collaborative safety research for the Department of Energy and its international partners. The database is intended to provide plant design information, material properties, computational tools, and results of safety analysis. Initial emphasis in data gathering is given to Soviet-designed reactors in Russia, the former Soviet Union, and Eastern Europe. The implementation is performed under the Oracle database management system, and the World Wide Web is used to serve as the access path for remote users. An interface between the Oracle database and the Web server is established through a custom designed Web-Oracle gateway which is used mainly to perform queries on the stored data in the database tables
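
    The gateway pattern described here, a web front end issuing parameterised queries against relational tables, can be sketched in a few lines. The snippet below only illustrates that pattern, with sqlite3 standing in for Oracle and entirely hypothetical table and column names; it is not the INSC gateway itself.

    import sqlite3

    def plants_by_reactor_type(conn, reactor_type):
        # Parameterised query, as a gateway would run on behalf of a remote user.
        cur = conn.execute(
            "SELECT name, country FROM plants WHERE reactor_type = ?",
            (reactor_type,),
        )
        return cur.fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE plants (name TEXT, country TEXT, reactor_type TEXT)")
    conn.execute("INSERT INTO plants VALUES ('Plant-1', 'Russia', 'RBMK-1000')")
    print(plants_by_reactor_type(conn, "RBMK-1000"))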

  10. Database Description - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SKIP Stemcell Database Database Description General information of database Database name SKIP Stemcell Database ... University Journal Search: Contact address http://www.skip.med.keio.ac.jp/en/contact/ Database classification Human Genes and Diseases Database classification Stemcell Article Organism Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database ... External Links: Original website information Database maintenance site Center for Medical Genetics, School of Medicine, ... Available Web services Not available URL of Web services - Need for user registration Not available About This Database

  11. Database Description - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Arabidopsis Phenome Database Database Description General information of database Database name Arabidopsis Phenome Database ... BioResource Center Hiroshi Masuya Database classification Plant databases - Arabidopsis thaliana Organism Taxonomy Name: Arabidopsis thaliana Taxonomy ID: 3702 Database description The Arabidopsis thaliana phenome is ... their effective application. We developed the new Arabidopsis Phenome Database integrating two novel databases ... useful materials for their experimental research. The other, the “Database of Curated Plant Phenome”, focusing

  12. An evaluation of routine specialist palliative care for patients on the Liverpool Care Pathway.

    Science.gov (United States)

    Thompson, Jo; Brown, Jayne; Davies, Andrew

    2014-01-01

    This report describes a service evaluation of the 'added value' of routine specialist palliative care team (SPCT) involvement with patients on the Liverpool Care Pathway for the Dying Patient (LCP). In the authors' hospital, patients that are commenced on the LCP are routinely referred to the SPCT. They are reviewed on the day of referral and then at least every other day, depending on the clinical situation. The data for this report was obtained by reviewing the SPCT's clinical database and the patients' LCP proformas. The SPCT intervened in the care of 80% of 158 newly referred patients, e.g. for alteration of continuous subcutaneous infusion (23%) or alteration of use of non-pharmacological interventions (21%). Furthermore, 11% of patients were taken off the LCP, around one quarter of whom were later put back on. The authors' model of care could overcome many of the issues relating to the LCP and would ameliorate the developing vacuum of care for patients at the end of life.

  13. Database on wind characteristics - Contents of database bank

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    and Denmark, with Denmark as the Operating Agent. The reporting of the continuation of Annex XVII falls in two separate parts. Part one accounts in detail for the available data in the established database bank, and part two describes various data analyses performed with the overall purpose of improving

  14. Comparison of dementia recorded in routinely collected hospital admission data in England with dementia recorded in primary care.

    Science.gov (United States)

    Brown, Anna; Kirichek, Oksana; Balkwill, Angela; Reeves, Gillian; Beral, Valerie; Sudlow, Cathie; Gallacher, John; Green, Jane

    2016-01-01

    Electronic linkage of UK cohorts to routinely collected National Health Service (NHS) records provides virtually complete follow-up for cause-specific hospital admissions and deaths. The reliability of dementia diagnoses recorded in NHS hospital data is not well documented. For a sample of Million Women Study participants in England we compared dementia recorded in routinely collected NHS hospital data (Hospital Episode Statistics: HES) with dementia recorded in two separate sources of primary care information: a primary care database [Clinical Practice Research Datalink (CPRD), n = 340] and a survey of study participants' General Practitioners (GPs, n = 244). Dementia recorded in HES fully agreed both with CPRD and with GP survey data for 85% of women; it did not agree for 1 and 4%, respectively. Agreement was uncertain for the remaining 14 and 11%, respectively; and among those classified as having uncertain agreement in CPRD, non-specific terms compatible with dementia, such as 'memory loss', were recorded in the CPRD database for 79% of the women. Agreement was significantly better in primary care (CPRD) than in hospital (HES) data. Age-specific rates for dementia based on the hospital admission data were lower than the rates based on the primary care data, but were similar if the delay in recording in HES was taken into account. Dementia recorded in routinely collected NHS hospital admission data for women in England agrees well with primary care records of dementia assessed separately from two different sources, and is sufficiently reliable for epidemiological research.

  15. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. The data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 9 figs
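
    The layout described, with specifications and measured properties for materials, configuration records tying items to cables, coils, and magnets, and test results, can be illustrated with a toy relational schema. This is a hypothetical simplification using sqlite3 for portability, not the actual MagCom/Sybase schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE materials (id INTEGER PRIMARY KEY, name TEXT, measured_value REAL);
    CREATE TABLE assemblies (id INTEGER PRIMARY KEY, kind TEXT);  -- 'cable', 'coil' or 'magnet'
    CREATE TABLE configuration (assembly_id INTEGER REFERENCES assemblies(id),
                                material_id INTEGER REFERENCES materials(id));
    CREATE TABLE test_results (assembly_id INTEGER REFERENCES assemblies(id),
                               quench_current_a REAL);
    INSERT INTO materials VALUES (1, 'conductor lot 7', 2750.0);   -- invented rows
    INSERT INTO assemblies VALUES (1, 'magnet');
    INSERT INTO configuration VALUES (1, 1);
    INSERT INTO test_results VALUES (1, 6500.0);
    """)
    # Correlating magnet performance with constituent properties becomes a join:
    rows = conn.execute("""
        SELECT m.name, m.measured_value, t.quench_current_a
        FROM test_results t
        JOIN configuration c ON c.assembly_id = t.assembly_id
        JOIN materials m ON m.id = c.material_id
    """).fetchall()
    print(rows)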

  16. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 10 figs

  17. Dual-Routine HCV/HIV Testing: Seroprevalence and Linkage to Care in Four Community Health Centers in Philadelphia, Pennsylvania.

    Science.gov (United States)

    Coyle, Catelyn; Kwakwa, Helena

    2016-01-01

    Despite common risk factors, screening for hepatitis C virus (HCV) and HIV at the same time as part of routine medical care (dual-routine HCV/HIV testing) is not commonly implemented in the United States. This study examined improvements in feasibility of implementation, screening increase, and linkage to care when a dual-routine HCV/HIV testing model was integrated into routine primary care. National Nursing Centers Consortium implemented a dual-routine HCV/HIV testing model at four community health centers in Philadelphia, Pennsylvania, on September 1, 2013. Routine HCV and opt-out HIV testing replaced the routine HCV and opt-in HIV testing model through medical assistant-led, laboratory-based testing and electronic medical record modification to prompt, track, report, and facilitate reimbursement for tests performed on uninsured individuals. This study examined testing, seropositivity, and linkage-to-care comparison data for the nine months before (December 1, 2012-August 31, 2013) and after (September 1, 2013-May 31, 2014) implementation of the dual-routine HCV/HIV testing model. A total of 1,526 HCV and 1,731 HIV tests were performed before, and 1,888 HCV and 3,890 HIV tests were performed after dual-routine testing implementation, resulting in a 23.7% increase in HCV tests and a 124.7% increase in HIV tests. A total of 70 currently HCV-infected and four new HIV-seropositive patients vs. 101 HCV-infected and 13 new HIV-seropositive patients were identified during these two periods, representing increases of 44.3% for HCV antibody-positive and RNA-positive tests and 225.0% for HIV-positive tests. Linkage to care increased from 27 currently infected HCV-positive and one HIV-positive patient pre-dual-routine testing to 39 HCV-positive and nine HIV-positive patients post-dual-routine testing. The dual-routine HCV/HIV testing model shows that integrating dual-routine testing in a primary care setting is possible and leads to increased HCV and HIV screening

  18. Role of Postoperative Vitamin D and/or Calcium Routine Supplementation in Preventing Hypocalcemia After Thyroidectomy: A Systematic Review and Meta-Analysis

    Science.gov (United States)

    Alhefdhi, Amal; Mazeh, Haggi

    2013-01-01

    Background. Transient hypocalcemia is a frequent complication after total thyroidectomy. Routine postoperative administration of vitamin D and calcium can reduce the incidence of symptomatic postoperative hypocalcemia. We performed a systematic review to assess the effectiveness of this intervention. The primary aim was to evaluate the efficacy of routine postoperative oral calcium and vitamin D supplementation in preventing symptomatic post-thyroidectomy hypocalcemia. The second aim was to draw clear guidelines regarding prophylactic calcium and/or vitamin D therapy for patients after thyroidectomy. Methods. We identified randomized controlled trials comparing the administration of vitamin D or its metabolites to calcium or no treatment in adult patients after thyroidectomy. The search was performed in PubMed, Cochrane Library, Cumulative Index to Nursing and Allied Health Literature, Google Scholar, and Web of Knowledge databases. Patients with a history of previous neck surgery, calcium supplementation, or renal impairment were excluded. Results. Nine studies with 2,285 patients were included: 22 in the vitamin D group, 580 in the calcium group, 792 in the vitamin D and calcium group, and 891 in the no intervention group, with symptomatic hypocalcemia incidences of 4.6%, 14%, 14%, and 20.5%, respectively. Subcomparisons demonstrated that the incidences of postoperative hypocalcemia were 10.1% versus 18.8% for calcium versus no intervention and 6.8% versus 25.9% for vitamin D and calcium versus no intervention. The studies showed a significant range of variability in patients' characteristics. Conclusions. A significant decrease in postoperative hypocalcemia was identified in patients who received routine supplementation of oral calcium or vitamin D. The incidence decreased even more with the combined administration of both supplements. Based on this analysis, we recommend oral calcium for all patients following thyroidectomy, with the addition of vitamin D for

  19. Mining routinely collected acute data to reveal non-linear relationships between nurse staffing levels and outcomes.

    Science.gov (United States)

    Leary, Alison; Cook, Rob; Jones, Sarahjane; Smith, Judith; Gough, Malcolm; Maxwell, Elaine; Punshon, Geoffrey; Radford, Mark

    2016-12-16

    Nursing is a safety critical activity but not easily quantified. This makes the building of predictive staffing models a challenge. The aim of this study was to determine if relationships between registered and non-registered nurse staffing levels and clinical outcomes could be discovered through the mining of routinely collected clinical data. The secondary aim was to examine the feasibility and develop the use of 'big data' techniques commonly used in industry for this area of healthcare and examine future uses. The data were obtained from 1 large acute National Health Service hospital trust in England. Routinely collected physiological, sign and symptom data from a clinical database were extracted, imported and mined alongside a bespoke staffing and outcomes database using Mathematica V.10. The physiological data consisted of 120 million patient entries over 6 years; the bespoke database consisted of 9 years of daily data on staffing levels and safety factors such as falls. The objectives were to discover patterns or non-linear relationships in these data that would contribute to modelling, and to examine the feasibility of this technique in this field. After mining, 40 statistically significant correlations were found between clinical data (such as the presence or absence of nausea) and staffing factors. Several inter-related factors demonstrated step changes where registered nurse availability appeared to relate to physiological parameters or outcomes such as falls and the management of symptoms. Data extraction proved challenging as some commercial databases were not built for extraction of the massive data sets they contain. The relationship between staffing and outcomes appears to exist. It appears to be non-linear but calculable and a data-driven model appears possible. These findings could be used to build an initial mathematical model for acute staffing which could be further tested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
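
    The exhaustive pairwise screening described above can be sketched in a few lines of pandas (standing in here for Mathematica), with invented column names and toy numbers.

    import pandas as pd

    df = pd.DataFrame({
        "rn_per_bed":   [0.9, 1.1, 0.8, 1.3, 0.7, 1.2],   # registered nurses
        "hca_per_bed":  [1.5, 1.2, 1.6, 1.1, 1.8, 1.0],   # non-registered staff
        "falls":        [4, 2, 5, 1, 6, 2],
        "nausea_noted": [7, 5, 8, 3, 9, 4],
    })
    staffing = ["rn_per_bed", "hca_per_bed"]
    outcomes = ["falls", "nausea_noted"]

    # Screen every staffing/outcome pair; in the study the surviving correlations
    # were then inspected for non-linear step changes rather than taken at face value.
    for s in staffing:
        for o in outcomes:
            print(s, o, round(df[s].corr(df[o], method="spearman"), 2))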

  20. Microfoundations of Routines and Capabilities

    DEFF Research Database (Denmark)

    Felin, Teppo; Foss, Nicolai Juul; Heimriks, Koen H.

    We discuss the microfoundations of routines and capabilities, including why a microfoundations view is needed and how it may inform work on organizational and competitive heterogeneity. Building on extant research, we identify three primary categories of micro-level components underlying routines...

  1. Improved technical performance of a multifunctional prehospital telemedicine system between the research phase and the routine use phase - an observational study.

    Science.gov (United States)

    Felzen, Marc; Brokmann, Jörg C; Beckers, Stefan K; Czaplik, Michael; Hirsch, Frederik; Tamm, Miriam; Rossaint, Rolf; Bergrath, Sebastian

    2017-04-01

    Introduction Telemedical concepts in emergency medical services (EMS) lead to improved process times and patient outcomes, but their technical performance has thus far been insufficient; nevertheless, the concept was transferred into EMS routine care in Aachen, Germany. This study evaluated the system's technical performance and compared it to a precursor system. Methods The telemedicine system was implemented on seven ambulances and a teleconsultation centre staffed with experienced EMS physicians was established in April 2014. Telemedical applications included mobile vital data transmission, 12-lead ECG, picture transmission and video streaming from inside the ambulances. The tele-EMS physician filled in a questionnaire regarding the technical performance of the applications, background noise, and the clinical value of the transmitted pictures and videos after each mission between 15 May and 15 October 2014. Results Teleconsultation was established during 539 emergency cases. In 83% of the cases (n = 447), only the paramedics and the tele-EMS physician were involved. Transmission success rates ranged from 98% (audio connection) to 93% (12-lead electrocardiogram (ECG) transmission). All functionalities, except video transmission, performed significantly better than in the pilot project (p < 0.05). Severe background noise was detected to a lesser extent (p = 0.0004), and the pictures and videos were considered significantly more clinically valuable. Discussion The multifunctional system is now sufficient for routine use and is the most reliable mobile emergency telemedicine system compared to other published projects. Dropouts were due to user errors and network coverage problems. These findings enable widespread use of this system in the future, reducing the critical time intervals until medical therapy is started.

  2. Development of a PSA information database system

    International Nuclear Information System (INIS)

    Kim, Seung Hwan

    2005-01-01

    The need to develop a PSA information database for performing a PSA has been growing rapidly. Performing a PSA requires a lot of data to analyze, to evaluate the risk, to trace the process of results and to verify the results. A PSA information database is a system that stores all PSA-related information in a database and file system, with cross links to jump to the physical documents whenever they are needed. The Korea Atomic Energy Research Institute is developing a PSA information database system, AIMS (Advanced Information Management System for PSA). The objective is to integrate and computerize all the distributed information of a PSA into one system and to enhance the accessibility to PSA information for all PSA-related activities. This paper describes how we implemented such a database-centered application in view of two areas: database design and data (document) services

  3. SVM detection of epileptiform activity in routine EEG.

    LENUS (Irish Health Repository)

    Kelleher, Daniel

    2010-01-01

    Routine electroencephalogram (EEG) is an important test in aiding the diagnosis of patients with suspected epilepsy. These recordings typically last 20-40 minutes, during which signs of abnormal activity (spikes, sharp waves) are looked for in the EEG trace. It is essential that events of short duration are detected during the routine EEG test. The work presented in this paper examines the effect of changing a range of input values to the detection system on its ability to distinguish between normal and abnormal EEG activity. It is shown that analysis window lengths in the range of 0.5 s to 1 s are well suited to the task. Additionally, it is reported that patient-specific systems should be used where possible due to their better performance.
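
    To make the set-up concrete, the sketch below trains an SVM on features extracted from 1 s windows of synthetic EEG-like signals. It mirrors the windowed-classification idea only; the features, data and parameters are invented, not the authors' pipeline.

    import numpy as np
    from sklearn.svm import SVC

    fs = 256                        # assumed sampling rate in Hz
    win = int(1.0 * fs)             # 1 s analysis window

    def features(segment):
        # Crude stand-ins for spike-sensitive features.
        return [segment.std(), np.abs(np.diff(segment)).max()]

    rng = np.random.default_rng(0)
    normal = [rng.normal(0, 1, win) for _ in range(50)]
    spiky = [rng.normal(0, 1, win) + np.where(rng.random(win) < 0.02, 8.0, 0.0)
             for _ in range(50)]

    X = np.array([features(s) for s in normal + spiky])
    y = np.array([0] * 50 + [1] * 50)
    clf = SVC(kernel="rbf").fit(X, y)
    print("training accuracy:", clf.score(X, y))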

  4. Routine data for disease surveillance in the undeveloped region of the OR Tambo district of the Eastern Cape Province.

    Science.gov (United States)

    Kabuya, Chrispin; Wright, Graham; Odama, Anthony; O'Mahoney, Don

    2014-01-01

    The research team needed to upsize the solution previously tested so that it could expand the routine data collected via tablet computers. The research team identified the general flow of data within clinics. Data was mainly collected from registers, which were later converted to electronic form and checked for duplication. A database was designed for the collection of demographic data (the Patient Master Index), which was aimed at eliminating duplication of patients' data across several registers. Open Data Kit (ODK) Collect was set up on Android tablets for collecting disease-related routine data, while ODK Aggregate, which stores and aggregates the data captured by ODK Collect alongside the Patient Master Index demographic data, was set up on an Apple Mac Mini server. Data collection is in progress. The expected results include, firstly, improved data quality, reliability and quick access to summary data; secondly, instant retrieval of patient demographic details and clinic numbers; thirdly, the ability to produce standard reports from the SQL database; and lastly, the export of data into the TIER.net and DHIS systems via CSV files, thus eliminating the need for data capturers.
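
    The Patient Master Index idea, demographics stored exactly once with every register row keyed by clinic number, can be sketched with a toy schema. The table and field names are hypothetical, not the project's actual design.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE patient_master_index (
        clinic_no  TEXT PRIMARY KEY,
        name       TEXT,
        birth_date TEXT,
        village    TEXT
    );
    CREATE TABLE disease_register (
        clinic_no  TEXT REFERENCES patient_master_index(clinic_no),
        visit_date TEXT,
        diagnosis  TEXT
    );
    """)
    conn.execute("INSERT INTO patient_master_index VALUES ('C0001', 'N. Doe', '1980-01-01', 'Village A')")
    conn.execute("INSERT INTO disease_register VALUES ('C0001', '2013-06-01', 'hypertension')")
    # Summary reporting for export (e.g. to DHIS via CSV) becomes a plain query:
    print(conn.execute(
        "SELECT diagnosis, COUNT(*) FROM disease_register GROUP BY diagnosis").fetchall())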

  5. Dansk Hjerteregister--en klinisk database

    DEFF Research Database (Denmark)

    Abildstrøm, Steen Zabell; Kruse, Marie; Rasmussen, Søren

    2008-01-01

    INTRODUCTION: The Danish Heart Registry (DHR) keeps track of all coronary angiographies (CATH), percutaneous coronary interventions (PCI), coronary artery bypass grafting (CABG), and adult heart valve surgery performed in Denmark. DHR is a clinical database established in order to follow the activity...

  6. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Selection of thermodynamic data of selenium

    International Nuclear Information System (INIS)

    Doi, Reisuke; Kitamura, Akira; Yui, Mikazu

    2010-02-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level and TRU radioactive wastes, a selection of thermodynamic data on the inorganic compounds and complexes of selenium was carried out. The selection was based on the thermodynamic database of selenium published by the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). Shortcomings that the authors found in the OECD/NEA database are noted in this report, and the thermodynamic data were then reviewed against the latest literature. Some thermodynamic values for iron selenides were not selected by the OECD/NEA due to low reliability, but they are important for the performance assessment of geological disposal of radioactive wastes, so we selected them as tentative values while specifying their reliability and the need for the values to be determined. (author)

  7. Clearer, Simpler and more Efficient LAPACK Routines for Symmetric Positive Definite Band Factorization

    DEFF Research Database (Denmark)

    Gustavson, Fred G.; Quintania-Orti, Enrique S.; Quintana-Orti, Gregorio

    We describe a minor format change for representing a symmetric band matrix AB using the same array space specified by LAPACK. In LAPACK, band codes operating on the lower part of a symmetric matrix reference matrix element (i, j) as AB(1+i-j, j). The format change we propose allows LAPACK band codes to reference the (i, j) element as AB(i, j). Doing this yields lower band codes that use standard matrix terminology so that they become clearer and hence easier to understand. As a second contribution, we simplify the LAPACK Cholesky Band Factorization routine pbtrf by reducing from six to three the number of subroutine calls one needs to invoke during a right-looking block factorization step. Our new routines perform exactly the same number of floating-point arithmetic operations as the current LAPACK routine pbtrf. Almost always they deliver higher performance. The experimental results show...
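
    The indexing change is easy to visualise outside Fortran. Below is a 0-based Python sketch (LAPACK itself is 1-based) of lower symmetric band storage: the same array serves both conventions, and a small accessor provides the proposed AB(i, j) view, which in the paper is achieved by adjusting each column's base pointer so band codes literally write AB(i, j).

    import numpy as np

    n, kd = 6, 2                              # matrix order and bandwidth
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(max(0, i - kd), i + 1):
            A[i, j] = A[j, i] = 10 * i + j + 1

    AB = np.zeros((kd + 1, n))                # LAPACK-style band array
    for j in range(n):
        for i in range(j, min(n, j + kd + 1)):
            AB[i - j, j] = A[i, j]            # LAPACK: element (i, j) sits in row i - j

    def ab(i, j):
        # The proposed view: refer to A(i, j) directly through the band array.
        return AB[i - j, j]

    print(ab(4, 2) == A[4, 2])                # True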

  8. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  9. Assessing U.S. ESCO industry performance and market trends: Results from the NAESCO database project

    International Nuclear Information System (INIS)

    Osborn, Julie; Goldman, Chuck; Hopper, Nicole; Singer, Terry

    2002-01-01

    The U.S. Energy Services Company (ESCO) industry is often cited as the most successful model for the private sector delivery of energy-efficiency services. This study documents the actual performance of the ESCO industry in order to provide policymakers and investors with objective information, and customers with a resource for benchmarking proposed projects relative to industry performance. We have assembled a database of nearly 1500 case studies of energy-efficiency projects - the most comprehensive data set of the U.S. ESCO industry available. These projects include $2.55B of work completed by 51 ESCOs and span much of the history of this industry

  10. Consuming technologies - developing routines

    DEFF Research Database (Denmark)

    Gram-Hanssen, Kirsten

    2008-01-01

    technologies and in this article these processes will be investigated from three different perspectives: an historical perspective of how new technologies have entered homes, a consumer perspective of how both houses and new technologies are purchased and finally, as the primary part of the article, a user...... perspective of how routines develop while these technologies are being used. In the conclusion these insights are discussed in relation to possible ways of influencing routines....

  11. Development of a dose database in the refuelling scenario of a nuclear power plant for a virtual reality application

    International Nuclear Information System (INIS)

    Rodenas, J.; Zarza, I.; Pascual, A.; Felipe, A.; Sanchez-Mayoral, M.L.

    2002-01-01

    Operators in nuclear power plants can receive high doses during refuelling operations. A training program simulating refuelling operations will be useful for reducing the doses received by workers as well as for minimising operation time. With this goal in mind, a Virtual Reality application is being developed within the framework of the CIPRES project (Calculos Interactivos de Proteccion Radiologica en un Entorno de Simulacion - Interactive Calculations of Radiological Protection in a Simulation Environment), an R&D project sponsored by IBERINCO and developed jointly by IBERINCO and the Nuclear Engineering Department of the Polytechnic University of Valencia. The Virtual Reality application requires the possibility of displaying doses, both instantaneous and accumulated, at all times during operator training. Therefore, it is necessary to elaborate a database containing dose rates at every point of the refuelling plant. This database is elaborated from Radiological Protection Surveillance data measured throughout the plant during refuelling operation. To estimate doses throughout the refuelling plant, some interpolation routines have been used. Different assumptions have been adopted in order to perform the interpolation and obtain consistent data. In this paper, the procedures developed to elaborate the dose database for the Virtual Reality application are presented and analysed
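
    One plausible reading of the interpolation step is scattered-data interpolation of surveyed dose rates onto arbitrary positions. The sketch below uses scipy's griddata with invented coordinates and dose rates; the actual CIPRES routines and assumptions are more involved than this.

    import numpy as np
    from scipy.interpolate import griddata

    survey_xy = np.array([[0, 0], [10, 0], [0, 8], [10, 8], [5, 4]], float)  # metres
    dose_rate = np.array([2.0, 5.0, 1.5, 8.0, 3.0])                          # uSv/h

    operator_pos = np.array([[3.0, 2.0]])
    est = griddata(survey_xy, dose_rate, operator_pos, method="linear")
    print("estimated dose rate:", est[0], "uSv/h")
    # Accumulated dose during a training session is then a time integral of
    # these instantaneous estimates along the operator's path.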

  12. Exploring emergency department 4-hour target performance and cancelled elective operations: a regression analysis of routinely collected and openly reported NHS trust data.

    Science.gov (United States)

    Keogh, Brad; Culliford, David; Guerrero-Ludueña, Richard; Monks, Thomas

    2018-05-24

    To quantify the effect of intrahospital patient flow on emergency department (ED) performance targets and indicate if the expectations set by the National Health Service (NHS) England 5-year forward review are realistic in returning emergency services to previous performance levels. Linear regression analysis of routinely reported trust activity and performance data using a series of cross-sectional studies. NHS trusts in England submitting routine nationally reported measures to NHS England. 142 acute non-specialist trusts operating in England between 2012 and 2016. The primary outcome measures were proportion of 4-hour waiting time breaches and cancelled elective operations. Univariate and multivariate linear regression models were used to show relationships between the outcome measures and various measures of trust activity including empty day beds, empty night beds, day bed to night bed ratio, ED conversion ratio and delayed transfers of care. Univariate regression results using the outcome of 4-hour breaches showed clear relationships with empty night beds and ED conversion ratio between 2012 and 2016. The day bed to night bed ratio showed an increasing ability to explain variation in performance between 2015 and 2016. Delayed transfers of care showed little evidence of an association. Multivariate model results indicated that the ability of patient flow variables to explain 4-hour target performance had reduced between 2012 and 2016 (19% to 12%), and had increased in explaining cancelled elective operations (7% to 17%). The flow of patients through trusts is shown to influence ED performance; however, performance has become less explainable by intratrust patient flow between 2012 and 2016. Some commonly stated explanatory factors such as delayed transfers of care showed limited evidence of being related. The results indicate some of the measures proposed by NHS England to reduce pressure on EDs may not have the desired impact on returning services to previous
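
    The study's model family reduces to least-squares regression of an outcome on flow variables. A toy univariate sketch with invented numbers and hypothetical variable names, not the NHS data themselves:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    empty_night_beds = np.array([[40], [25], [60], [15], [35], [50]])  # per trust
    breach_rate = np.array([0.08, 0.15, 0.05, 0.22, 0.11, 0.06])       # 4-hour breaches

    model = LinearRegression().fit(empty_night_beds, breach_rate)
    print("slope:", model.coef_[0])
    print("R^2:", model.score(empty_night_beds, breach_rate))  # cf. the explained-variance figures above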

  13. Diet History Questionnaire: Database Revision History

    Science.gov (United States)

    The following details all additions and revisions made to the DHQ nutrient and food database. This revision history is provided as a reference for investigators who may have performed analyses with a previous release of the database.

  14. Comparing routine neurorehabilitation program with trunk exercises based on Bobath concept in multiple sclerosis: pilot study.

    Science.gov (United States)

    Keser, Ilke; Kirdi, Nuray; Meric, Aydin; Kurne, Asli Tuncer; Karabudak, Rana

    2013-01-01

    This study compared trunk exercises based on the Bobath concept with routine neurorehabilitation approaches in multiple sclerosis (MS). Bobath and routine neurorehabilitation exercise groups were evaluated. MS cases were divided into two groups. Both groups joined a 3 d/wk rehabilitation program for 8 wk. The experimental group performed trunk exercises based on the Bobath concept, and the control group performed routine neurorehabilitation exercises. Additionally, both groups performed balance and coordination exercises. All patients were evaluated with the Trunk Impairment Scale (TIS), Berg Balance Scale (BBS), International Cooperative Ataxia Rating Scale (ICARS), and Multiple Sclerosis Functional Composite (MSFC) before and after the physiotherapy program. In group analysis, TIS, BBS, ICARS, and MSFC scores and strength of abdominal muscles were significantly different after treatment in both groups (p < 0.05). Although trunk exercises based on the Bobath concept are rarely applied in MS rehabilitation, the results of this study show that they are as effective as routine neurorehabilitation exercises. Therefore, trunk exercises based on the Bobath concept can be beneficial in MS rehabilitation programs.

  15. Google Scholar Out-Performs Many Subscription Databases when Keyword Searching. A Review of: Walters, W. H. (2009. Google Scholar search performance: Comparative recall and precision. portal: Libraries and the Academy, 9(1, 5-24.

    Directory of Open Access Journals (Sweden)

    Giovanna Badia

    2010-09-01

    Full Text Available Objective – To compare the search performance (i.e., recall and precision) of Google Scholar with that of 11 other bibliographic databases when using a keyword search to find references on later-life migration. Design – Comparative database evaluation. Setting – Not stated in the article. It appears from the author’s affiliation that this research took place in an academic institution of higher learning. Subjects – Twelve databases were compared: Google Scholar, Academic Search Elite, AgeLine, ArticleFirst, EconLit, Geobase, Medline, PAIS International, Popline, Social Sciences Abstracts, Social Sciences Citation Index, and SocIndex. Methods – The relevant literature on later-life migration was pre-identified as a set of 155 journal articles published from 1990 to 2000. The author selected these articles from database searches, citation tracking, journal scans, and consultations with social sciences colleagues. Each database was evaluated with regard to its performance in finding references to these 155 papers. Elderly and migration were the keywords used to conduct the searches in each of the 12 databases, since these were the words most frequently used in the titles of the 155 relevant articles. The search was performed in the most basic search interface of each database that allowed limiting results by the needed publication dates (1990-2000). Search results were sorted by relevance when possible (for 9 out of the 12 databases), and by date when the relevance sorting option was not available. Recall and precision statistics were then calculated from the search results. Recall is the number of relevant results obtained in the database for a search topic, divided by all the potential results which can be obtained on that topic (in this case, 155 references). Precision is the number of relevant results obtained in the database for a search topic, divided by the total number of results that were obtained in the database on
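
    The recall and precision definitions quoted in the Methods reduce to two set operations. A minimal sketch with hypothetical result sets:

    relevant = set(range(155))                            # the pre-identified articles
    retrieved = set(range(60)) | set(range(400, 440))     # one database's invented hits

    hits = relevant & retrieved
    recall = len(hits) / len(relevant)       # share of the 155 that were found
    precision = len(hits) / len(retrieved)   # share of the results that were relevant
    print(f"recall={recall:.2f} precision={precision:.2f}")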

  16. Integration of short bouts of physical activity into organizational routine a systematic review of the literature.

    Science.gov (United States)

    Barr-Anderson, Daheia J; AuYoung, Mona; Whitt-Glover, Melicia C; Glenn, Beth A; Yancey, Antronette K

    2011-01-01

    Recommended daily physical activity can be accumulated in short intervals (e.g., 10 minutes at a time) integrated into organizational routine as part of the regular "conduct of business." PubMed, MEDLINE, and Google Scholar databases were searched in August 2009 (updated search in February and July 2010) to identify relevant, peer-reviewed journal articles and abstracts on school-, worksite-, and faith-based interventions of short, structurally integrated physical activity breaks. The majority of interventions implemented daily physical activity bouts of 10-15 minutes in length. Schools were the most common settings among the 40 published articles included in this review. The rigor of the studies varied by setting, with more than 75% of worksite versus 25% of school studies utilizing RCT designs. Studies focused on a broad range of outcomes, including academic/work performance indicators, mental health outcomes, and clinical disease risk indicators, in addition to physical activity level. Physical activity was the most commonly assessed outcome in school-based studies, with more than half of studies assessing and observing improvements in physical activity outcomes following the intervention. About a quarter of worksite-based studies assessed physical activity, and the majority found a positive effect of the intervention on physical activity levels. About half of studies also observed improvements in other relevant outcomes such as academic and work performance indicators (e.g., academic achievement, cognitive performance, work productivity); psychosocial factors (e.g., stress, mood); and clinical disease risk indicators (e.g., blood pressure, BMI). The average study duration was more than 1 year, and several reported outcomes at 3-6 years. Interventions integrating physical activity into organizational routine during everyday life have demonstrated modest but consistent benefits, particularly for physical activity, and these are promising avenues of investigation. The proportionately longer-term outcomes

  17. Investigations of CR39 dosimeters for neutron routine dosimetry

    International Nuclear Information System (INIS)

    Weinstein, M.; Abraham, A.; Tshuva, A.; German, U.

    2004-01-01

    CR-39 is a polymeric nuclear track detector which is widely used for neutron dosimetry. CR-39 detector development was conducted at a number of laboratories throughout the world (1,2), and the detector was also accepted for routine dosimetry. However, there are shortcomings which must be taken into consideration: the lack of a dosimetry-grade material, which causes batch variations; significant angular dependence; and a moderate sensitivity. CR-39 also under-responds for certain classes of neutron spectra (lower energy neutrons from reactors or high energy accelerator-produced neutrons). In order to introduce CR-39 as a routine dosimeter at NRCN, a series of checks was performed. The present work describes the results of some of our checks to characterize the main properties of CR-39 dosimeters

  18. Routine-industrial planning in the ATOMMASh enterprise

    International Nuclear Information System (INIS)

    Zabara, V.N.; Kovalev, B.V.; Bobrov, A.A.; Gostishchev, V.S.; Edikhanov, V.P.

    1987-01-01

    The structure of the automated system for routine-industrial planning developed at the ATOMMASh enterprise is considered. Eleven tasks were developed, enabling calculation of the durations of fabrication cycles and of lead times for releasing parts to departments, schedules of part production in departments for the year, quarter and month, production plans in norm-hours, and equipment utilization, as well as tasks providing for schedule performance monitoring. All operational data on the state of production are concentrated in the database of operational control

  19. A motional Stark effect diagnostic analysis routine for improved resolution of iota in the core of the large helical device.

    Science.gov (United States)

    Dobbins, T J; Ida, K; Suzuki, C; Yoshinuma, M; Kobayashi, T; Suzuki, Y; Yoshida, M

    2017-09-01

    A new Motional Stark Effect (MSE) analysis routine has been developed for improved spatial resolution in the core of the Large Helical Device (LHD). The routine was developed to reduce the dependency of the analysis on the Pfirsch-Schlüter (PS) current in the core. The technique used the change in the polarization angle as a function of flux in order to find the value of diota/dflux at each measurement location. By integrating inwards from the edge, the iota profile can be recovered from this method. This reduces the results' dependency on the PS current because the effect of the PS current on the MSE measurement is almost constant as a function of flux in the core; therefore, the uncertainty in the PS current has a minimal effect on the calculation of the iota profile. In addition, the VMEC database was remapped from flux into r/a space by interpolating in mode space in order to improve the database core resolution. These changes resulted in a much smoother iota profile, conforming more to the physics expectations of standard discharge scenarios in the core of the LHD.
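
    The integrate-inward step lends itself to a short numerical sketch. The following assumes a known edge value of iota and an invented gradient profile; it illustrates cumulative trapezoidal integration from edge to core, not the authors' actual routine.

    import numpy as np

    flux = np.linspace(1.0, 0.0, 50)        # edge (1.0) -> core (0.0)
    diota_dflux = 0.8 + 0.4 * flux          # hypothetical measured d(iota)/d(flux)
    iota_edge = 1.5                         # assumed known at the boundary

    # Trapezoidal cumulative integral, starting at the edge and moving inwards.
    increments = 0.5 * (diota_dflux[1:] + diota_dflux[:-1]) * np.diff(flux)
    iota = iota_edge + np.concatenate(([0.0], np.cumsum(increments)))
    print("core iota:", round(iota[-1], 3))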

  20. Applicability and Efficiency of NGS in Routine Diagnosis: In-Depth Performance Analysis of a Complete Workflow for CFTR Mutation Analysis.

    Directory of Open Access Journals (Sweden)

    Adrien Pagin

    Full Text Available Actually, about 2000 sequence variations have been documented in the CFTR gene requiring extensive and multi-step genetic testing in the diagnosis of cystic fibrosis and CFTR-related disorders. We present a two phases study, with validation and performance monitoring, of a single experiment methodology based on multiplex PCR and high throughput sequencing that allows detection of all variants, including large rearrangements, affecting the coding regions plus three deep intronic loci.A total of 340 samples, including 257 patients and 83 previously characterized control samples, were sequenced in 17 MiSeq runs and analyzed with two bioinformatic pipelines in routine diagnostic conditions. We obtained 100% coverage for all the target regions in every tested sample.We correctly identified all the 87 known variants in the control samples and successfully confirmed the 62 variants identified among the patients without observing false positive results. Large rearrangements were identified in 18/18 control samples. Only 17 patient samples showed false positive signals (6.6%, 12 of which showed a borderline result for a single amplicon. We also demonstrated the ability of the assay to detect allele specific dropout of amplicons when a sequence variation occurs at a primer binding site thus limiting the risk for false negative results.We described here the first NGS workflow for CFTR routine analysis that demonstrated equivalent diagnostic performances compared to Sanger sequencing and multiplex ligation-dependent probe amplification. This study illustrates the advantages of NGS in term of scalability, workload reduction and cost-effectiveness in combination with an improvement of the overall data quality due to the simultaneous detection of SNVs and large rearrangements.

  1. Liver biopsy performance and histological findings among patients with chronic viral hepatitis: a Danish database study

    DEFF Research Database (Denmark)

    Christensen, Peer Brehm; Krarup, Henrik Bygum; Møller, Axel

    2007-01-01

    We investigated the variance of liver biopsy frequency and histological findings among patients with chronic viral hepatitis attending 10 medical centres in Denmark. Patients who tested positive for HBsAg or HCV-RNA were retrieved from a national clinical database (DANHEP), and demographic data, laboratory analyses and liver biopsy results were collected. A total of 1586 patients were identified, of whom 69.7% had hepatitis C, 28.9% hepatitis B, and 1.5% were coinfected. In total, 771 (48.6%) had a biopsy performed (range 33.3-78.7%). According to the Metavir classification, 29.3% had septal fibrosis ... and cirrhosis had developed in 23% after 20 y of infection. Age above 40 y was a better predictor of cirrhosis than elevated ALT. National database comparison may identify factors of importance for improved management of patients with chronic viral hepatitis.

  2. Molecule database framework: a framework for creating database applications with chemical structure search capability.

    Science.gov (United States)

    Kiener, Joos

    2013-12-11

    Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify the development of applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: support for multi-component compounds (mixtures); import and export of SD-files; and optional security (authorization). For chemical structure searching Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Furthermore the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and the import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework
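
    To show what a structure search does, the sketch below performs a substructure match directly with RDKit. This is purely illustrative: the framework itself delegates such searches to the Bingo cartridge and exposes them as Java method calls, so this is not its API.

    from rdkit import Chem

    library = [Chem.MolFromSmiles(s) for s in ("c1ccccc1O", "CCO", "c1ccccc1N")]
    query = Chem.MolFromSmiles("c1ccccc1")     # benzene ring as the substructure query

    hits = [Chem.MolToSmiles(m) for m in library if m.HasSubstructMatch(query)]
    print(hits)                                # phenol and aniline match, ethanol does not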

  3. Real-time whole-genome sequencing for routine typing, surveillance, and outbreak detection of verotoxigenic Escherichia coli.

    OpenAIRE

    Joensen, Katrine Grimstrup; Scheutz, Flemming; Lund, Ole; Hasman, Henrik; Kaas, Rolf Sommer; Nielsen, Eva M.; Aarestrup, Frank Møller

    2014-01-01

    Fast and accurate identification and typing of pathogens are essential for effective surveillance and outbreak detection. The current routine procedure is based on a variety of techniques, making the procedure laborious, time-consuming, and expensive. With whole-genome sequencing (WGS) becoming cheaper, it has huge potential in both diagnostics and routine surveillance. The aim of this study was to perform a real-time evaluation of WGS for routine typing and surveillance of verocytotoxin-prod...

  4. Real-Time Whole-Genome Sequencing for Routine Typing, Surveillance, and Outbreak Detection of Verotoxigenic Escherichia coli

    OpenAIRE

    Joensen, Katrine Grimstrup; Scheutz, Flemming; Lund, Ole; Hasman, Henrik; Kaas, Rolf S.; Nielsen, Eva M.; Aarestrup, Frank M.

    2014-01-01

    Fast and accurate identification and typing of pathogens are essential for effective surveillance and outbreak detection. The current routine procedure is based on a variety of techniques, making the procedure laborious, time-consuming, and expensive. With whole-genome sequencing (WGS) becoming cheaper, it has huge potential in both diagnostics and routine surveillance. The aim of this study was to perform a real-time evaluation of WGS for routine typing and surveillance of verocytotoxin-prod...

  5. The quality of clinical maternal and neonatal healthcare - a strategy for identifying 'routine care signal functions'.

    Directory of Open Access Journals (Sweden)

    Stephan Brenner

    Full Text Available A variety of clinical process indicators exists to measure the quality of care provided by maternal and neonatal health (MNH) programs. To allow comparison across MNH programs in low- and middle-income countries (LMICs), a core set of essential process indicators is needed. Although such a core set is available for emergency obstetric care (EmOC), the 'EmOC signal functions', a similar approach is currently missing for MNH routine care evaluation. We describe a strategy for identifying core process indicators for routine care and illustrate their usefulness in a field example. We first developed an indicator selection strategy by combining epidemiological and programmatic aspects relevant to MNH in LMICs. We then identified routine care process indicators meeting our selection criteria by reviewing existing quality of care assessment protocols. We grouped these indicators into three categories based on their main function in addressing risk factors of maternal or neonatal complications. We then tested this indicator set in a study assessing MNH quality of clinical care in 33 health facilities in Malawi. Our strategy identified 51 routine care processes: 23 related to initial patient risk assessment, 17 to risk monitoring, 11 to risk prevention. During the clinical performance assessment a total of 82 cases were observed. Birth attendants' adherence to clinical standards was lowest in relation to risk monitoring processes. In relation to major complications, routine care processes addressing fetal and newborn distress were performed relatively consistently, but there were major gaps in the performance of routine care processes addressing bleeding, infection, and pre-eclampsia risks. The identified set of process indicators could identify major gaps in the quality of obstetric and neonatal care provided during the intra- and immediate postpartum period. We hope our suggested indicators for essential routine care processes will contribute to streamlining

  6. The quality of clinical maternal and neonatal healthcare - a strategy for identifying 'routine care signal functions'.

    Science.gov (United States)

    Brenner, Stephan; De Allegri, Manuela; Gabrysch, Sabine; Chinkhumba, Jobiba; Sarker, Malabika; Muula, Adamson S

    2015-01-01

    A variety of clinical process indicators exists to measure the quality of care provided by maternal and neonatal health (MNH) programs. To allow comparison across MNH programs in low- and middle-income countries (LMICs), a core set of essential process indicators is needed. Although such a core set is available for emergency obstetric care (EmOC), the 'EmOC signal functions', a similar approach is currently missing for MNH routine care evaluation. We describe a strategy for identifying core process indicators for routine care and illustrate their usefulness in a field example. We first developed an indicator selection strategy by combining epidemiological and programmatic aspects relevant to MNH in LMICs. We then identified routine care process indicators meeting our selection criteria by reviewing existing quality of care assessment protocols. We grouped these indicators into three categories based on their main function in addressing risk factors of maternal or neonatal complications. We then tested this indicator set in a study assessing MNH quality of clinical care in 33 health facilities in Malawi. Our strategy identified 51 routine care processes: 23 related to initial patient risk assessment, 17 to risk monitoring, 11 to risk prevention. During the clinical performance assessment a total of 82 cases were observed. Birth attendants' adherence to clinical standards was lowest in relation to risk monitoring processes. In relation to major complications, routine care processes addressing fetal and newborn distress were performed relatively consistently, but there were major gaps in the performance of routine care processes addressing bleeding, infection, and pre-eclampsia risks. The identified set of process indicators could identify major gaps in the quality of obstetric and neonatal care provided during the intra- and immediate postpartum period. We hope our suggested indicators for essential routine care processes will contribute to streamlining MNH program

  7. Active Movement Warm-Up Routines

    Science.gov (United States)

    Walter, Teri; Quint, Ashleigh; Fischer, Kim; Kiger, Joy

    2011-01-01

    This article presents warm-ups that are designed to physiologically and psychologically prepare students for vigorous physical activity. An active movement warm-up routine is made up of three parts: (1) active warm-up movement exercises, (2) general preparation, and (3) the energy system. These warm-up routines can be used with all grade levels…

  8. Fit Between Organization Design and Organizational Routines

    Directory of Open Access Journals (Sweden)

    Constance E. Helfat

    2014-07-01

    Full Text Available Despite decades of research on both organization design and organizational routines, little research has analyzed the relationship between them. Here we propose a normative theory in which the effectiveness of organization design and redesign depends on the characteristics of routines. The analysis shows which types of organization designs may be useful as well as which design changes may or may not succeed, depending on (a) the specificity of routines and (b) the dynamic versus static purposes of organizational routines.

  9. Performance Assessment of Dynaspeak Speech Recognition System on Inflight Databases

    National Research Council Canada - National Science Library

    Barry, Timothy

    2004-01-01

    .... To aid in the assessment of various commercially available speech recognition systems, several aircraft speech databases have been developed at the Air Force Research Laboratory's Human Effectiveness Directorate...

  10. Improving care coordination using organisational routines.

    Science.gov (United States)

    Prætorius, Thim

    2016-01-01

    The purpose of this paper is to systematically apply theory of organisational routines to standardised care pathways. The explanatory power of routines is used to address open questions in the care pathway literature about their coordinating and organising role, the way they change and can be replicated, the way they are influenced by the organisation and the way they influence health care professionals. Theory of routines is systematically applied to care pathways in order to develop theoretically derived propositions. Care pathways mirror routines by being recurrent, collective and embedded and specific to an organisation. In particular, care pathways resemble standard operating procedures that can give rise to recurrent collective action patterns. In all, 11 propositions related to five categories are proposed by building on these insights: care pathways and coordination, change, replication, the organisation and health care professionals. Research limitations/implications - The paper is conceptual and uses care pathways as illustrative instances of hospital routines. The propositions provide a starting point for empirical research. The analysis highlights implications that health care professionals and managers have to consider in relation to coordination, change, replication, the way the organisation influences care pathways and the way care pathways influence health care professionals. Originality/value - Theory on organisational routines offers fundamental, yet unexplored, insights into hospital processes, including in particular care coordination.

  11. Database Description - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Database name: Yeast Interacting Proteins Database. DOI: 10.18908/lsdba.nbdc00742-000. Contact: …-ken 277-8561; Tel: +81-4-7136-3989; FAX: +81-4-7136-3979. Organism: Saccharomyces cerevisiae (Taxonomy ID: 4932). Database description: information on interactions and related information obtained in Saccharomyces cerevisiae. Reference: Proc Natl Acad Sci U S A. 2001 Apr 10;98(8):4569-74. Epub 2001 Mar 13.

  12. The prevalence of adrenal incidentaloma in routine clinical practice.

    LENUS (Irish Health Repository)

    Davenport, Colin

    2012-02-01

    The prevalence of adrenal incidentaloma (AI) on computed tomography (CT) in the general population has been reported to be as high as 4.2%. However, many of the previous studies in this field utilised a prospective approach with analysis of CT scans performed by one or more radiologists with a specialist interest in adrenal tumours and a specific focus on identifying the presence of an adrenal mass. A typical radiology department, with a focus on the patient's presenting complaint as opposed to the adrenal gland, may not be expected to diagnose as many adrenal incidentalomas as would be identified in a dedicated research protocol. We hypothesised that the number of AI reported in routine clinical practice is significantly lower than the published figures would suggest. We retrospectively reviewed the reports of all CT thorax and abdomen scans performed in our hospital over a 2 year period. 3,099 patients underwent imaging, with 3,705 scans performed. The median age was 63 years (range 18-98). Thirty-seven true AI were diagnosed during the time period studied. Twenty-two were diagnosed by CT abdomen (22/2,227) and 12 by CT thorax (12/1,478), a prevalence of 0.98 and 0.81% with CT abdomen and thorax, respectively, for AI in routine clinical practice.

  13. 42 CFR 493.1210 - Condition: Routine chemistry.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Routine chemistry. 493.1210 Section 493....1210 Condition: Routine chemistry. If the laboratory provides services in the subspecialty of Routine chemistry, the laboratory must meet the requirements specified in §§ 493.1230 through 493.1256, § 493.1267...

  14. Update History of This Database - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Update history of this database: 2014/05/07, contact information corrected and the features and manner of utilization of the database corrected; 2014/02/04, Trypanosomes Database English archive site opened; 2011/04/04, Trypanosomes Database (http://www.tanpaku.org/tdb/) opened.

  15. Database computing in HEP

    International Nuclear Information System (INIS)

    Day, C.T.; Loken, S.; MacFarlane, J.F.; May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Baden, A.

    1992-01-01

    The major SSC experiments are expected to produce up to 1 Petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes based on relational and object-oriented databases of CDF data samples
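
    To make the I/O argument concrete, the following minimal sketch (in Python with SQLite, merely a stand-in for the relational systems used in the CDF prototypes) shows how a query over indexed, structured event attributes touches far less data than re-reading full event records; the table and column names are invented for illustration.

        import sqlite3

        # Toy event store: each row stands in for a fully reconstructed event.
        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE events (
            event_id    INTEGER PRIMARY KEY,
            n_jets      INTEGER,
            missing_et  REAL,
            raw_size_kb REAL)""")
        con.executemany(
            "INSERT INTO events VALUES (?, ?, ?, ?)",
            [(i, i % 5, 10.0 * (i % 7), 250.0) for i in range(10000)],
        )
        # An index on the selection variable lets the cut avoid a full scan.
        con.execute("CREATE INDEX idx_met ON events (missing_et)")

        # The analysis reads two columns of a small subset, not whole events.
        rows = con.execute(
            "SELECT event_id, n_jets FROM events WHERE missing_et > 50"
        ).fetchall()
        print(len(rows), "events selected")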

  16. ZAGRADA - A New Radiocarbon Database

    International Nuclear Information System (INIS)

    Portner, A.; Obelic, B.; Krajcar Bornic, I.

    2008-01-01

    In the Radiocarbon and Tritium Laboratory at the Rudjer Boskovic Institute three different techniques for 14C dating have been used: Gas Proportional Counting (GPC), Liquid Scintillation Counting (LSC) and preparation of milligram-sized samples for AMS dating (Accelerator Mass Spectrometry). The use of several measurement techniques created the need for a new relational database, ZAGRADA (Zagreb Radiocarbon Database), since the existing software package CARBO could not satisfy the requirements of working with several techniques in parallel. Using SQL procedures and constraints defined by primary and foreign keys, ZAGRADA enforces high data integrity and provides better performance in data filtering and sorting. Additionally, the new database for 14C samples is a multi-user application that can be accessed from remote computers in the work group, providing better efficiency of laboratory activities. To facilitate data handling and processing in ZAGRADA, the graphical user interface is designed to be user-friendly and to perform various actions on data such as input, correction, searching, sorting and output to a printer. Invalid actions performed in the user interface are registered with a short textual description of the error, displayed on screen in message boxes. Unauthorized access is prevented by login control, and each application window implements support for tracking the last changes made by the user. The implementation of a new database for 14C samples contributes significantly to the scientific research performed in the Radiocarbon and Tritium Laboratory and will provide better and easier communication with customers.(author)
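
    As a rough illustration of the integrity mechanism described above, the following Python/SQLite sketch shows primary and foreign key constraints rejecting an invalid record; the tables are hypothetical stand-ins, not the actual ZAGRADA schema.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
        con.execute("CREATE TABLE technique (code TEXT PRIMARY KEY)")
        con.execute("""CREATE TABLE sample (
            lab_number TEXT PRIMARY KEY,               -- duplicate lab numbers rejected
            technique  TEXT NOT NULL REFERENCES technique(code))""")

        con.executemany("INSERT INTO technique VALUES (?)",
                        [("GPC",), ("LSC",), ("AMS",)])
        con.execute("INSERT INTO sample VALUES ('Z-4711', 'LSC')")  # valid row

        try:
            # Rejected: 'XRF' is not one of the registered measurement techniques.
            con.execute("INSERT INTO sample VALUES ('Z-4712', 'XRF')")
        except sqlite3.IntegrityError as err:
            print("rejected:", err)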

  17. Report on the database structuring project in fiscal 1996 related to the 'surveys on making databases for energy saving (2)'; 1996 nendo database kochiku jigyo hokokusho. Sho energy database system ka ni kansuru chosa 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    With the objective of supporting the promotion of energy conservation in such countries as Japan, China, Indonesia, the Philippines, Thailand, Malaysia, Taiwan and Korea, primary information on energy conservation in each country was collected and a database was structured. This paper summarizes the achievements in fiscal 1996. Building on the results of the database project to date and on the various data collected, work in this fiscal year discussed structuring the database for its distribution and dissemination. In the discussion, the functions to be possessed by the database, the items of data to be recorded in it, and the processing of the recorded data were put in order with reference to propositions on the database circumstances. Demonstrations of the proliferation version of the database were performed in the Philippines, Indonesia and China. Three hundred CDs were prepared for distribution in each country. Adjustment and confirmation of operation of the supplied computers were carried out, and operation briefing meetings were held in China and the Philippines. (NEDO)

  18. Generic environmental statement on the routine use of plutonium-powered cardiac pacemakers

    International Nuclear Information System (INIS)

    Shoup, R.L.; Robinson, T.W.; O'Donnell, F.R.

    1976-01-01

    The purpose of a continuing program at ORNL is to provide technical assistance to the NRC on writing and editing of the final environmental statement on the routine use of nuclear-powered (primarily 238Pu) cardiac pacemakers. This environmental statement defines the safety and reliability standards that nuclear-powered pacemakers are required to meet. All aspects of the risks to the patients, the public, and the environment are evaluated both for the routine use of plutonium-powered pacemakers and for postulated accidents involving pacemaker patients. Benefits derived from the use of plutonium-powered units are discussed and weighed against the risks in order to determine whether routine use is justified. Available alternative pacemakers with various performance characteristics are compared with respect to costs and to the needs of pacemaker patients.

  19. Large Science Databases – Are Cloud Services Ready for Them?

    Directory of Open Access Journals (Sweden)

    Ani Thakar

    2011-01-01

    Full Text Available We report on attempts to put an astronomical database – the Sloan Digital Sky Survey science archive – in the cloud. We find that it is very frustrating to impossible at this time to migrate a complex SQL Server database into current cloud service offerings such as Amazon (EC2) and Microsoft (SQL Azure). Certainly it is impossible to migrate a large database in excess of a TB, but even with (much) smaller databases, the limitations of cloud services make it very difficult to migrate the data to the cloud without making changes to the schema and settings that would degrade performance and/or make the data unusable. Preliminary performance comparisons show a large performance discrepancy with the Amazon cloud version of the SDSS database. These difficulties suggest that much work and coordination needs to occur between cloud service providers and their potential clients before science databases – not just large ones but even smaller databases that make extensive use of advanced database features for performance and usability – can successfully and effectively be deployed in the cloud. We describe a powerful new computational instrument that we are developing in the interim – the Data-Scope – that will enable fast and efficient analysis of the largest (petabyte scale) scientific datasets.

  20. PRISM: Processing routines in IDL for spectroscopic measurements (installation manual and user's guide, version 1.0)

    Science.gov (United States)

    Kokaly, Raymond F.

    2011-01-01

    This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.

  1. Routines Are the Foundation of Classroom Management

    Science.gov (United States)

    Lester, Robin Rawlings; Allanson, Patricia Bolton; Notar, Charles E.

    2017-01-01

    Classroom management is the key to learning. Routines are the foundation of classroom management. Students require structure in their lives. Routines provide that in all of their life from the time they awake until the time they go to bed. Routines in a school and in the classroom provide the environment for learning to take place. The paper is…

  2. Evaluated and estimated solubility of some elements for performance assessment of geological disposal of high-level radioactive waste using updated version of thermodynamic database

    International Nuclear Information System (INIS)

    Kitamura, Akira; Doi, Reisuke; Yoshida, Yasushi

    2011-01-01

    Japan Atomic Energy Agency (JAEA) established the thermodynamic database (JAEA-TDB) for performance assessment of geological disposal of high-level radioactive waste (HLW) and TRU waste. Twenty-five elements which were important for the performance assessment of geological disposal were selected for the database. JAEA-TDB enhances the reliability of solubility evaluation and estimation by selecting the latest and most reliable thermodynamic data currently available. We evaluated and estimated the solubility of the 25 elements in the simulated porewaters established in the 'Second Progress Report for Safety Assessment of Geological Disposal of HLW in Japan' using the JAEA-TDB and compared the results with those using the previous thermodynamic database (JNC-TDB). It was found that most of the evaluated and estimated solubility values were not changed drastically, but the solubility and speciation of dominant aqueous species for some elements using the JAEA-TDB were different from those using the JNC-TDB. We discussed how to provide reliable solubility values for the performance assessment. (author)

  3. The Danish Testicular Cancer database.

    Science.gov (United States)

    Daugaard, Gedske; Kier, Maria Gry Gundgaard; Bandak, Mikkel; Mortensen, Mette Saksø; Larsson, Heidi; Søgaard, Mette; Toft, Birgitte Groenkaer; Engvad, Birte; Agerbæk, Mads; Holm, Niels Vilstrup; Lauritsen, Jakob

    2016-01-01

    The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data collection has been performed from 1984 to 2007 and from 2013 onward, respectively. The retrospective DaTeCa database contains detailed information with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function, lung function, etc. A questionnaire related to late effects has been developed, which includes questions regarding social relationships, life situation, general health status, family background, diseases, symptoms, use of medication, marital status, psychosocial issues, fertility, and sexuality. TC survivors alive in October 2014 were invited to fill in this questionnaire, which includes 160 validated questions. Collection of questionnaires is still ongoing. A biobank including blood/sputum samples for future genetic analyses has been established. Both samples related to the DaTeCa and DMCG DaTeCa databases are included. The prospective DMCG DaTeCa database includes variables regarding histology, stage, prognostic group, and treatment. The DMCG DaTeCa database has existed since 2013 and is a young clinical database. It is necessary to extend the data collection in the prospective database in order to answer quality-related questions. Data from the retrospective database will be added to the prospective data. This will result in a large and very comprehensive database for future studies on TC patients.

  4. Evolution of the Configuration Database Design

    International Nuclear Information System (INIS)

    Salnikov, A.

    2006-01-01

    The BABAR experiment at SLAC successfully collects physics data since 1999. One of the major parts of its on-line system is the configuration database which provides other parts of the system with the configuration data necessary for data taking. Originally the configuration database was implemented in the Objectivity/DB ODBMS. Recently BABAR performed a successful migration of its event store from Objectivity/DB to ROOT and this prompted a complete phase-out of the Objectivity/DB in all other BABAR databases. It required the complete redesign of the configuration database to hide any implementation details and to support multiple storage technologies. In this paper we describe the process of the migration of the configuration database, its new design, implementation strategy and details

  5. Routine sputum culture

    Science.gov (United States)

    Sputum culture: a sample of sputum is sent to a laboratory. There, it is placed in a special dish (culture). It is then watched to see if bacteria grow.

  6. Data collection for improved follow-up of operating experiences. SKI damage database. Contents and aims with database

    International Nuclear Information System (INIS)

    Gott, Karen

    1997-01-01

    The Stryk database is presented and discussed in conjunction with the Swedish regulations concerning structural components in nuclear installations. The database acts as a reference library for reported cracks and degradation and can be used to retrieve information about individual events or for compiling statistics and performing trend analyses

  7. Large scale access tests and online interfaces to ATLAS conditions databases

    International Nuclear Information System (INIS)

    Amorim, A; Lopes, L; Pereira, P; Simoes, J; Soloviev, I; Burckhart, D; Schmitt, J V D; Caprini, M; Kolos, S

    2008-01-01

    The access of the ATLAS Trigger and Data Acquisition (TDAQ) system to the ATLAS Conditions Databases sets strong reliability and performance requirements on the database storage and access infrastructures. Several applications were developed to support the integration of Conditions database access with the online services in TDAQ, including the interface to the Information Services (IS) and to the TDAQ Configuration Databases. The information storage requirements were the motivation for the ONline ASynchronous Interface to COOL (ONASIC) from the Information Service (IS) to LCG/COOL databases. ONASIC avoids possible backpressure from the online database servers by managing a local cache. In parallel, OKS2COOL was developed to store Configuration Databases into an Offline Database with a history record. The DBStressor application was developed to test and stress the access to the Conditions database using the LCG/COOL interface while operating in an integrated way as a TDAQ application. The performance scaling of simultaneous Conditions database read accesses was studied in the context of the ATLAS High Level Trigger large computing farms. A large set of tests was performed involving up to 1000 computing nodes that simultaneously accessed the LCG central database server infrastructure at CERN.
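
    The backpressure-avoidance idea attributed to ONASIC above can be pictured as a read-through cache: repeated reads from many online clients are absorbed locally, so only cache misses reach the central database server. The sketch below is a generic Python stand-in, not the actual ONASIC implementation; the folder name and interval-of-validity key are invented.

        class CachedConditionsReader:
            def __init__(self, fetch_from_db):
                self._fetch = fetch_from_db   # expensive round trip to the DB server
                self._cache = {}

            def get(self, folder, iov):
                key = (folder, iov)
                if key not in self._cache:    # only a miss touches the server
                    self._cache[key] = self._fetch(folder, iov)
                return self._cache[key]

        calls = []
        def fake_db(folder, iov):
            calls.append((folder, iov))
            return {"payload": f"{folder}@{iov}"}

        reader = CachedConditionsReader(fake_db)
        for _ in range(1000):                 # e.g. 1000 reads from trigger nodes
            reader.get("/Calo/Pedestals", 42)
        print(len(calls), "database round trip(s)")  # prints: 1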

  8. Digital Dental X-ray Database for Caries Screening

    Science.gov (United States)

    Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila

    2016-06-01

    A standard database is an essential requirement for comparing the performance of image analysis techniques. A main obstacle in dental image analysis has therefore been the lack of an available image database, which is provided in this paper. Periapical dental X-ray images, suitable for analysis and approved by many dental experts, were collected. This type of dental radiograph is common and inexpensive, and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 periapical X-ray images covering the upper and lower jaws. This digital dental database is constructed to provide a source for researchers to use and compare image analysis techniques and to improve the performance of each technique.

  9. Update History of This Database - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Arabidopsis Phenome Database Update History of This Database Date Update contents 2017/02/27 Arabidopsis Phenome Data...base English archive site is opened. - Arabidopsis Phenome Database (http://jphenom...e.info/?page_id=95) is opened. About This Database Database Description Download License Update History of This Database... Site Policy | Contact Us Update History of This Database - Arabidopsis Phenome Database | LSDB Archive ...

  10. FASTPLOT, Interface Routines to MS FORTRAN Graphics Library

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: FASTPLOT is a library of routines that can be used to interface with the Microsoft FORTRAN Graphics library (GRAPHICS.LIB). The FASTPLOT routines simplify the development of graphics applications and add capabilities such as histograms, Splines, symbols, and error bars. FASTPLOT also includes routines that can be used to create menus. 2 - Methods: FASTPLOT is a library of routines which must be linked with a user's FORTRAN programs that call any FASTPLOT routines. In addition, the user must link with the Microsoft FORTRAN Graphics library (GRAPHICS.LIB). 3 - Restrictions on the complexity of the problem: None noted

  11. Description of geological data in SKBs database GEOTAB

    International Nuclear Information System (INIS)

    Sehlstedt, S.; Stark, T.

    1991-01-01

    Since 1977 the Swedish Nuclear Fuel and Waste Management Co, SKB, has been performing a research and development programme for the final disposal of spent nuclear fuel. The purpose of the programme is to acquire knowledge and data concerning radioactive waste. Measurements for the characterisation of geological, geophysical, hydrogeological and hydrochemical conditions are performed in specific site investigations as well as for geoscientific projects. Large data volumes have been produced since the start of the programme, both raw data and results. Over the years these data were stored in various formats by the different institutions and companies that performed the investigations. It was therefore decided that all data from the research and development programme should be gathered in a database. The database, called GEOTAB, is a relational database. It comprises six main groups of data volumes: background information, geological data, geophysical data, hydrological and meteorological data, hydrochemical data, and tracer tests. This report deals with geological data and describes the dataflow from the measurements at the sites to the result tables in the database. The geological investigations have been divided into three categories, each stored separately in the database: surface fractures, core mapping, and chemical analyses. (authors)

  12. Daily medication routine of adolescents with HIV/AIDS

    Directory of Open Access Journals (Sweden)

    Cristiane Cardoso de Paula

    2013-12-01

    Full Text Available The objective of this study was to describe the sociodemographic, clinical, and behavioral characteristics of the daily medication routine of adolescents with HIV/AIDS aged 13 to 19 years, followed at a reference service. This descriptive cross-sectional study was performed with 23 adolescents, using a quantitative approach. Data were collected using a form during appointments at the outpatient clinic. Univariate analysis revealed a predominance of females, adolescents in the initial phase of adolescence, and vertical transmission. Notable findings were: lack of assiduity to appointments; unprotected sex; and consumption of alcohol. Regarding the daily medication routine, subjects depend on their parents or guardians, use strategies to remember to take the medications, and are unaware of the laboratory tests used for disease management and treatment. There is a need for educative intervention using information and communication technology, such as the Internet, to promote health and autonomy among adolescents. Descriptors: Acquired Immunodeficiency Syndrome; Adolescent Health; Antiretroviral Therapy, Highly Active; Nursing.

  13. External Agents' Effect on Routine Dynamics

    DEFF Research Database (Denmark)

    Busse Hansen, Nicolai

    Prior investigations on organizational routines have called for research to enlighten our understanding of how social actors establish and maintain routines as well as the causes of their disruption. The present paper contributes to this call by conducting systematic microethnographic … and affiliation are central to how routines are maintained but also susceptible to disruption in case of mismanagement. The paper also contributes a more fine-tuned understanding of actions as being organized in accordance with preference, which basically means that some actions are preferred over others. In producing an action, the relevant next action is projected. However, the relevant next action is projected in a specific way, and if this is not taken into account then the routine becomes disrupted. Another core aspect is the notion of deontics, which lends itself to describing who…

  14. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  15. Comparing childhood meal frequency to current meal frequency, routines, and expectations among parents.

    Science.gov (United States)

    Friend, Sarah; Fulkerson, Jayne A; Neumark-Sztainer, Dianne; Garwick, Ann; Flattum, Colleen Freeh; Draxten, Michelle

    2015-02-01

    Little is known about the continuation of family meals from childhood to parenthood. This study aims to examine associations between parents' report of eating family meals while growing up and their current family meal frequency, routines, and expectations. Baseline data were used from the Healthy Home Offerings via the Mealtime Environment (HOME) Plus study, a randomized controlled trial with a program to promote healthful behaviors and family meals at home. Participants (160 parent/child dyads) completed data collection in 2011-2012 in the Minneapolis/St. Paul, MN metropolitan area. Parents were predominately female (95%) and white (77%) with a mean age of 41.3 years. General linear modeling examined relationships between parents' report of how often they ate family meals while growing up and their current family meal frequency, routines, and expectations as parents, controlling for parent age, education level, and race. Parental report of eating frequent family meals while growing up was positively and significantly associated with age, education, and self-identification as white. Compared with parents who ate family meals less than three times/week or four to five times/week while growing up, parents who ate six to seven family meals/week while growing up reported significantly more frequent family meals with their current family (4.0 and 4.2 vs. 5.3 family meals/week, p = .001). Eating frequent family meals while growing up was also significantly and positively associated with having current regular meal routines and meal expectations about family members eating together. Eating frequent family meals with children may thus have long-term benefits over generations. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  16. Update History of This Database - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SKIP Stemcell Database Update History of This Database Date Update contents 2017/03/13 SKIP Stemcell Database... English archive site is opened. 2013/03/29 SKIP Stemcell Database ( https://www.skip.med.k...eio.ac.jp/SKIPSearch/top?lang=en ) is opened. About This Database Database Description Download License Update History of This Databa...se Site Policy | Contact Us Update History of This Database - SKIP Stemcell Database | LSDB Archive ...

  17. Computer-based Creativity Enhanced Conceptual Design Model for Non-routine Design of Mechanical Systems

    Institute of Scientific and Technical Information of China (English)

    LI Yutong; WANG Yuxin; DUFFY Alex H B

    2014-01-01

    Computer-based conceptual design for routine design has made great strides, yet non-routine design has not been given due attention, and it is still poorly automated. Considering that the function-behavior-structure (FBS) model is widely used for modeling the conceptual design process, a computer-based creativity enhanced conceptual design model (CECD) for non-routine design of mechanical systems is presented. In the model, the leaf functions in the FBS model are decomposed into and represented with fine-grain basic operation actions (BOA), and the corresponding BOA set in the function domain is then constructed. Choosing building blocks from the database, and expressing their multiple functions with BOAs, the BOA set in the structure domain is formed. Through rule-based dynamic partition of the BOA set in the function domain, many variants of regenerated functional schemes are generated. For enhancing the capability to introduce new design variables into the conceptual design process, and to dig out more innovative physical structure schemes, the indirect function-structure matching strategy based on reconstructing the combined structure schemes is adopted. By adjusting the tightness of the partition rules and the granularity of the divided BOA subsets, and making full use of the main function and secondary functions of each basic structure in the process of reconstructing the physical structures, new design variables and variants are introduced into the physical structure scheme reconstructing process, and a great number of simpler physical structure schemes to accomplish the overall function organically are figured out. The creativity enhanced conceptual design model presented has a dominant capability in introducing new design variables in the function domain and digging out simpler physical structures to accomplish the overall function, therefore it can be utilized to solve non-routine conceptual design problems.

  18. Searching the Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) database improves systematic reviews.

    Science.gov (United States)

    Clark, Otavio Augusto Camara; Castro, Aldemar Araujo

    2002-02-01

    An unbiased systematic review (SR) should analyse as many articles as possible in order to provide the best evidence available. However, many SR use only databases with high English-language content as sources for articles. Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) indexes 670 journals from the Latin American and Caribbean health literature but is seldom used in these SR. Our objective is to evaluate whether LILACS should be used as a routine source of articles for SR. First we identified SR published in 1997 in five medical journals with a high impact factor. Then we searched LILACS for articles that could match the inclusion criteria of these SR. We also checked if the authors had already identified these articles located in LILACS. In all, 64 SR were identified. Two had already searched LILACS and were excluded. In 39 of 62 (63%) SR a LILACS search identified articles that matched the inclusion criteria. In 5 (8%) our search was inconclusive and in 18 (29%) no articles were found in LILACS. Therefore, in 71% (44/62) of cases, a LILACS search could have been useful to the authors. This proportion remains the same if we consider only the 37 SR that performed a meta-analysis. In only one case had the article identified in LILACS already been located elsewhere by the authors' strategy. LILACS is an under-explored and unique source of articles whose use can improve the quality of systematic reviews. This database should be used as a routine source to identify studies for systematic reviews.

  19. The STEP database through the end-users eyes--USABILITY STUDY.

    Science.gov (United States)

    Salunke, Smita; Tuleu, Catherine

    2015-08-15

    The user-designed database of Safety and Toxicity of Excipients for Paediatrics ("STEP") was created to address the shared need of the drug development community to access relevant information on excipients effortlessly. Usability testing was performed to validate whether the database satisfies the needs of end-users. An evaluation framework was developed to assess usability. Participants performed scenario-based tasks and provided feedback and post-session usability ratings. Failure Mode Effect Analysis (FMEA) was performed to prioritize the problems and improvements to the STEP database design and functionalities. The study revealed several design vulnerabilities. Tasks such as limiting the results, running complex queries, locating data, and registering to access the database were challenging. The three critical attributes identified as having an impact on the usability of the STEP database were (1) content and presentation, (2) navigation and search features, and (3) potential end-users. The evaluation framework proved to be an effective method for evaluating database effectiveness and user satisfaction. This study provides strong initial support for the usability of the STEP database. Recommendations will be incorporated into the refinement of the database to improve its usability and increase user participation toward the advancement of the database. Copyright © 2015 Elsevier B.V. All rights reserved.
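
    For readers unfamiliar with FMEA, the following small Python sketch shows how it can rank usability problems by a risk priority number (RPN), the product of severity, occurrence, and detectability scores. The problem list echoes the tasks mentioned above, but all scores are invented for illustration.

        problems = [
            # (problem, severity, occurrence, detectability), each scored 1-10
            ("limiting search results", 8, 6, 4),
            ("running complex queries", 7, 5, 5),
            ("locating data",           6, 7, 3),
            ("registering for access",  5, 3, 2),
        ]

        ranked = sorted(problems, key=lambda p: p[1] * p[2] * p[3], reverse=True)
        for name, sev, occ, det in ranked:
            print(f"RPN {sev * occ * det:4d}  {name}")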

  1. Fitting model-based psychometric functions to simultaneity and temporal-order judgment data: MATLAB and R routines.

    Science.gov (United States)

    Alcalá-Quintana, Rocío; García-Pérez, Miguel A

    2013-12-01

    Research on temporal-order perception uses temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks in their binary SJ2 or ternary SJ3 variants. In all cases, two stimuli are presented with some temporal delay, and observers judge the order of presentation. Arbitrary psychometric functions are typically fitted to obtain performance measures such as sensitivity or the point of subjective simultaneity, but the parameters of these functions are uninterpretable. We describe routines in MATLAB and R that fit model-based functions whose parameters are interpretable in terms of the processes underlying temporal-order and simultaneity judgments and responses. These functions arise from an independent-channels model assuming arrival latencies with exponential distributions and a trichotomous decision space. Different routines fit data separately for SJ2, SJ3, and TOJ tasks, jointly for any two tasks, or also jointly for the three tasks (for common cases in which two or even the three tasks were used with the same stimuli and participants). Additional routines provide bootstrap p-values and confidence intervals for estimated parameters. A further routine is included that obtains performance measures from the fitted functions. An R package for Windows and source code of the MATLAB and R routines are available as Supplementary Files.
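
    As a generic illustration of the fitting machinery only (the routines described above fit model-based functions derived from exponential arrival latencies, not the arbitrary cumulative Gaussian used here), a maximum-likelihood fit of a simple psychometric function to binary judgment data might look as follows; all data are invented.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        delays = np.array([-200, -100, -50, 0, 50, 100, 200])  # stimulus onset asynchrony, ms
        n_trials = np.full(delays.shape, 40)
        n_first = np.array([3, 8, 14, 21, 28, 35, 38])         # "stimulus A first" responses

        def neg_log_lik(params):
            pss, sigma = params              # point of subjective simultaneity, slope
            p = norm.cdf(delays, loc=pss, scale=sigma)
            p = np.clip(p, 1e-9, 1 - 1e-9)   # keep the logarithms finite
            return -np.sum(n_first * np.log(p)
                           + (n_trials - n_first) * np.log(1 - p))

        fit = minimize(neg_log_lik, x0=[0.0, 80.0], method="Nelder-Mead")
        print("PSS = %.1f ms, sigma = %.1f ms" % tuple(fit.x))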

  2. The prevalence of adrenal incidentaloma in routine clinical practice.

    LENUS (Irish Health Repository)

    Davenport, Colin

    2011-03-10

    The prevalence of adrenal incidentaloma (AI) on computed tomography (CT) in the general population has been reported to be as high as 4.2%. However, many of the previous studies in this field utilised a prospective approach with analysis of CT scans performed by one or more radiologists with a specialist interest in adrenal tumours and a specific focus on identifying the presence of an adrenal mass. A typical radiology department, with a focus on the patient's presenting complaint as opposed to the adrenal gland, may not be expected to diagnose as many adrenal incidentalomas as would be identified in a dedicated research protocol. We hypothesised that the number of AI reported in routine clinical practice is significantly lower than the published figures would suggest. We retrospectively reviewed the reports of all CT thorax and abdomen scans performed in our hospital over a 2 year period. 3,099 patients underwent imaging, with 3,705 scans performed. The median age was 63 years (range 18-98). Thirty-seven true AI were diagnosed during the time period studied. Twenty-two were diagnosed by CT abdomen (22/2,227) and 12 by CT thorax (12/1,478), a prevalence of 0.98 and 0.81% with CT abdomen and thorax, respectively, for AI in routine clinical practice.

  3. Routine Radiological Environmental Monitoring Plan

    International Nuclear Information System (INIS)

    Bechtel Nevada

    1998-01-01

    The U.S. Department of Energy manages the Nevada Test Site in a manner that meets evolving DOE missions and responds to the concerns of affected and interested individuals and agencies. This Routine Radiological Monitoring Plan addresses compliance with DOE Orders 5400.1 and 5400.5 and other drivers requiring routine effluent monitoring and environmental surveillance on the Nevada Test Site. This monitoring plan, prepared in 1998, addresses the activities conducted on the NTS under the Final Environmental Impact Statement and Record of Decision. This radiological monitoring plan, prepared on behalf of the Nevada Test Site landlord, brings together sitewide environmental surveillance, site-specific effluent monitoring, and operational monitoring conducted by various missions, programs, and projects on the NTS. The plan provides an approach to identifying and conducting routine radiological monitoring at the NTS, based on integrated technical, scientific, and regulatory compliance data needs.

  4. Neutron metrology file NMF-90. An integrated database for performing neutron spectrum adjustment calculations

    International Nuclear Information System (INIS)

    Kocherov, N.P.

    1996-01-01

    The Neutron Metrology File NMF-90 is an integrated database for performing neutron spectrum adjustment (unfolding) calculations. It contains 4 different adjustment codes, the dosimetry reaction cross-section library IRDF-90/NMF-G with covariance files, 6 input data sets for reactor benchmark neutron fields and a number of utility codes for processing and plotting the input and output data. The package consists of 9 PC HD diskettes and manuals for the codes. It is distributed by the Nuclear Data Section of the IAEA on request free of charge. About 10 MB of disk space is needed to install and run a typical reactor neutron dosimetry unfolding problem. (author). 8 refs.
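
    The adjustment (unfolding) problem such codes solve can be sketched as a regularized least-squares fit: modify a prior group flux so that computed reaction rates match the measured ones. The toy Python example below merely stands in for the actual NMF-90 codes, which also propagate cross-section covariances; all numbers are invented.

        import numpy as np

        sigma = np.array([[1.2, 0.4, 0.1],   # cross sections: reactions x energy groups
                          [0.2, 0.9, 0.6]])
        phi0 = np.array([1.0, 1.0, 1.0])     # prior group fluxes
        rates_meas = np.array([1.9, 1.5])    # measured reaction rates

        # Minimize |sigma @ phi - rates|^2 plus a penalty pulling phi
        # toward the prior spectrum (a crude regularization).
        lam = 0.5
        A = np.vstack([sigma, np.sqrt(lam) * np.eye(3)])
        b = np.concatenate([rates_meas, np.sqrt(lam) * phi0])
        phi_adj, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("adjusted group fluxes:", phi_adj)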

  5. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the database system Oracle Database 10g Express Edition, and a demonstration of administration tasks in this database system. The database design was verified by developing an access application.

  6. The Barcelona Hospital Clínic therapeutic apheresis database.

    Science.gov (United States)

    Cid, Joan; Carbassé, Gloria; Cid-Caballero, Marc; López-Púa, Yolanda; Alba, Cristina; Perea, Dolores; Lozano, Miguel

    2017-09-22

    A therapeutic apheresis (TA) database helps to increase knowledge about the indications and types of apheresis procedures that are performed in clinical practice. The objective of the present report was to describe the type and number of TA procedures that were performed at our institution in a 10-year period, from 2007 to 2016. The TA electronic database was created by transferring patient data from electronic medical records and consultation forms into a Microsoft Access database developed exclusively for this purpose. Since 2007, prospective data from every TA procedure were entered in the database. A total of 5940 TA procedures were performed: 3762 (63.3%) plasma exchange (PE) procedures, 1096 (18.5%) hematopoietic progenitor cell (HPC) collections, and 1082 (18.2%) TA procedures other than PEs and HPC collections. The overall trend for the time period was a progressive increase in the total number of TA procedures performed each year (from 483 TA procedures in 2007 to 822 in 2016). The tracking trend of each procedure during the 10-year period was different: the number of PE and other types of TA procedures increased 22% and 2818%, respectively, and the number of HPC collections decreased 28%. The TA database helped us to increase our knowledge about the various indications and types of TA procedures performed in our current practice. We also believe that this database could serve as a model that other institutions can use to track service metrics. © 2017 Wiley Periodicals, Inc.

  7. IAEA Post Irradiation Examination Facilities Database

    International Nuclear Information System (INIS)

    Jenssen, Haakon; Blanc, J.Y.; Dobuisson, P.; Manzel, R.; Egorov, A.A.; Golovanov, V.; Souslov, D.

    2005-01-01

    The number of hot cells in the world in which post irradiation examination (PIE) can be performed has diminished during the last few decades. This creates problems for countries that have nuclear power plants and require PIE for surveillance, safety and fuel development. With this in mind, the IAEA initiated the issue of a catalogue within the framework of a coordinated research program (CRP), started in 1992 and completed in 1995, under the title of "Examination and Documentation Methodology for Water Reactor Fuel (ED-WARF-II)". Within this program, a group of technical consultants prepared a questionnaire to be completed by relevant laboratories. From these questionnaires a catalogue was assembled. The catalogue lists the laboratories and PIE possibilities worldwide in order to make it more convenient to arrange and perform contractual PIE within hot cells on water reactor fuels and core components, e.g. structural and absorber materials. This catalogue was published as working material in the Agency in 1996. During 2002 and 2003, the catalogue was converted to a database and updated through questionnaires to the laboratories in the Member States of the Agency. This activity was recommended by the IAEA Technical Working Group on Water Reactor Fuel Performance and Technology (TWGFPT) at its plenary meeting in April 2001. The database consists of five main areas about PIE facilities: acceptance criteria for irradiated components; cell characteristics; PIE techniques; refabrication/instrumentation capabilities; and storage and conditioning capabilities. The content of the database represents the status of the listed laboratories as of 2003. With the database utilizing a uniform format for all laboratories and details of technique, it is hoped that the IAEA Member States will be able to use this catalogue to select laboratories most relevant to their particular needs. The database can also be used to compare the PIE capabilities worldwide with current and future…

  8. Database Description - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Database name: Open TG-GATEs Pathological Image Database. DOI: 10.18908/lsdba.nbdc00954-0… Creator: National Institutes of Biomedical Innovation, 7-6-8, Saito-asagi, Ibaraki-city, Osaka 567-0085, Japan; TEL: 81-72-641-9826. Database classification: Toxicogenomics Database. Organism: Rattus norvegicus.

  9. The Danish Sarcoma Database

    DEFF Research Database (Denmark)

    Jørgensen, Peter Holmberg; Lausten, Gunnar Schwarz; Pedersen, Alma B

    2016-01-01

    AIM: The aim of the database is to gather information about sarcomas treated in Denmark in order to continuously monitor and improve the quality of sarcoma treatment in a local, a national, and an international perspective. STUDY POPULATION: Patients in Denmark diagnosed with a sarcoma, both skeletal and extraskeletal, have been registered since 2009. MAIN VARIABLES: The database contains information about appearance of symptoms; date of receiving referral to a sarcoma center; date of first visit; whether surgery has been performed elsewhere before referral; diagnosis and treatment; tumor …; International Classification of Diseases - tenth edition codes and TNM Classification of Malignant Tumours; and date of death (after yearly coupling to the Danish Civil Registration System). Data quality and completeness are currently secured. CONCLUSION: The Danish Sarcoma Database is population based and includes sarcomas occurring…

  10. Summary of typical routine maintenance activities at Tokai Reprocessing Plant. Supplement (March, 2002)

    International Nuclear Information System (INIS)

    2002-03-01

    Typical maintenance activities, such as replacement of worn-out parts and cleaning of filter elements, routinely performed during steady operation are summarized. [The Summary of Typical Routine Maintenance Activities at Tokai Reprocessing Plant] (JNC TN 8450 2001-006) was already prepared in September, 2001. The purpose of this summary is to give an elementary understanding of these activities to people who are responsible for explaining them to the public. This time, the same kind of summary is prepared as a supplement to the previous one. (author)

  11. Domain Regeneration for Cross-Database Micro-Expression Recognition

    Science.gov (United States)

    Zong, Yuan; Zheng, Wenming; Huang, Xiaohua; Shi, Jingang; Cui, Zhen; Zhao, Guoying

    2018-05-01

    In this paper, we investigate the cross-database micro-expression recognition problem, where the training and testing samples come from two different micro-expression databases. Under this setting, the training and testing samples have different feature distributions, and hence the performance of most existing micro-expression recognition methods may decrease greatly. To solve this problem, we propose a simple yet effective method called the Target Sample Re-Generator (TSRG). Using TSRG, we are able to re-generate the samples from the target micro-expression database so that the re-generated target samples share the same or similar feature distributions with the original source samples. For this reason, we can then use the classifier learned from the labeled source samples to accurately predict the micro-expression categories of the unlabeled target samples. To evaluate the performance of the proposed TSRG method, extensive cross-database micro-expression recognition experiments designed on the SMIC and CASME II databases are conducted. Compared with recent state-of-the-art cross-database emotion recognition methods, the proposed TSRG achieves more promising results.
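
    The cross-database setting can be reproduced in miniature: train on one database, test on another whose features are distributed differently. The Python sketch below uses a simple mean/variance alignment of the target features as a crude stand-in for TSRG (which re-generates target samples); all data are synthetic.

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        X_src = rng.normal(0.0, 1.0, (200, 10))          # "source database" features
        y_src = (X_src[:, 0] > 0).astype(int)
        X_tgt = rng.normal(2.0, 3.0, (200, 10))          # shifted/scaled "target database"
        y_tgt = (X_tgt[:, 0] > 2.0).astype(int)

        clf = LinearSVC(dual=False).fit(X_src, y_src)
        print("no adaptation :", clf.score(X_tgt, y_tgt))

        # Re-generate target features so they match source statistics.
        X_adj = (X_tgt - X_tgt.mean(0)) / X_tgt.std(0) * X_src.std(0) + X_src.mean(0)
        print("with alignment:", clf.score(X_adj, y_tgt))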

  12. The magnet database system

    International Nuclear Information System (INIS)

    Baggett, P.; Delagi, N.; Leedy, R.; Marshall, W.; Robinson, S.L.; Tompkins, J.C.

    1991-01-01

    This paper describes the current status of MagCom, a central database of SSC magnet information that is available to all magnet scientists via network connections. The database has been designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will help magnet scientists to track and control the production process and to correlate the performance of magnets with the properties of their constituents

  13. Properties of electret ionization chambers for routine dosimetry in photon radiation fields

    International Nuclear Information System (INIS)

    Doerschel, B.; Pretzsch, G.

    1985-01-01

    The main properties of photon routine dosemeters are their energy and angular dependence as well as their measuring range and accuracy. The determination of radiation exposure from dosemeter response is based on the choice of an appropriate conversion factor taking into account the influence of body backscattering on the dosemeter response. Measuring range and accuracy of an electret ionization chamber first of all depend on electret stability, methods of charge measurement, and geometry of the chamber. The dosemeter performance is described for an electret ionization chamber which was designed for application to routine monitoring of radiation workers. (author)

  14. Database Cancellation: The "Hows" and "Whys"

    Science.gov (United States)

    Shapiro, Steven

    2012-01-01

    Database cancellation is one of the most difficult tasks performed by a librarian. This may seem counter-intuitive but, psychologically, it is certainly true. When a librarian or a team of librarians has invested a great deal of time doing research, talking to potential users, and conducting trials before deciding to subscribe to a database, they…

  15. In search for effective methods of routine formation

    Directory of Open Access Journals (Sweden)

    Kandora Marcin

    2017-05-01

    Full Text Available Organizational routines are a frequently researched phenomenon in contemporary management science. Although the theoretical foundations of routine theory seem to have reached a significant degree of maturity over the last thirty years, the same cannot be said about the availability of practical advice for management practice. This paper addresses this gap and proposes a framework for an effective routine-shaping process. It builds on a brief analysis of the available literature on routine formation, supported by case study findings. The proposed approach stresses the importance of a controlled learning process and underlines the importance of deliberate implementation, in contrast to the evolutionary and engineering views on routine emergence.

  16. JCZS: An Intermolecular Potential Database for Performing Accurate Detonation and Expansion Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Baer, M.R.; Hobbs, M.L.; McGee, B.C.

    1998-11-03

    Exponential-13,6 (EXP-13,6) potential parameters for 750 gases composed of 48 elements were determined and assembled in a database, referred to as the JCZS database, for use with the Jacobs Cowperthwaite Zwisler equation of state (JCZ3-EOS). The EXP-13,6 force constants were obtained by using literature values of Lennard-Jones (LJ) potential functions, by using corresponding states (CS) theory, by matching pure liquid shock Hugoniot data, and by using molecular volume to determine the approach radii with the well depth estimated from high-pressure isentropes. The JCZS database was used to accurately predict detonation velocity, pressure, and temperature for 50 different explosives with initial densities ranging from 0.25 g/cm3 to 1.97 g/cm3. Accurate predictions were also obtained for pure liquid shock Hugoniots, static properties of nitrogen, and gas detonations at high initial pressures.

  17. Integration of Oracle and Hadoop: Hybrid Databases Affordable at Scale

    Science.gov (United States)

    Canali, L.; Baranowski, Z.; Kothuri, P.

    2017-10-01

    This work reports on the activities aimed at integrating Oracle and Hadoop technologies for the use cases of CERN database services and in particular on the development of solutions for offloading data and queries from Oracle databases into Hadoop-based systems. The goal and interest of this investigation is to increase the scalability and optimize the cost/performance footprint for some of our largest Oracle databases. These concepts have been applied, among others, to build offline copies of CERN accelerator controls and logging databases. The tested solution allows to run reports on the controls data offloaded in Hadoop without affecting the critical production database, providing both performance benefits and cost reduction for the underlying infrastructure. Other use cases discussed include building hybrid database solutions with Oracle and Hadoop, offering the combined advantages of a mature relational database system with a scalable analytics engine.
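
    A common form of the offloading pattern described here is a bulk copy from Oracle over JDBC into a columnar format on a Hadoop filesystem, where reports run without touching the production database. The PySpark sketch below assumes an Oracle JDBC driver is on the classpath; the connection string, credentials, table, and partition column are placeholders.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("oracle-offload").getOrCreate()

        df = (spark.read.format("jdbc")
              .option("url", "jdbc:oracle:thin:@//dbhost:1521/ACCLOG")  # placeholder DSN
              .option("dbtable", "LOGGING.MEASUREMENTS")                # placeholder table
              .option("user", "reader")
              .option("password", "***")
              .option("fetchsize", 10000)   # larger fetches cut JDBC round trips
              .load())

        # Partitioned columnar copy; analytic queries then scan only what they need.
        (df.write.mode("overwrite")
           .partitionBy("DAY")              # placeholder partition column
           .parquet("hdfs:///offload/measurements"))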

  18. Concurrency control in distributed database systems

    CERN Document Server

    Cellary, W; Gelenbe, E

    1989-01-01

    Distributed Database Systems (DDBS) may be defined as integrated database systems composed of autonomous local databases, geographically distributed and interconnected by a computer network. The purpose of this monograph is to present DDBS concurrency control algorithms and their related performance issues. The most recent results have been taken into consideration. A detailed analysis and selection of these results has been made so as to include those which will promote applications and progress in the field. The application of the methods and algorithms presented is not limited to DDBSs but a…
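
    One classic algorithm from this literature is two-phase locking: a transaction acquires all of its locks before releasing any, which guarantees serializable schedules. The Python sketch below is a single-node miniature of the idea; a real DDBS adds distribution, deadlock handling, and recovery on top of it.

        class TwoPhaseLockingTxn:
            def __init__(self, lock_table):
                self.lock_table = lock_table   # shared map: item -> owning transaction
                self.held = set()
                self.shrinking = False

            def lock(self, item):
                assert not self.shrinking, "2PL violated: lock after first unlock"
                if self.lock_table.get(item, self) is not self:
                    raise RuntimeError(f"{item} is held by another transaction")
                self.lock_table[item] = self
                self.held.add(item)

            def unlock_all(self):              # entering the shrinking phase
                self.shrinking = True
                for item in self.held:
                    del self.lock_table[item]
                self.held.clear()

        locks = {}
        t1 = TwoPhaseLockingTxn(locks)
        t1.lock("x"); t1.lock("y")             # growing phase
        # ... read and write items x and y ...
        t1.unlock_all()                        # shrinking phase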

  19. 40 CFR 141.621 - Routine monitoring.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Routine monitoring. 141.621 Section....621 Routine monitoring. (a) Monitoring. (1) If you submitted an IDSE report, you must begin monitoring..., you must monitor at the location(s) and dates identified in your monitoring plan in § 141.132(f...

  20. Sorption, Diffusion and Solubility Databases for Performance Assessment; Base de Datos de Sorcion, Difusion y Solubilidad para la Evacuacion del Comportamiento

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Gutierrez, M [Ciemat, Madrid (Spain)

    2000-07-01

    This report presents deterministic and probabilistic databases for application in the performance assessment of a high-level radioactive waste disposal. The work includes a theoretical description of the sorption, diffusion and solubility phenomena of radionuclides in geological media. The report presents and compares the databases of different nuclear waste management agencies, describes the materials in the Spanish reference system, and gives the sorption, diffusion and solubility results for this system, with both deterministic and probabilistic approaches. The probabilistic approach is presented in the form of probability density functions (pdf). (Author) 52 refs.

  1. The Quality of Clinical Maternal and Neonatal Healthcare – A Strategy for Identifying ‘Routine Care Signal Functions’

    Science.gov (United States)

    Brenner, Stephan; De Allegri, Manuela; Gabrysch, Sabine; Chinkhumba, Jobiba; Sarker, Malabika; Muula, Adamson S.

    2015-01-01

    Background A variety of clinical process indicators exists to measure the quality of care provided by maternal and neonatal health (MNH) programs. To allow comparison across MNH programs in low- and middle-income countries (LMICs), a core set of essential process indicators is needed. Although such a core set is available for emergency obstetric care (EmOC), the ‘EmOC signal functions’, a similar approach is currently missing for MNH routine care evaluation. We describe a strategy for identifying core process indicators for routine care and illustrate their usefulness in a field example. Methods We first developed an indicator selection strategy by combining epidemiological and programmatic aspects relevant to MNH in LMICs. We then identified routine care process indicators meeting our selection criteria by reviewing existing quality of care assessment protocols. We grouped these indicators into three categories based on their main function in addressing risk factors of maternal or neonatal complications. We then tested this indicator set in a study assessing MNH quality of clinical care in 33 health facilities in Malawi. Results Our strategy identified 51 routine care processes: 23 related to initial patient risk assessment, 17 to risk monitoring, 11 to risk prevention. During the clinical performance assessment a total of 82 cases were observed. Birth attendants’ adherence to clinical standards was lowest in relation to risk monitoring processes. In relation to major complications, routine care processes addressing fetal and newborn distress were performed relatively consistently, but there were major gaps in the performance of routine care processes addressing bleeding, infection, and pre-eclampsia risks. Conclusion The identified set of process indicators could identify major gaps in the quality of obstetric and neonatal care provided during the intra- and immediate postpartum period. We hope our suggested indicators for essential routine care processes

  2. Gamification of Clinical Routine: The Dr. Fill Approach.

    Science.gov (United States)

    Bukowski, Mark; Kühn, Martin; Zhao, Xiaoqing; Bettermann, Ralf; Jonas, Stephan

    2016-01-01

    Gamification is used in clinical contexts in health care education. Furthermore, it has shown great promise for improving the performance of health care staff in their daily routine. In this work we focus on the medication sorting task, which is performed manually in hospitals. This task is very error prone and needs to be performed daily. Nevertheless, errors in medication are critical and lead to serious complications. In this work we present a real-world gamification approach to the medication sorting task in a patient's daily pill organizer. The player of the game needs to sort the correct medication into the correct dispenser slots and is rewarded or punished in real time. At the end of the game, a score is given and the user can register in a leaderboard.

  3. Development and Exploration of a Regional Stormwater BMP Performance Database to Parameterize an Integrated Decision Support Tool (i-DST)

    Science.gov (United States)

    Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.

    2017-12-01

    Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
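
    One simple way to score how much variance a candidate regionalization explains is the between-group share of total variance (eta squared), computed per scheme. The sketch below assumes hypothetical column names and toy values rather than the actual International Stormwater BMP Database fields.

      import pandas as pd

      def eta_squared(df, group_col, value_col):
          """Share of total variance explained by a grouping (SS_between / SS_total)."""
          grand_mean = df[value_col].mean()
          ss_total = ((df[value_col] - grand_mean) ** 2).sum()
          g = df.groupby(group_col)[value_col].agg(["mean", "size"])
          ss_between = (g["size"] * (g["mean"] - grand_mean) ** 2).sum()
          return ss_between / ss_total

      # Hypothetical table: BMP removal efficiency plus candidate region schemes.
      bmps = pd.DataFrame({
          "tss_removal": [0.8, 0.7, 0.4, 0.5, 0.9, 0.6],
          "ecoregion":   ["A", "A", "B", "B", "A", "B"],
          "rain_zone":   ["1", "2", "1", "2", "1", "2"],
      })
      for scheme in ("ecoregion", "rain_zone"):
          print(scheme, round(eta_squared(bmps, scheme, "tss_removal"), 3))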

  4. Update History of This Database - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Update history of the Yeast Interacting Proteins Database: 2010/03/29, the Yeast Interacting Proteins Database English archive site was opened; 2000/12/4, the Yeast Interacting Proteins Database ( http://itolab.cb.k.u-tokyo.ac.jp/Y2H/ ) was released.

  5. What is the yield of routine chest radiography following tube thoracostomy for trauma?

    Science.gov (United States)

    Kong, Victor Y; Oosthuizen, George V; Clarke, Damian L

    2015-01-01

    Routine chest radiography (CXR) following tube thoracostomy (TT) is a standard practice in most trauma centres worldwide. Evidence supporting this routine practice is lacking and the actual yield is unknown. We performed a retrospective review of 1042 patients over a 4-year period who had a routine post-insertion CXR performed in accordance with current ATLS® recommendations. A total of 1042 TTs were performed on 1004 patients. Ninety-one per cent of patients (913/1004) were males, and the median age for all patients was 24 years. Seventy-five per cent of all injuries (756/1004) were from penetrating trauma, and the remaining 25% (248/1004) were from blunt. The initial pathologies requiring TT were: haemopneumothorax: 34% (339/1042), haemothorax: 31% (314/1042), simple pneumothorax: 25% (256/1042), tension pneumothorax: 8% (77/1042) and open pneumothorax: 5% (54/1042). One hundred and three patients had TTs performed on clinical grounds alone without a pre-insertion CXR [Group A]. One hundred and ninety-one patients had a pre-insertion CXR but had persistent clinical concerns following insertion [Group B]. Seven hundred and ten patients had pre-insertion CXR but no clinical concerns following insertion [Group C]. Overall, 15% (152/1004) [9 from Group A, 111 from Group B and 32 from Group C] of all patients had their clinical management influenced as a direct result of the post-insertion CXR. Despite the widely accepted practice of routine CXR following tube thoracostomy, the yield is relatively low. In many cases, good clinical examination post tube insertion will provide warnings as to whether problems are likely to result. However, in the more rural setting, and in resource-challenged environments, there is a relatively high yield from the CXR, which alters management. Further prospective studies are needed to establish or refute the role of the existing ATLS® guidelines in these specific environments. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Database Description - RMOS | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information for the RMOS database. Contact: Shoshi Kikuchi. Database classification: plant databases - rice microarray data and other gene expression databases. Organism: Oryza sativa (Taxonomy ID: 4530). Referenced databases: Rice Expression Database (RED), Rice full-length cDNA Database (KOME), Rice Genome Integrated Map Database (INE), Rice Mutant Panel Database (Tos17), Rice Genome Annotation Database.

  7. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
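
    A minimal sketch of the modelling idea, under my own assumptions about the details: train a first-order discrete-time, discrete-space Markov chain over player actions online and track the information-theoretic surprisal of each observed action, which falls as behaviour becomes routinized. The paper's actual error measure may be defined differently.

      from collections import defaultdict
      import math

      class RoutinizationModel:
          """First-order Markov chain over player actions, trained online;
          falling surprisal suggests increasingly routinized behaviour."""

          def __init__(self, n_actions=4, alpha=1.0):
              self.counts = defaultdict(lambda: defaultdict(int))
              self.n_actions, self.alpha = n_actions, alpha

          def prob(self, prev, action):
              row = self.counts[prev]          # Laplace-smoothed transition estimate
              return ((row[action] + self.alpha)
                      / (sum(row.values()) + self.alpha * self.n_actions))

          def observe(self, prev, action):
              surprisal = -math.log2(self.prob(prev, action))
              self.counts[prev][action] += 1   # update the model after scoring
              return surprisal

      model = RoutinizationModel()
      trace = ["jump", "dash", "jump", "dash", "jump", "dash", "jump"]
      for prev, action in zip(trace, trace[1:]):
          print(f"{prev:>4} -> {action:<4}  surprisal {model.observe(prev, action):.2f} bits")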

  8. Update of a thermodynamic database for radionuclides to assist solubility limits calculation for performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Duro, L.; Grive, M.; Cera, E.; Domenech, C.; Bruno, J. (Enviros Spain S.L., Barcelona (ES))

    2006-12-15

    This report presents and documents the thermodynamic database used in the assessment of the radionuclide solubility limits within the SR-Can Exercise. It is a supporting report to the solubility assessment. Thermodynamic data are reviewed for 20 radioelements from Groups A and B, lanthanides and actinides. The development of this database is partially based on the one prepared by PSI and NAGRA. Several changes, updates and checks for internal consistency and completeness of the reference NAGRA-PSI 01/01 database have been conducted when needed. These modifications are mainly related to the information from the various experimental programmes and scientific literature available until the end of 2003. Some of the discussions also refer to a previous database selection conducted by Enviros Spain on behalf of ANDRA, where the reader can find additional information. When possible, in order to optimize the robustness of the database, the description of the solubility of the different radionuclides calculated by using the reported thermodynamic database is tested against experimental data available in the open scientific literature. When necessary, different procedures to estimate gaps in the database have been followed, especially accounting for temperature corrections. All the methodologies followed are discussed in the main text.

  9. Update of a thermodynamic database for radionuclides to assist solubility limits calculation for performance assessment

    International Nuclear Information System (INIS)

    Duro, L.; Grive, M.; Cera, E.; Domenech, C.; Bruno, J.

    2006-12-01

    This report presents and documents the thermodynamic database used in the assessment of the radionuclide solubility limits within the SR-Can Exercise. It is a supporting report to the solubility assessment. Thermodynamic data are reviewed for 20 radioelements from Groups A and B, lanthanides and actinides. The development of this database is partially based on the one prepared by PSI and NAGRA. Several changes, updates and checks for internal consistency and completeness of the reference NAGRA-PSI 01/01 database have been conducted when needed. These modifications are mainly related to the information from the various experimental programmes and scientific literature available until the end of 2003. Some of the discussions also refer to a previous database selection conducted by Enviros Spain on behalf of ANDRA, where the reader can find additional information. When possible, in order to optimize the robustness of the database, the description of the solubility of the different radionuclides calculated by using the reported thermodynamic database is tested against experimental data available in the open scientific literature. When necessary, different procedures to estimate gaps in the database have been followed, especially accounting for temperature corrections. All the methodologies followed are discussed in the main text.

  10. Impact of routine cerebral CT angiography on treatment decisions in infective endocarditis.

    Directory of Open Access Journals (Sweden)

    Marwa Sayed Meshaal

    Infective endocarditis (IE) is commonly complicated by cerebral embolization and hemorrhage secondary to intracranial mycotic aneurysms (ICMAs). These complications are associated with poor outcome and may require diagnostic and therapeutic plans to be modified. However, routine screening by brain CT and CT angiography (CTA) is not standard practice. We aimed to study the impact of routine cerebral CTA on treatment decisions for patients with IE. From July 2007 to December 2012, we prospectively recruited 81 consecutive patients with definite left-sided IE according to the modified Duke criteria. All patients had routine brain CTA conducted within one week of admission. All patients with ICMA underwent four-vessel conventional angiography. Invasive treatment was performed for ruptured aneurysms, aneurysms ≥ 5 mm, and persistent aneurysms despite appropriate therapy. Surgical clipping was performed for leaking aneurysms if not amenable to intervention. The mean age was 30.43 ± 8.8 years and 60.5% were males. Staphylococcus aureus was the most common organism (32.3%). Among the patients, 37% had underlying rheumatic heart disease, 26% had prosthetic valves, 23.5% developed IE on top of a structurally normal heart and 8.6% had underlying congenital heart disease. Brain CT/CTA revealed that 51 patients had evidence of cerebral embolization, of whom 17 were clinically silent. Twenty-six patients (32%) had ICMA, of whom 15 were clinically silent. Among the patients with ICMAs, 11 underwent endovascular treatment and 2 underwent neurovascular surgery. The brain CTA findings prompted different treatment choices in 21 patients (25.6%). The choices were aneurysm treatment before cardiac surgery rather than at follow-up, valve replacement with a biological valve instead of a mechanical valve, and withholding anticoagulation in patients with prosthetic valve endocarditis for fear of aneurysm rupture. Routine brain CT/CTA resulted in changes to the treatment plan in a significant proportion (25.6%) of patients.

  11. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    KALIMER Database is an advanced database for integrated management of Liquid Metal Reactor Design Technology Development using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results from phase II of Liquid Metal Reactor Design Technology Development of the mid-term and long-term nuclear R and D program. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD Database is a schematic design overview for KALIMER. The Team Cooperation System informs team members about research cooperation and meetings. Finally, KALIMER Reserved Documents were developed to manage collected data and several documents since project accomplishment. This report describes the features of the hardware and software and the database design methodology for KALIMER.

  12. Switching the Fermilab Accelerator Control System to a relational database

    International Nuclear Information System (INIS)

    Shtirbu, S.

    1993-01-01

    The accelerator control system ("ACNET") at Fermilab uses a made-in-house, Assembly-language database. The database holds device information, which is mostly used for finding out how to read/set devices and how to interpret alarms. This is a very efficient implementation, but it lacks the needed flexibility and forces applications to store data in private/shared files. This database is being replaced by an off-the-shelf relational database (Sybase). The major constraints on switching are the necessity to maintain/improve response time and to minimize changes to existing applications. Innovative methods are used to help achieve the required performance, and a layer-seven gateway simulates the old database for existing programs. The new database is running on a DEC ALPHA/VMS platform, and provides better performance. The switch is also exposing problems with the data currently stored in the database, and is helping in cleaning up erroneous data. The flexibility of the new relational database is going to facilitate many new applications in the future (e.g. a 3D presentation of device location). The new database is expected to fully replace the old database during this summer's shutdown.

  13. PEP725 Pan European Phenological Database

    Science.gov (United States)

    Koch, E.; Adler, S.; Lipa, W.; Ungersböck, M.; Zach-Hermann, S.

    2010-09-01

    Europe is in the fortunate situation that it has a long tradition in phenological networking: the history of collecting phenological data and using them in climatology has its starting point in 1751, when Carl von Linné outlined in his work Philosophia Botanica methods for compiling annual plant calendars of leaf opening, flowering, fruiting and leaf fall together with climatological observations "so as to show how areas differ". In most European countries, phenological observations have been carried out routinely for more than 50 years by different governmental and non-governmental organisations, following different observation guidelines and with the data stored at different places in different formats. This has hampered pan-European studies, as one has to address many network operators to get access to the data before one can start to bring them into a uniform style. From 2004 to 2009 the COST action 725 established a Europe-wide data set of phenological observations. But the deliverables of this COST action were not only the common phenological database and common observation guidelines: COST725 also helped to trigger a revival of some old networks and to establish new ones, as for instance in Sweden. At the end of the COST action in 2009, the database comprised about 8 million records in total from 15 European countries plus the data from the International Phenological Gardens IPG. In January 2010 PEP725 began its work as a follow-up project with funding from EUMETNET, the network of European meteorological services, and from ZAMG, the Austrian national meteorological service. PEP725 will not only maintain and update the COST725 database, but also bring in phenological data from the time before 1951, develop better quality-checking procedures and ensure open access to the database. An attractive webpage will make phenology and climate impacts on vegetation more visible to the public, enabling monitoring of vegetation development.

  14. The Danish Hysterectomy and Hysteroscopy Database

    DEFF Research Database (Denmark)

    Topsøe, Märta Fink; Ibfelt, Else Helene; Settnes, Annette

    2016-01-01

    AIM OF THE DATABASE: The steering committee of the Danish Hysterectomy and Hysteroscopy Database (DHHD) has defined the objective of the database: the aim is firstly to reduce complications, readmissions, reoperations; secondly to specify the need for hospitalization after hysterectomy; thirdly...... DATA: Annually approximately 4,300 hysterectomies and 3,200 operative hysteroscopies are performed in Denmark. Since the establishment of the database in 2003, 50,000 hysterectomies have been registered. DHHD's nationwide cooperation and research have led to national guidelines and regimes. Annual...... national meetings and nationwide workshops have been organized. CONCLUSION: The use of vaginal and laparoscopic hysterectomy methods has increased during the past decade and the overall complication rate and hospital stay have declined. The regional variation in operation methods has also decreased....

  15. The routine use of post-operative drains in thyroid surgery: an outdated concept.

    LENUS (Irish Health Repository)

    Prichard, R S

    2010-01-01

    The use of surgical drains in patients undergoing thyroid surgery is standard surgical teaching. Life-threatening complications, arising from post-operative haematomas, mandate their utilization. There is increasing evidence to suggest that this is an outdated practice. This paper determines whether thyroid surgery can be safely performed without the routine use of drains. A retrospective review of patients undergoing thyroid surgery over a three-year period was performed and post-operative complications documented. One hundred and four thyroidectomies were performed. Sixty-three (60.6%) patients had a partial thyroidectomy, 27 (25.9%) had a total thyroidectomy and 14 (13.5%) had a sub-total thyroidectomy. Suction drains were not inserted in any patient. A cervical haematoma did not develop in any patient in this series and no patient required re-operation. There is no evidence to suggest that the routine use of surgical drains following uncomplicated thyroid surgery reduces the rate of haematoma formation or re-operation, and the practice is now unwarranted.

  16. Design and implementation of the modified signed digit multiplication routine on a ternary optical computer.

    Science.gov (United States)

    Xu, Qun; Wang, Xianchao; Xu, Chao

    2017-06-01

    Multiplication with traditional electronic computers suffers from low calculating accuracy and long computation delays. To overcome these problems, the modified signed digit (MSD) multiplication routine is established based on the MSD system and the carry-free adder, and its parallel algorithm and optimization techniques are studied in detail. With the help of a ternary optical computer's characteristics, the structured data processor is designed especially for the multiplication routine. Several ternary optical operators are constructed to perform M transformations and summations in parallel, which accelerates the iterative process of multiplication. In particular, the routine allocates data bits of the ternary optical processor based on the digits of the multiplication input, so the accuracy of the calculation results can always satisfy the users. Finally, the routine is verified by simulation experiments, and the results are in full compliance with expectations. Compared with an electronic computer, the MSD multiplication routine is not only good at dealing with large-value data and high-precision arithmetic, but also maintains lower power consumption and fewer calculating delays.
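
    The sketch below illustrates the arithmetic such a routine implements: radix-2 modified signed-digit numbers with digit set {-1, 0, 1} and shift-and-add multiplication. On the optical processor the digit-wise additions are performed carry-free and in parallel via the M transformations mentioned in the abstract; the plain ripple adder here only validates the arithmetic of the sketch and is not the paper's algorithm.

      def to_msd(n):
          """Encode an integer in radix-2 modified signed-digit form,
          digit set {-1, 0, 1}, least significant digit first."""
          sign, n, digits = (-1 if n < 0 else 1), abs(n), []
          while n:
              digits.append(sign * (n & 1))
              n >>= 1
          return digits or [0]

      def from_msd(d):
          return sum(di * (1 << i) for i, di in enumerate(d))

      def msd_add(a, b):
          """Ripple signed-digit addition (the hardware does this carry-free)."""
          out, carry = [], 0
          for i in range(max(len(a), len(b))):
              s = carry + (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
              if s % 2 == 0:
                  d, carry = 0, s // 2
              else:
                  d = 1 if s > 0 else -1
                  carry = (s - d) // 2
              out.append(d)
          return out + ([carry] if carry else [])

      def msd_mul(a, b):
          """Each multiplier digit selects +multiplicand, 0 or -multiplicand,
          shifted into place, and the partial products are accumulated."""
          acc = [0]
          for i, di in enumerate(b):
              if di:
                  acc = msd_add(acc, [0] * i + [di * x for x in a])
          return acc

      assert from_msd(msd_mul(to_msd(123), to_msd(-45))) == 123 * -45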

  17. Efficient Partitioning of Large Databases without Query Statistics

    Directory of Open Access Journals (Sweden)

    Shahidul Islam KHAN

    2016-11-01

    An efficient way of improving the performance of a database management system is distributed processing. Distribution of data involves fragmentation or partitioning, replication, and allocation processes. Previous research works provided partitioning based on empirical data about the type and frequency of the queries. These solutions are not suitable at the initial stage of a distributed database, as query statistics are not available then. In this paper, I have presented a fragmentation technique, Matrix based Fragmentation (MMF), which can be applied at the initial stage as well as at later stages of distributed databases. Instead of using empirical data, I have developed a matrix, Modified Create, Read, Update and Delete (MCRUD), to partition a large database properly. Allocation of fragments is done simultaneously in my proposed technique. So using MMF, no additional complexity is added for allocating the fragments to the sites of a distributed database, as fragmentation is synchronized with allocation. The performance of a DDBMS can be improved significantly by avoiding frequent remote access and high data transfer among the sites. Results show that the proposed technique can solve the initial partitioning problem of large distributed databases.
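
    A toy version of the idea, under assumed details: score each attribute for every site from weighted Create/Read/Update/Delete frequencies in an MCRUD-style matrix, then allocate each fragment to its highest-scoring site in the same pass. The weights, schema and scoring rule below are illustrative, not the paper's exact definitions.

      # Hypothetical CRUD weights: writes cost more to perform remotely than reads.
      WEIGHTS = {"C": 3, "R": 1, "U": 3, "D": 2}

      # MCRUD-style matrix: (site, attribute) -> expected operation mix.
      mcrud = {
          ("site1", "patient_name"): "CRRU",
          ("site2", "patient_name"): "R",
          ("site1", "lab_result"):   "R",
          ("site2", "lab_result"):   "CRUD",
      }

      def allocate(mcrud):
          """Assign each attribute fragment to the site with the highest score,
          so fragmentation is synchronized with allocation."""
          scores = {}
          for (site, attr), ops in mcrud.items():
              scores.setdefault(attr, {})[site] = sum(WEIGHTS[o] for o in ops)
          return {attr: max(site_scores, key=site_scores.get)
                  for attr, site_scores in scores.items()}

      print(allocate(mcrud))   # {'patient_name': 'site1', 'lab_result': 'site2'}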

  18. High Performance Protein Sequence Database Scanning on the Cell Broadband Engine

    Directory of Open Access Journals (Sweden)

    Adrianto Wirawan

    2009-01-01

    The enormous growth of biological sequence databases has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing rapidly as well. The recent emergence of low-cost parallel multicore accelerator technologies has made it possible to reduce execution times of many bioinformatics applications. In this paper, we demonstrate how the Cell Broadband Engine can be used as a computational platform to accelerate two approaches for protein sequence database scanning: exhaustive and heuristic. We present efficient parallelization techniques for two representative algorithms: the dynamic programming based Smith-Waterman algorithm and the popular BLASTP heuristic. Their implementation on a PlayStation®3 leads to significant runtime savings compared to corresponding sequential implementations.
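
    For reference, this is the dynamic-programming recurrence that the Cell implementation parallelizes across its cores: H[i][j] = max(0, diagonal + substitution, up + gap, left + gap). The plain, single-threaded version below computes the local-alignment score with illustrative scoring parameters.

      def smith_waterman(q, s, match=2, mismatch=-1, gap=-2):
          """Best local alignment score between sequences q and s."""
          rows, cols = len(q) + 1, len(s) + 1
          H = [[0] * cols for _ in range(rows)]
          best = 0
          for i in range(1, rows):
              for j in range(1, cols):
                  sub = match if q[i - 1] == s[j - 1] else mismatch
                  H[i][j] = max(0, H[i - 1][j - 1] + sub,
                                H[i - 1][j] + gap, H[i][j - 1] + gap)
                  best = max(best, H[i][j])
          return best

      print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))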

  19. JT-60 database system, 2

    International Nuclear Information System (INIS)

    Itoh, Yasuhiro; Kurihara, Kenichi; Kimura, Toyoaki.

    1987-07-01

    The JT-60 central control system, ''ZENKEI'', collects control and instrumentation data relevant to discharges, and device status data for plant monitoring. The former, the engineering data, amounts to about 3 Mbytes per shot of discharge. The ''ZENKEI'' control system, which consists of seven minicomputers for on-line real-time control, has little capacity for handling such a large amount of data for physical and engineering analysis. In order to solve this problem, it was planned to establish the experimental database on the Front-end Processor (FEP) of the general-purpose large computer in the JAERI Computer Center. The database management system (DBMS), therefore, has been developed for creating the database during the shot interval. The engineering data are shipped from ''ZENKEI'' to the FEP through the dedicated communication line after the shot. A hierarchical data model has been adopted in this database, which consists of data files with a tree structure of three keys: system, discharge type and shot number. The JT-60 DBMS provides data handling packages of subroutines for interfacing the database with users' application programs. Subroutine packages for supporting graphic processing and the function of access control for security of the database are also prepared in this DBMS. (author)

  20. Solutions for medical databases optimal exploitation.

    Science.gov (United States)

    Branescu, I; Purcarea, V L; Dobrescu, R

    2014-03-15

    The paper discusses methods to apply OLAP techniques to multidimensional databases that leverage the existing performance-enhancing technique known as practical pre-aggregation, by making this technique relevant to a much wider range of medical applications, as logistic support to data warehousing techniques. The transformations have low computational complexity in practice and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies in current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases.
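
    A minimal sketch of practical pre-aggregation as described: materialize the fact table once at the finest reusable level of the dimension hierarchy, then answer coarser roll-ups from the smaller aggregate. SQLite and the schema are stand-ins chosen only to keep the example self-contained.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE admissions (ward TEXT, day TEXT, n INTEGER);
          INSERT INTO admissions VALUES
            ('cardiology', '2014-03-01', 4),
            ('cardiology', '2014-03-02', 6),
            ('oncology',   '2014-03-01', 3);

          /* Pre-aggregate once at the finest reusable level ... */
          CREATE TABLE agg_ward_day AS
            SELECT ward, day, SUM(n) AS n FROM admissions GROUP BY ward, day;
      """)

      # ... then serve coarser OLAP roll-ups from the aggregate,
      # without rescanning the base fact table.
      for row in con.execute("SELECT ward, SUM(n) FROM agg_ward_day GROUP BY ward"):
          print(row)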

  1. Creation of the NaSCoRD Database

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stuart, William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report was written as part of a United States Department of Energy (DOE), Office of Nuclear Energy, Advanced Reactor Technologies program funded project to re-create the capabilities of the legacy Centralized Reliability Database Organization (CREDO) database. The CREDO database provided a record of component design and performance documentation across various systems that used sodium as a working fluid. Regaining this capability will allow the DOE complex and the domestic sodium reactor industry to better understand how previous systems were designed and built, for use in improving the design and operations of future loops. The contents of this report include: an overview of the current state of domestic sodium reliability databases; a summary of the ongoing effort to improve, understand, and process the CREDO information; a summary of the initial efforts to develop a unified sodium reliability database called the Sodium System Component Reliability Database (NaSCoRD); and an explanation of how potential users can access the domestic sodium reliability databases and the type of information that can be accessed from them.

  2. Database Description - SAHG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information for the SAHG database. Contact: Chie Motono (Tel: +81-3-3599-8067). Database classification: structure databases - protein properties. Organism: Homo sapiens (Taxonomy ID: 9606). Database maintenance site: the Molecular Profiling Research Center. User registration: not available.

  3. Improving care coordination using organisational routines

    DEFF Research Database (Denmark)

    Prætorius, Thim

    2016-01-01

    Purpose – The purpose of this paper is to systematically apply theory of organisational routines to standardised care pathways. The explanatory power of routines is used to address open questions in the care pathway literature about their coordinating and organising role, the way they change......: care pathways and coordination, change, replication, the organisation and health care professionals. Research limitations/implications – The paper is conceptual and uses care pathways as illustrative instances of hospital routines. The propositions provide a starting point for empirical research....... Practical implications – The analysis highlights implications that health care professionals and managers have to consider in relation to coordination, change, replication, the way the organisation influences care pathways and the way care pathways influence health care professionals. Originality...

  4. Feasibility of opportunistic osteoporosis screening in routine contrast-enhanced multi detector computed tomography (MDCT) using texture analysis.

    Science.gov (United States)

    Mookiah, M R K; Rohrmeier, A; Dieckmeyer, M; Mei, K; Kopp, F K; Noel, P B; Kirschke, J S; Baum, T; Subburaj, K

    2018-04-01

    This study investigated the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. The results showed an acceptable reproducibility of texture features, and these features could discriminate the healthy/osteoporotic fracture cohorts with an accuracy of 83%. The aim of this study was to investigate the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. We performed texture analysis at the spine in routine MDCT exams and investigated the effect of intravenous contrast medium (IVCM) (n = 7) and slice thickness (n = 7), the long-term reproducibility (n = 9), and the ability to differentiate the healthy/osteoporotic fracture cohorts (n = 9 age- and gender-matched pairs). Eight texture features were extracted using the gray level co-occurrence matrix (GLCM). The independent sample t test was used to rank the features of the healthy/fracture cohorts and classification was performed using a support vector machine (SVM). The results revealed significant correlations between texture parameters derived from MDCT scans with and without IVCM (r up to 0.91), slice thickness of 1 mm versus 2 and 3 mm (r up to 0.96), and scan-rescan (r up to 0.59). The performance of the SVM classifier was evaluated using 10-fold cross-validation and revealed an average classification accuracy of 83%. Opportunistic osteoporosis screening at the spine using specific texture parameters (energy, entropy, and homogeneity) and SVM can be performed in routine contrast-enhanced MDCT exams.
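
    The pipeline described (GLCM texture features feeding an SVM) can be sketched as below with synthetic stand-in image patches. Note that entropy is not among scikit-image's built-in GLCM properties, so only a subset of the paper's eight features is shown, and the function was named greycomatrix in scikit-image versions before 0.19.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops
      from sklearn.svm import SVC

      def texture_features(img):
          """GLCM-based texture features of one 8-bit grey-level ROI."""
          glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                              symmetric=True, normed=True)
          return [graycoprops(glcm, p)[0, 0]
                  for p in ("energy", "homogeneity", "contrast", "correlation")]

      rng = np.random.default_rng(0)
      # Synthetic stand-ins for healthy vs. fracture-cohort vertebral ROIs.
      healthy  = [rng.integers(80, 120, (32, 32), dtype=np.uint8) for _ in range(10)]
      fracture = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(10)]

      X = [texture_features(im) for im in healthy + fracture]
      y = [0] * 10 + [1] * 10
      clf = SVC(kernel="rbf").fit(X, y)
      print("training accuracy:", clf.score(X, y))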

  5. The web-enabled ODIN portal - useful databases for the European Nuclear Society

    International Nuclear Information System (INIS)

    Over, H.H.; Wolfart, E.

    2005-01-01

    Materials databases (MDBs) are powerful tools to store, retrieve and present experimental materials data of various categories, adapted to the specific needs of users. In combination with analysis tools, experimental data are necessary for e.g. mechanical design, construction and lifetime predictions of complex components. The effective and efficient handling of large amounts of generic and detailed materials properties data related to e.g. fabrication processes is one of the basic elements of data administration within ongoing European research projects and networks. Over the last 20 years, the JRC/Institute of Energy of the European Commission at Petten has developed and continuously improved a database for experimental materials properties data (Mat-DB). The Mat-DB database structure is oriented to international material standards and recommendations. The database and associated analysis routines are accessible through a web-enabled interface on the On-line Data Information Network (ODIN: http://odin.jrc.nl). ODIN provides controlled access to Mat-DB and other related databases (e.g. the document database DoMa) and thus allows European R and D projects to securely manage and disseminate their experimental test data as well as any type of supporting documentation (e.g. unfiltered raw data, reports, minutes, etc). Using the Internet, project partners can instantly access and evaluate data sets entered and validated by one of the members. This paper describes the structure and functionality of Mat-DB and gives examples of how these tools can be used for the benefit of the European nuclear R and D community. (author)

  6. Active in-database processing to support ambient assisted living systems.

    Science.gov (United States)

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
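
    The bed-exit example can be made concrete with a database trigger, the core mechanism of an active database: the rule fires inside the DBMS, so the raw sensor stream never leaves it. SQLite stands in here for the paper's DBMS, and the schema is hypothetical.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE bed_pressure (ts INTEGER, occupied INTEGER);
          CREATE TABLE alerts (ts INTEGER, kind TEXT);

          /* Active rule: react to an occupied->empty transition in-database. */
          CREATE TRIGGER bed_exit
          AFTER INSERT ON bed_pressure
          WHEN NEW.occupied = 0
               AND (SELECT occupied FROM bed_pressure
                    WHERE ts < NEW.ts ORDER BY ts DESC LIMIT 1) = 1
          BEGIN
              INSERT INTO alerts VALUES (NEW.ts, 'bed-exit');
          END;
      """)

      for ts, occ in [(1, 1), (2, 1), (3, 0)]:      # occupied, occupied, exit
          con.execute("INSERT INTO bed_pressure VALUES (?, ?)", (ts, occ))
      print(con.execute("SELECT * FROM alerts").fetchall())  # [(3, 'bed-exit')]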

  7. Database Description - PSCDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information for the PSCDB database. Creator affiliation: National Institute of Advanced Industrial Science and Technology (AIST); contact: Takayuki Amemiya. Database classification: structure databases - protein structure. Reference pages: D554-D558. Database maintenance site: Graduate School of Information Science. User registration: not available.

  8. An event database for rotational seismology

    Science.gov (United States)

    Salvermoser, Johannes; Hadziioannou, Celine; Hable, Sarah; Chow, Bryant; Krischer, Lion; Wassermann, Joachim; Igel, Heiner

    2016-04-01

    The ring laser sensor (G-ring) located at Wettzell, Germany, has routinely observed earthquake-induced rotational ground motions around a vertical axis since its installation in 2003. Here we present results from a recently installed event database, which is the first that will provide ring laser event data in an open-access format. Based on the GCMT event catalogue and some search criteria, seismograms from the ring laser and the collocated broadband seismometer are extracted and processed. The ObsPy-based processing scheme generates plots showing waveform fits between rotation rate and transverse acceleration and extracts characteristic wavefield parameters such as peak ground motions, noise levels, Love wave phase velocities and waveform coherence. For each event, these parameters are stored in a text file (json dictionary) which is easily readable and accessible on the website. The database contains >10000 events starting in 2007 (Mw>4.5). It is updated daily and therefore provides recent events at a time lag of at most 24 hours. The user interface allows filtering of events by epoch, magnitude, and source area, whereupon the events are displayed on a zoomable world map. We investigate how well the rotational motions are compatible with the expectations from the surface wave magnitude scale. In addition, the website offers some python source code examples for downloading and processing the openly accessible waveforms.
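
    Since each event is published as a json dictionary of wavefield parameters, client-side filtering is straightforward. The field names and values below are hypothetical stand-ins for the database's actual keys.

      import json

      # Hypothetical local copy of per-event parameter dictionaries.
      events = [
          {"id": "2014-04-01_iquique", "Mw": 8.2, "peak_rotation_rate": 5.1e-8,
           "mean_phase_velocity": 4100.0, "max_coherence": 0.92},
          {"id": "2015-09-16_illapel", "Mw": 8.3, "peak_rotation_rate": 6.3e-8,
           "mean_phase_velocity": 4020.0, "max_coherence": 0.95},
          {"id": "small_local_event",  "Mw": 4.6, "peak_rotation_rate": 2.0e-9,
           "mean_phase_velocity": 3300.0, "max_coherence": 0.41},
      ]

      strong = [e for e in events if e["Mw"] >= 7.5 and e["max_coherence"] > 0.9]
      print(json.dumps(strong, indent=2))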

  9. [The controversy of routine articulator mounting in orthodontics].

    Science.gov (United States)

    Wang, Li; Han, Xianglong; Bai, Ding

    2013-06-01

    Articulators have been widely used by clinicians in dentistry, but routine articulator mounting is still controversial in orthodontics. Orthodontists oriented by gnathology approve of routine articulator mounting, while non-gnathologic orthodontists disapprove of it. This article reviews the arguments of orthodontists for and against routine articulator mounting, based on considerations of biting, temporomandibular disorder (TMD), periodontitis, and so on.

  10. Database Description - ASTRA | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information for the ASTRA database. Database classification: nucleotide sequence databases - gene structure. Organisms: Arabidopsis thaliana (Taxonomy ID: 3702) and Oryza sativa (Taxonomy ID: 4530). Database description: the database represents classified patterns of alternative splicing and transcription. Reference pages: (10): 1211-6. Database maintenance site: National Institute of Advanced Industrial Science and Technology. User registration: not available.

  11. The European Tracer Experiment - experimental results and database

    International Nuclear Information System (INIS)

    Nodop, K.; Connolly, R.; Girardi, F.

    1997-01-01

    As part of the European Tracer Experiment (ETEX) two successful atmospheric experiments were carried out in October and November, 1994. Perfluorocarbon (PFC) tracers were released into the atmosphere in Monterfil, Brittany, and air samples were taken at 168 stations in 17 European countries for 72 hours after the release. Upper air tracer measurements were made from three aircraft. During the first experiment a westerly air flow transported the tracer plume north-eastwards across Europe. During the second release the flow was eastwards. The results from the ground sampling network allowed the determination of the cloud evolution as far as Sweden, Poland and Bulgaria. Typical background concentrations of the tracer used are around 5 to 7 fl/l in ambient air. Concentrations in the plume ranged from 10 to above 200 fl/l. The tracer release characteristics, the tracer concentrations at the ground and in upper air, the routine and additional meteorological observations at the ground level and in upper air, trajectories derived from constant-level balloons and the meteorological input fields for long-range transport (LRT) models are assembled in the ETEX database. The ETEX database is accessible via the Internet

  12. Incremental View Maintenance for Deductive Graph Databases Using Generalized Discrimination Networks

    Directory of Open Access Journals (Sweden)

    Thomas Beyhl

    2016-12-01

    Nowadays, graph databases are employed when relationships between entities are in the scope of database queries, to avoid performance-critical join operations of relational databases. Graph queries are used to query and modify graphs stored in graph databases. Graph queries employ graph pattern matching, which is NP-complete for subgraph isomorphism. Graph database views can be employed that keep ready answers, in terms of precalculated graph pattern matches, for often-stated and complex graph queries to increase query performance. However, such graph database views must be kept consistent with the graphs stored in the graph database. In this paper, we describe how to use incremental graph pattern matching as a technique for maintaining graph database views. We present an incremental maintenance algorithm for graph database views, which works for imperatively and declaratively specified graph queries. The evaluation shows that our maintenance algorithm scales when the number of nodes and edges stored in the graph database increases. Furthermore, our evaluation shows that our approach can outperform existing approaches for the incremental maintenance of graph query results.
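
    The principle behind such incremental maintenance can be shown with a deliberately tiny example that is not the paper's generalized-discrimination-network algorithm: keep the materialized matches of one fixed two-edge pattern and, on every edge insertion, extend only the matches that pass through the new edge instead of re-running the whole query.

      from collections import defaultdict

      class TwoHopView:
          """Materialised matches of the pattern a->b->c, maintained incrementally."""

          def __init__(self):
              self.out = defaultdict(set)      # u -> {v}
              self.inc = defaultdict(set)      # v -> {u}
              self.matches = set()             # {(a, b, c)}

          def insert(self, u, v):
              self.out[u].add(v)
              self.inc[v].add(u)
              self.matches |= {(a, u, v) for a in self.inc[u]}  # new edge as 2nd hop
              self.matches |= {(u, v, c) for c in self.out[v]}  # new edge as 1st hop

      view = TwoHopView()
      for edge in [("a", "b"), ("b", "c"), ("c", "d")]:
          view.insert(*edge)
      print(sorted(view.matches))  # [('a', 'b', 'c'), ('b', 'c', 'd')]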

  13. Reactome graph database: Efficient access to complex pathway data

    Science.gov (United States)

    Korninger, Florian; Viteri, Guilherme; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D’Eustachio, Peter

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types. PMID:29377902
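
    Below is a sketch of the kind of traversal query that motivates the graph storage, using the official neo4j Python driver against a hypothetical local instance. The Pathway label, hasEvent relationship and displayName property follow Reactome's published data model, but treat them as assumptions here.

      from neo4j import GraphDatabase

      # Hypothetical local instance of the Reactome graph database.
      driver = GraphDatabase.driver("bolt://localhost:7687",
                                    auth=("neo4j", "secret"))

      # Arbitrary-depth traversal: awkward as SQL joins, natural in Cypher.
      CYPHER = """
      MATCH (p:Pathway {displayName: $name})-[:hasEvent*]->(e)
      RETURN e.displayName AS event LIMIT 25
      """

      with driver.session() as session:
          for record in session.run(CYPHER, name="Apoptosis"):
              print(record["event"])
      driver.close()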

  14. Reactome graph database: Efficient access to complex pathway data.

    Directory of Open Access Journals (Sweden)

    Antonio Fabregat

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.

  15. Reactome graph database: Efficient access to complex pathway data.

    Science.gov (United States)

    Fabregat, Antonio; Korninger, Florian; Viteri, Guilherme; Sidiropoulos, Konstantinos; Marin-Garcia, Pablo; Ping, Peipei; Wu, Guanming; Stein, Lincoln; D'Eustachio, Peter; Hermjakob, Henning

    2018-01-01

    Reactome is a free, open-source, open-data, curated and peer-reviewed knowledgebase of biomolecular pathways. One of its main priorities is to provide easy and efficient access to its high quality curated data. At present, biological pathway databases typically store their contents in relational databases. This limits access efficiency because there are performance issues associated with queries traversing highly interconnected data. The same data in a graph database can be queried more efficiently. Here we present the rationale behind the adoption of a graph database (Neo4j) as well as the new ContentService (REST API) that provides access to these data. The Neo4j graph database and its query language, Cypher, provide efficient access to the complex Reactome data model, facilitating easy traversal and knowledge discovery. The adoption of this technology greatly improved query efficiency, reducing the average query time by 93%. The web service built on top of the graph database provides programmatic access to Reactome data by object oriented queries, but also supports more complex queries that take advantage of the new underlying graph-based data storage. By adopting graph database technology we are providing a high performance pathway data resource to the community. The Reactome graph database use case shows the power of NoSQL database engines for complex biological data types.

  16. Database Description - RPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information for the RPD database. Alternative name: Rice Proteome Database. Contact: Setsuko Komatsu, Institute of Crop Science, National Agriculture and Food Research Organization. Database classification: proteomics resources; plant databases - rice. Organism: Oryza sativa (Taxonomy ID: 4530). Database description: the Rice Proteome Database contains information on proteins entered in the database and is searchable by keyword.

  17. Database Description - PLACE | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    General information for the PLACE database. Creator: National Institute of Agrobiological Sciences (Kannondai, Tsukuba, Ibaraki 305-8602, Japan). Database classification: plant databases. Organism: Tracheophyta (Taxonomy ID: 58023). Reference: Nucleic Acids Research (1999), Vol. 27, No. 1: 297-300. Database maintenance site: National Institute of Agrobiological Sciences. User registration: not available.

  18. JNC thermodynamic database for performance assessment of high-level radioactive waste disposal system

    Energy Technology Data Exchange (ETDEWEB)

    Yui, Mikazu; Azuma, Jiro; Shibata, Masahiro [Japan Nuclear Cycle Development Inst., Tokai Works, Waste Isolation Research Division, Tokai, Ibaraki (Japan)

    1999-11-01

    This report is a summary of the status, frozen datasets, and future tasks of the JNC (Japan Nuclear Cycle Development Institute) thermodynamic database (JNC-TDB) for assessing the performance of high-level radioactive waste in geological environments. The JNC-TDB development was carried out after the first progress report on geological disposal research in Japan (H-3). In the development, thermodynamic data (equilibrium constants at 25°C, I=0) for important radioactive elements were selected/determined based on original experimental data, using different models (e.g., SIT, Pitzer). As a result, the reliability and traceability of the data for most of the important elements were improved over those of the PNC-TDB used in the H-3 report. For detailed information on data analysis and selection for each element, see the JNC technical reports listed in this document. (author)

  19. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry: protocol standardization and database expansion for rapid identification of clinically important molds.

    Science.gov (United States)

    Paul, Saikat; Singh, Pankaj; Rudramurthy, Shivaprakash M; Chakrabarti, Arunaloke; Ghosh, Anup K

    2017-12-01

    To standardize matrix-assisted laser desorption/ionization-time of flight mass spectrometry protocols and to expand the existing Bruker Biotyper database for mold identification, four different sample preparation methods (protocols A, B, C and D) were evaluated. On analyzing each protein extraction method, reliable identification and the best log scores were achieved with protocol D. The same protocol was used to identify 153 clinical isolates. Of these 153, 123 (80.3%) were accurately identified using the existing database, and the remaining 30 (19.7%) were not identified due to their unavailability in the database. On inclusion of the missing main spectrum profiles in the existing database, all 153 isolates were identified. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry can be used for routine identification of clinically important molds.

  20. Toward An Unstructured Mesh Database

    Science.gov (United States)

    Rezaei Mahdiraji, Alireza; Baumann, Peter Peter

    2014-05-01

    -incidence relationships. We instrument the ImG model with sets of optional and application-specific constraints which can be used to check the validity of meshes for a specific class of objects, such as manifolds, pseudo-manifolds, and simplicial manifolds. We conducted experiments to measure the performance of the graph database solution in processing mesh queries and compared it with the GrAL mesh library and the PostgreSQL database on synthetic and real mesh datasets. The experiments show that each system performs well on specific types of mesh queries; e.g., graph databases perform well on global path-intensive queries. In the future, we will investigate database operations for the ImG model and design a mesh query language.

  1. Real-Time Whole-Genome Sequencing for Routine Typing, Surveillance, and Outbreak Detection of Verotoxigenic Escherichia coli

    Science.gov (United States)

    Scheutz, Flemming; Lund, Ole; Hasman, Henrik; Kaas, Rolf S.; Nielsen, Eva M.; Aarestrup, Frank M.

    2014-01-01

    Fast and accurate identification and typing of pathogens are essential for effective surveillance and outbreak detection. The current routine procedure is based on a variety of techniques, making the procedure laborious, time-consuming, and expensive. With whole-genome sequencing (WGS) becoming cheaper, it has huge potential in both diagnostics and routine surveillance. The aim of this study was to perform a real-time evaluation of WGS for routine typing and surveillance of verocytotoxin-producing Escherichia coli (VTEC). In Denmark, the Statens Serum Institut (SSI) routinely receives all suspected VTEC isolates. During a 7-week period in the fall of 2012, all incoming isolates were concurrently subjected to WGS using IonTorrent PGM. Real-time bioinformatics analysis was performed using web-tools (www.genomicepidemiology.org) for species determination, multilocus sequence type (MLST) typing, and determination of phylogenetic relationship, and a specific VirulenceFinder for detection of E. coli virulence genes was developed as part of this study. In total, 46 suspected VTEC isolates were characterized in parallel during the study. VirulenceFinder proved successful in detecting virulence genes included in routine typing, explicitly verocytotoxin 1 (vtx1), verocytotoxin 2 (vtx2), and intimin (eae), and also detected additional virulence genes. VirulenceFinder is also a robust method for assigning verocytotoxin (vtx) subtypes. A real-time clustering of isolates in agreement with the epidemiology was established from WGS, enabling discrimination between sporadic and outbreak isolates. Overall, WGS typing produced results faster and at a lower cost than the current routine. Therefore, WGS typing is a superior alternative to conventional typing strategies. This approach may also be applied to typing and surveillance of other pathogens. PMID:24574290

  2. Real-time whole-genome sequencing for routine typing, surveillance, and outbreak detection of verotoxigenic Escherichia coli.

    Science.gov (United States)

    Joensen, Katrine Grimstrup; Scheutz, Flemming; Lund, Ole; Hasman, Henrik; Kaas, Rolf S; Nielsen, Eva M; Aarestrup, Frank M

    2014-05-01

    Fast and accurate identification and typing of pathogens are essential for effective surveillance and outbreak detection. The current routine procedure is based on a variety of techniques, making the procedure laborious, time-consuming, and expensive. With whole-genome sequencing (WGS) becoming cheaper, it has huge potential in both diagnostics and routine surveillance. The aim of this study was to perform a real-time evaluation of WGS for routine typing and surveillance of verocytotoxin-producing Escherichia coli (VTEC). In Denmark, the Statens Serum Institut (SSI) routinely receives all suspected VTEC isolates. During a 7-week period in the fall of 2012, all incoming isolates were concurrently subjected to WGS using IonTorrent PGM. Real-time bioinformatics analysis was performed using web-tools (www.genomicepidemiology.org) for species determination, multilocus sequence type (MLST) typing, and determination of phylogenetic relationship, and a specific VirulenceFinder for detection of E. coli virulence genes was developed as part of this study. In total, 46 suspected VTEC isolates were characterized in parallel during the study. VirulenceFinder proved successful in detecting virulence genes included in routine typing, explicitly verocytotoxin 1 (vtx1), verocytotoxin 2 (vtx2), and intimin (eae), and also detected additional virulence genes. VirulenceFinder is also a robust method for assigning verocytotoxin (vtx) subtypes. A real-time clustering of isolates in agreement with the epidemiology was established from WGS, enabling discrimination between sporadic and outbreak isolates. Overall, WGS typing produced results faster and at a lower cost than the current routine. Therefore, WGS typing is a superior alternative to conventional typing strategies. This approach may also be applied to typing and surveillance of other pathogens.
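
    The gene-detection step can be caricatured as follows; this is a toy proxy for what a tool like VirulenceFinder does (BLAST-style identity searches against curated gene databases), and the "reference" sequences below are made-up placeholders, not real vtx1/vtx2/eae sequences.

      # Made-up fragments standing in for curated virulence-gene references.
      REFERENCE_GENES = {
          "vtx1_fragment": "ATGAAAATAATTATTTTTAGAGTGCTAACT",
          "vtx2_fragment": "ATGAAGTGTATATTATTTAAATGGGTACTG",
          "eae_fragment":  "ATGATTACTCATGGTTGTTATACTCGTACC",
      }

      def detect(contigs, genes, k=20):
          """Call a gene present if any k-mer of it occurs in a contig --
          a crude stand-in for the identity search real tools perform."""
          hits = set()
          for contig in contigs:
              for name, seq in genes.items():
                  if any(seq[i:i + k] in contig for i in range(len(seq) - k + 1)):
                      hits.add(name)
          return sorted(hits)

      assembly = ["CCGG" + REFERENCE_GENES["vtx2_fragment"] + "TTAA"]
      print(detect(assembly, REFERENCE_GENES))   # ['vtx2_fragment']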

  3. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    Science.gov (United States)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets: binary large objects (BLOBs) in database systems versus files in the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as for bandwidth and response-time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement of managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, such an implementation brings both storage overheads and constraints, and creates software licensing dependencies. Another approach is to store the files in an external filesystem, with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available.
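
    The trade-off between the two strategies can be made concrete with a small sketch. The Python fragment below is a minimal illustration, not the authors' implementation: one SQLite table stores tile bytes as BLOBs managed by the DBMS, while a second table stores only pointers to files held in an external filesystem (HDFS, in the paper's case). All table, file, and function names are invented for the example.

        # Two storage strategies for large raster tiles; illustrative only.
        import os
        import sqlite3

        conn = sqlite3.connect("tiles.db")
        conn.execute("CREATE TABLE IF NOT EXISTS tile_blobs (tile_id INTEGER PRIMARY KEY, data BLOB)")
        conn.execute("CREATE TABLE IF NOT EXISTS tile_files (tile_id INTEGER PRIMARY KEY, path TEXT)")

        def store_as_blob(tile_id, payload):
            # Strategy 1: the DBMS manages the bytes directly as a BLOB, so
            # storage management (and, on a cluster, declustering) is its job.
            conn.execute("INSERT OR REPLACE INTO tile_blobs VALUES (?, ?)", (tile_id, payload))
            conn.commit()

        def store_as_pointer(tile_id, payload, directory="tiles"):
            # Strategy 2: the bytes live in an external filesystem and the
            # database row only records where to find them.
            os.makedirs(directory, exist_ok=True)
            path = os.path.join(directory, "tile_%d.bin" % tile_id)
            with open(path, "wb") as f:
                f.write(payload)
            conn.execute("INSERT OR REPLACE INTO tile_files VALUES (?, ?)", (tile_id, path))
            conn.commit()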

  4. Database Description - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description: General information of database. Database name: JSNP. Alternative name: ... Creator: ... Science and Technology Agency. Creator affiliation / contact address / e-mail: ... Organism: Homo sapiens (Taxonomy ID: 9606). Database description: a database of about 197,000 polymorphisms in Japanese populat... Reference: ...1):605-610. External links: original website information. Database maintenance site: Institute of Medical Scien... User registration: not available.

  5. Database Description - RED | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description: General information of database. Database name: RED. Alternative name: Rice Expression Database. Contact: ...enome Research Unit, Shoshi Kikuchi. E-mail: ... Database classification: Plant databases - Rice; Microarray, Gene Expression. Organism: Oryza sativa (Taxonomy ID: 4530). Database description: ... Article title: Rice Expression Database: the gateway to rice functional genomics. Journal: ...nt Science (2002) Dec 7 (12):563-564. External links: original website information. Database maintenance site: ...

  6. Comparative performance measures of relational and object-oriented databases using High Energy Physics data

    International Nuclear Information System (INIS)

    Marstaller, J.

    1993-12-01

    The major experiments at the SSC are expected to produce up to 1 Petabyte of data per year. The use of database techniques can significantly reduce the time it takes to access data. The goal of this project was to test which underlying data model, the relational or the object-oriented, would be better suited for archiving and accessing high energy data. We describe the relational and the object-oriented data model and their implementation in commercial database management systems. To determine scalability we tested both implementations for 10-MB and 100-MB databases using storage and timing criteria.
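
    As a rough illustration of the study's timing criterion, the sketch below measures load and query times while the table grows by an order of magnitude. SQLite stands in for the commercial systems that were actually tested, and the schema and row counts are invented for the example.

        # Toy scalability probe: time bulk load and an aggregate query.
        import random
        import sqlite3
        import time

        def benchmark(n_rows):
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, energy REAL)")
            t0 = time.perf_counter()
            conn.executemany("INSERT INTO events (energy) VALUES (?)",
                             ((random.random(),) for _ in range(n_rows)))
            conn.commit()
            load_time = time.perf_counter() - t0
            t0 = time.perf_counter()
            conn.execute("SELECT COUNT(*), AVG(energy) FROM events WHERE energy > 0.5").fetchone()
            query_time = time.perf_counter() - t0
            return load_time, query_time

        # Scaling the table by 10x, as in the 10-MB/100-MB comparison, shows
        # whether load and access times grow proportionally.
        for n in (10_000, 100_000):
            load, query = benchmark(n)
            print("%d rows: load %.3fs, query %.3fs" % (n, load, query))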

  7. The Endogenous Origins of Experience, Routines and Organizational Capabilities

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Felin, Teppo

    2011-01-01

    In this paper we discuss the origins and emergence of organizational routines and capabilities. We first argue that there are theoretical and endogeneity-related concerns associated with the key antecedents and mechanisms specified by the extant routines and capabilities literature. Specifically, we explicate the behaviorist and empiricist foundations of the organizational routines and capabilities literature and the extant emphasis placed on experience, repetition, and observation as the key antecedents and mechanisms of routines and capabilities. Based on this discussion we highlight how a ... rationalist, choice-based approach can provide a more fruitful (though preliminary) foundation for understanding organizational behavior and capabilities.

  8. Database Description - ConfC | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description: General information of database. Database name: ConfC. Alternative name: ... Contact: Tamotsu Noguchi, Tel: 042-495-8736. E-mail: ... Database classification: Structure Databases - Protein structure; Structure Databases - Small molecules; Structure Databases - Nucleic acid structure. URL of Web services: - Need for user registration: -

  9. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy...

  10. Clinical Outcome of Degenerative Mitral Regurgitation: Critical Importance of Echocardiographic Quantitative Assessment in Routine Practice.

    Science.gov (United States)

    Antoine, Clemence; Benfari, Giovanni; Michelena, Hector I; Malouf, Joseph F; Nkomo, Vuyisile T; Thapa, Prabin; Enriquez-Sarano, Maurice

    2018-05-31

    Background: Echocardiographic quantitation of degenerative mitral regurgitation (DMR) is recommended whenever possible in clinical guidelines, but it has been criticized and its scalability to routine clinical practice doubted. We hypothesized that echocardiographic DMR quantitation, performed in routine clinical practice by multiple practitioners, independently predicts long-term survival and is thus essential to DMR management. Methods: We included patients diagnosed with isolated mitral valve prolapse in 2003-2011 and any degree of MR quantified by any physician/sonographer in routine clinical practice. Clinical/echocardiographic data acquired at diagnosis were retrieved electronically. The endpoint was mortality under medical treatment, analyzed by the Kaplan-Meier method and proportional-hazards models. Results: The cohort included 3914 patients (55% male) aged 62±17 years, with left ventricular ejection fraction (LVEF) 63±8% and routinely measured effective regurgitant orifice area (EROA) 19 [0-40] mm². During follow-up (6.7±3.1 years), 696 patients died under medical management and 1263 underwent mitral surgery. In multivariate analysis, routinely measured EROA was associated with mortality (adjusted hazard ratio 1.19 [1.13-1.24], p < ...) ... above the 40 mm² threshold. Conclusions: Echocardiographic DMR quantitation is scalable to routine practice and is independently associated with clinical outcome. Routinely measured EROA is strongly associated with long-term survival under medical treatment. Excess mortality vs. the general population appears in the "moderate" DMR range and steadily increases with higher EROA. Hence, individual EROA values should be integrated into therapeutic considerations, in addition to categorical DMR grading.

  11. The institutionalization of a routine

    DEFF Research Database (Denmark)

    Nickelsen, Niels Christian

    2008-01-01

    ...which has previously been treated largely in overview by institutionalism, plays an important role in the making of a routine. In my empirical study, I demonstrate that the concept and practice of the valve change, and that the valve is identified in a number of ways as it passes through the testing phase ... of production. I argue that the negotiation of these changes during test production is the fulcrum of the routinization of the production procedure. It is through these identity shifts that the valve is both reified and rendered producible and applicable in the customer world...

  12. Routine filtration for decubitus radiography during double contrast barium enema examinations

    International Nuclear Information System (INIS)

    Camellini, V.; Abelli, P.; Marconi, G.

    1987-01-01

    A wedge-shaped plexiglass compensation filter for use in lateral decubitus radiographs during double contrast barium enema examinations has been designed. This filter, in use since February 1985, was compared with a second plexiglass filter made by E-Z-EM. Voltage and amperage were kept constant. Two experiments were conducted to demonstrate the benefits of routine use of this compensation filter. First, the changes in skin dose were assessed using an ionization chamber on a phantom. Secondly, three radiologists examined a series of 80 consecutive barium enemas without knowing which had been performed using the new filter. Of the 70 examinations they considered excellent, as many as 45 had been performed with the new filter. Personal experience and the studies described show that the use of a compensation filter improves the accuracy and thus the diagnostic quality of the examinations, as it enhances the detail of the anatomic structure of the colon; moreover, the filter reduces skin exposure by up to 73.1% (E-Z-EM filter: 47.3%) and, at the same time, fewer radiographic films are needed. Routine use of a plexiglass compensation filter in lateral decubitus radiographs while performing a double contrast barium enema examination is strongly recommended, especially in obese patients.

  13. Software and Database Usage on Metabolomic Studies: Using XCMS on LC-MS Data Analysis

    Directory of Open Access Journals (Sweden)

    Mustafa Celebier

    2014-04-01

    Full Text Available The metabolome is the complete set of small-molecule metabolites found in a cell or a single organism. Metabolomics is the scientific study that determines and identifies the chemicals in the metabolome with advanced analytical techniques. Nowadays, elucidating the molecular mechanism of a disease through genome and proteome analysis alone is not sufficient; instead, a holistic assessment including metabolomic studies provides more rational and accurate results. Metabolite levels in an organism are associated with cellular functions, so determining metabolite amounts identifies the phenotype of a cell or tissue related to genetic and other variations. Even though the analysis of metabolites for medical diagnosis and therapy has been performed for a long time, studies to improve analysis methods for metabolite profiling have increased recently. The applications of metabolomics include the identification of biomarkers, enzyme-substrate interactions, drug-activity studies, metabolic pathway analysis, and other studies related to systems biology. Preprocessing and computing of the data obtained from LC-MS, GC-MS, CE-MS, and NMR for metabolite profiling help avoid time-consuming manual data analysis and possible random errors during profiling. In addition, such preprocessing allows the identification of low-abundance metabolites that cannot be analyzed by manual processing. Therefore, the use of software and databases for this purpose cannot be ignored. In this study, we briefly present the software and databases used in metabolomics and evaluate their capability for metabolite profiling. In particular, the performance of one of the most popular software packages, XCMS, for the evaluation of LC-MS results in metabolomics is overviewed. In the near future, metabolomics with software and database support is estimated to be a routine...
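
    One preprocessing step that such tools automate is deciding when peaks from different samples represent the same metabolite. The Python fragment below is a conceptual sketch of grouping LC-MS peaks by m/z and retention-time tolerance; it illustrates the idea only and is not XCMS's actual algorithm (XCMS itself is an R package). All names and tolerances are assumptions.

        # Greedy cross-sample peak grouping by m/z and retention-time windows.
        from dataclasses import dataclass

        @dataclass
        class Peak:
            sample: str
            mz: float         # mass-to-charge ratio
            rt: float         # retention time, seconds
            intensity: float

        def group_peaks(peaks, mz_tol=0.01, rt_tol=5.0):
            """Cluster peaks whose m/z and retention time fall within tolerance."""
            groups = []
            for p in sorted(peaks, key=lambda q: q.mz):
                for g in groups:
                    ref = g[0]
                    if abs(p.mz - ref.mz) <= mz_tol and abs(p.rt - ref.rt) <= rt_tol:
                        g.append(p)
                        break
                else:
                    groups.append([p])
            return groups

        peaks = [Peak("s1", 180.063, 120.0, 1e5), Peak("s2", 180.064, 122.5, 9e4)]
        print(len(group_peaks(peaks)))  # 1: the two peaks are matched as one feature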

  14. Guidelines for the Calibration of Routine Dosimetry Systems for use in Radiation Processing

    DEFF Research Database (Denmark)

    Sharpe, Peter; Miller, Arne

    A set of guidelines has been developed to assist in the calibration of routine dosimetry systems for use in industrial radiation processing plants. Topics covered include the calibration of equipment, the performance of calibration irradiations, and the derivation of mathematical functions...

  15. Oracle database 12c the complete reference

    CERN Document Server

    Bryla, Bob

    2014-01-01

    Maintain a scalable, highly available enterprise platform and reduce complexity by leveraging the powerful new tools and cloud enhancements of Oracle Database 12c. This authoritative Oracle Press guide offers complete coverage of installation, configuration, tuning, and administration. Find out how to build and populate Oracle databases, perform effective queries, design applications, and secure your enterprise data

  16. Active In-Database Processing to Support Ambient Assisted Living Systems

    Directory of Open Access Journals (Sweden)

    Wagner O. de Morais

    2014-08-01

    Full Text Available As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
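
    The trigger mechanism at the heart of this architecture can be sketched in a few lines. The fragment below is a minimal, hypothetical illustration using SQLite from Python: a trigger inside the database reacts to an incoming sensor row and writes an alert, so the raw data never has to leave the DBMS. The table names and the bed-exit rule are invented for the example, not taken from the paper.

        # Active-database sketch: detection logic lives inside the DBMS.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE sensor_events (resident TEXT, sensor TEXT, state TEXT,
                                    ts DEFAULT CURRENT_TIMESTAMP);
        CREATE TABLE alerts (resident TEXT, message TEXT,
                             ts DEFAULT CURRENT_TIMESTAMP);

        -- The trigger is the 'active' part: it fires on insert, so sensitive
        -- sensor data is processed without being exported from the database.
        CREATE TRIGGER bed_exit AFTER INSERT ON sensor_events
        WHEN NEW.sensor = 'bed_pressure' AND NEW.state = 'released'
        BEGIN
            INSERT INTO alerts (resident, message)
            VALUES (NEW.resident, 'possible bed exit');
        END;
        """)

        conn.execute("INSERT INTO sensor_events (resident, sensor, state) VALUES (?, ?, ?)",
                     ("room_12", "bed_pressure", "released"))
        print(conn.execute("SELECT resident, message FROM alerts").fetchall())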

  17. Scale out databases for CERN use cases

    International Nuclear Information System (INIS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Garcia, Daniel Lanza; Surdy, Kacper

    2015-01-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database. (paper)

  18. Danish Colorectal Cancer Group Database.

    Science.gov (United States)

    Ingeholm, Peter; Gögenur, Ismail; Iversen, Lene H

    2016-01-01

    The aim of the database, which has existed for registration of all patients with colorectal cancer in Denmark since 2001, is to improve the prognosis for this patient group. Study population: all Danish patients with newly diagnosed colorectal cancer who are either diagnosed or treated in a surgical department of a public Danish hospital. The database comprises an array of surgical, radiological, oncological, and pathological variables. The surgeons record data such as diagnostics performed, including type and results of radiological examinations, lifestyle factors, comorbidity and performance, treatment including the surgical procedure, urgency of surgery, and intra- and postoperative complications within 30 days after surgery. The pathologists record data such as tumor type, number of lymph nodes and metastatic lymph nodes, surgical margin status, and other pathological risk factors. The database has had >95% completeness in including patients with colorectal adenocarcinoma, with >54,000 patients registered so far, approximately one-third rectal cancers and two-thirds colon cancers, and an overrepresentation of men among rectal cancer patients. The stage distribution was more or less constant until 2014, with a tendency toward a lower rate of stage IV and a higher rate of stage I after introduction of the national screening program in 2014. The 30-day mortality rate after elective surgery has been reduced from >7% in 2001-2003 to below ...%. The database is a national population-based clinical database with high patient and data completeness for the perioperative period. The resolution of data is high for description of the patient at the time of diagnosis, including comorbidities, and for characterizing diagnosis, surgical interventions, and short-term outcomes. The database does not have high-resolution oncological data and does not register recurrences after primary surgery. The Danish Colorectal Cancer Group provides high-quality data and has been documenting an increase in short- and long...

  19. Ethnographic study of ICT-supported collaborative work routines in general practice.

    Science.gov (United States)

    Swinglehurst, Deborah; Greenhalgh, Trisha; Myall, Michelle; Russell, Jill

    2010-12-29

    Health informatics research has traditionally been dominated by experimental and quasi-experimental designs. An emerging area of study in organisational sociology is routinisation (how collaborative work practices become business-as-usual). There is growing interest in the use of ethnography and other in-depth qualitative approaches to explore how collaborative work routines are enacted and develop over time, and how electronic patient records (EPRs) are used to support collaborative work practices within organisations. Following Feldman and Pentland, we will use 'the organisational routine' as our unit of analysis. In a sample of four UK general practices, we will collect narratives, ethnographic observations, multi-modal (video and screen capture) data, documents and other artefacts, and analyse these to map and compare the different understandings and enactments of three common routines (repeat prescribing, coding and summarising, and chronic disease surveillance) which span clinical and administrative spaces and which, though 'mundane', have an important bearing on quality and safety of care. In a detailed qualitative analysis informed by sociological theory, we aim to generate insights about how complex collaborative work is achieved through the process of routinisation in healthcare organisations. Our study offers the potential not only to identify potential quality failures (poor performance, errors, failures of coordination) in collaborative work routines but also to reveal the hidden work and workarounds by front-line staff which bridge the model-reality gap in EPR technologies and via which "automated" safety features have an impact in practice.

  20. Database Description - RMG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description: General information of database. Database name: RMG. Alternative name: ... Contact address: ...raki 305-8602, Japan, National Institute of Agrobiological Sciences. E-mail: ... Database classification: Nucleotide Sequence Databases. Organism: Oryza sativa Japonica Group (Taxonomy ID: 39947). Reference journal: Mol Genet Genomics (2002) 268: 434-445. External links: original website information. URL of Web services: ... Need for user registration: not available.

  1. ORGANIZATIONAL ROUTINES IN RUSSIAN COMPANIES: REVIEW OF PRACTICES

    Directory of Open Access Journals (Sweden)

    Olga Valieva

    2014-10-01

    Full Text Available This article presents the results of the first stage of research conducted in 2012-2013. The research concerns the transformation of intra-corporate management practices in Russian companies and their subsequent institutionalization. Preliminary results showed that companies share a standard set of organizational routines, which includes information and administrative routines, routines of the founder's power, and genetic, institutional, and development routines. The research established a statistically significant connection between types of organizational structures, organization size, information processing, and administrative practices. It also revealed how changes in approaches to managing the organization can affect a corruption component.

  2. The Danish Testicular Cancer database

    Directory of Open Access Journals (Sweden)

    Daugaard G

    2016-10-01

    Full Text Available Gedske Daugaard,1 Maria Gry Gundgaard Kier,1 Mikkel Bandak,1 Mette Saksø Mortensen,1 Heidi Larsson,2 Mette Søgaard,2 Birgitte Groenkaer Toft,3 Birte Engvad,4 Mads Agerbæk,5 Niels Vilstrup Holm,6 Jakob Lauritsen1 1Department of Oncology 5073, Copenhagen University Hospital, Rigshospitalet, Copenhagen, 2Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, 3Department of Pathology, Copenhagen University Hospital, Rigshospitalet, Copenhagen, 4Department of Pathology, Odense University Hospital, Odense, 5Department of Oncology, Aarhus University Hospital, Aarhus, 6Department of Oncology, Odense University Hospital, Odense, Denmark Aim: The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. Study population: All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data collection has been performed from 1984 to 2007 and from 2013 onward, respectively. Main variables and descriptive data: The retrospective DaTeCa database contains detailed information with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function, lung function, etc. A questionnaire related to late effects has been conducted, which includes questions regarding social relationships, life situation, general health status, family background, diseases, symptoms, use of medication, marital status, psychosocial issues, fertility, and sexuality. TC survivors alive in October 2014 were invited to fill in this questionnaire, including 160 validated questions...

  3. Advanced technologies for scalable ATLAS conditions database access on the grid

    CERN Document Server

    Basset, R; Dimitrov, G; Girone, M; Hawkings, R; Nevski, P; Valassi, A; Vaniachine, A; Viegas, F; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations, an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflows, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing, characterized by peak loads that can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This was achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of the database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysi...

  4. Practice databases and their uses in clinical research.

    Science.gov (United States)

    Tierney, W M; McDonald, C J

    1991-04-01

    A few large clinical information databases have been established within larger medical information systems. Although they are smaller than claims databases, these clinical databases offer several advantages: accurate and timely data, rich clinical detail, and continuous parameters (for example, vital signs and laboratory results). However, the nature of the data varies considerably, which affects the kinds of secondary analyses that can be performed. These databases have been used to investigate clinical epidemiology, risk assessment, post-marketing surveillance of drugs, practice variation, resource use, quality assurance, and decision analysis. In addition, practice databases can be used to identify subjects for prospective studies. Further methodologic developments are necessary to deal with the prevalent problems of missing data and various forms of bias if such databases are to grow and contribute valuable clinical information.

  5. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The...

  6. Nuclear data processing using a database management system

    International Nuclear Information System (INIS)

    Castilla, V.; Gonzalez, L.

    1991-01-01

    A database management system that permits the design of relational models was used to create an integrated database with experimental and evaluated nuclear data. A system that reduces the time and cost of processing was created for EC-type computers and compatibles. A set of programs was developed for converting calculated nuclear data output formats to the EXFOR format. A dictionary to perform retrospective searches in the ENDF database was also created.

  7. Adaptive intrusion data system (AIDS) software routines

    International Nuclear Information System (INIS)

    Corlis, N.E.

    1980-07-01

    An Adaptive Intrusion Data System (AIDS) was developed to collect information from intrusion alarm sensors as part of an evaluation system to improve sensor performance. AIDS is a unique digital data-compression, storage, and formatting system; it also incorporates a capability for video selection and recording for assessment of the sensors monitored by the system. The system is software reprogrammable to numerous configurations that may be used for the collection of environmental, bilevel, analog, and video data. This report describes the software routines that control the different AIDS data-collection modes, the diagnostic programs to test the operating hardware, and the data format. Sample data printouts are also included

  8. Establishing a National Maternal Morbidity Outcome Indicator in England: A Population-Based Study Using Routine Hospital Data.

    Directory of Open Access Journals (Sweden)

    Manisha Nair

    Full Text Available As maternal deaths become rarer, monitoring near-miss or severe maternal morbidity becomes important as a tool to measure changes in care quality. Many calls have been made to use routinely available hospital administration data to monitor the quality of maternity care. We investigated (1) the feasibility of developing an English Maternal Morbidity Outcome Indicator (EMMOI) by reproducing an Australian indicator using routinely available hospital data, (2) the impact of modifications to the indicator to address potential data quality issues, and (3) the reliability of the indicator. We used data from 6,389,066 women giving birth in England from April 2003 to March 2013, available in the Hospital Episode Statistics (HES) database of the Health and Social Care Information Centre (HSCIC). A composite indicator, EMMOI, was generated from the diagnosis and procedure codes. Rates of individual morbid events included in the EMMOI were compared with the rates in the UK reported by population-based studies. EMMOI included 26 morbid events (17 diagnoses and 9 procedures). Selection of the individual morbid events was guided by the Australian indicator and published literature for conditions associated with maternal morbidity and mortality in the UK, but was mainly driven by the quality of the routine hospital data. Comparing the rates of individual morbid events of the indicator with figures from population-based studies showed that the possibility of false-positive and false-negative cases cannot be ruled out. While routine English hospital data can be used to generate a composite indicator to monitor trends in maternal morbidity during childbirth, the quality and reliability of this monitoring indicator depend on the quality of the hospital data, which is currently inadequate.
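
    The construction of such a composite indicator reduces to a simple rule: a delivery episode counts toward the indicator if any of its diagnosis or procedure codes appears in a predefined list. The Python sketch below illustrates that rule; the code lists are placeholders, not the actual EMMOI definitions.

        # Composite morbidity indicator from coded hospital episodes (illustrative).
        MORBID_DIAGNOSES = {"O72.1", "O75.1"}   # hypothetical ICD-10 codes
        MORBID_PROCEDURES = {"X33.1"}           # hypothetical procedure code

        def has_morbid_event(diagnoses, procedures):
            """True if any coded event matches the indicator's definition."""
            return bool(diagnoses & MORBID_DIAGNOSES or procedures & MORBID_PROCEDURES)

        deliveries = [
            {"diagnoses": {"O72.1"}, "procedures": set()},
            {"diagnoses": set(), "procedures": set()},
        ]
        rate = sum(has_morbid_event(d["diagnoses"], d["procedures"])
                   for d in deliveries) / len(deliveries)
        print("composite indicator rate: %.1f%%" % (100 * rate))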

  9. The Danish national quality database for births

    DEFF Research Database (Denmark)

    Andersson, Charlotte Brix; Flems, Christina; Kesmodel, Ulrik Schiøler

    2016-01-01

    Aim of the database: The aim of the Danish National Quality Database for Births (DNQDB) is to measure the quality of the care provided during birth through specific indicators. Study population: The database includes all hospital births in Denmark. Main variables: anesthesia/pain relief, continuous ... Medical Birth Registry. Registration in the Danish Medical Birth Registry is mandatory for all maternity units in Denmark. During the 5 years, performance has improved in the areas covered by the process indicators and for some of the outcome indicators. Conclusion: Measuring quality of care during...

  10. Analysis/design of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, I.; Lyu, W. S.

    2001-01-01

    Constructing a database from the data produced by tensile experiments can increase the application of test results. Basic data can be retrieved easily from the database when preparing new experiments, and higher-quality results can be produced by comparison with previous data. The development phase requires more specific analysis and design to construct the database; after that, the best quality can be offered for customers' various requirements. In this thesis, the analysis and design were performed to develop a database for tensile extension properties.

  11. Database Description - DGBY | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description: General information of database. Database name: DGBY. Alternative name: ... TEL: +81-29-838-8066. E-mail: ... Database classification: Microarray Data and other Gene Expression Databases. Organism: Saccharomyces cerevisiae (Taxonomy ID: 4932). Database description: ...(so-called phenomics). We uploaded these data on this website, which is designated DGBY (Database for Gene expres... Reference: ...ma J, Ando A, Takagi H. Journal: Yeast. 2008 Mar;25(3):179-90. External links: original website information.

  12. Database Description - KOME | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database Description: General information of database. Database name: KOME. Alternative name: ... Contact: ... Sciences, Plant Genome Research Unit, Shoshi Kikuchi. E-mail: ... Database classification: Plant databases - Rice. Organism: Oryza sativa (Taxonomy ID: 4530). Database description: information about approximately ... Reference: ...Hayashizaki Y, Kikuchi S. Journal: PLoS One. 2007 Nov 28; 2(11):e1235. External links: original website information; ...OS) Rice mutant panel database (Tos17); A Database of Plant Cis-acting Regulatory...

  13. A high-energy nuclear database proposal

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.; UC Davis, CA

    2006-01-01

    We propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from the Bevalac, AGS and SPS to RHIC and LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, we propose periodically performing evaluations of the data and summarizing the results in topical reviews. (author)

  14. ESTIMATION OF ROUTINE DISCHARGE OF RADIONUCLIDES ON POWER REACTOR EXPERIMENTAL RDE

    Directory of Open Access Journals (Sweden)

    P.M. Udiyani

    2017-02-01

    Full Text Available The experimental power reactor (RDE) planned to be constructed by BATAN is a kind of High Temperature Gas-cooled Reactor (HTGR) with 10 MWth power. The HTGR is a helium-gas-cooled reactor with TRISO-coated fuel that is able to confine fission products within the core. Although the fission products released into the environment are very small, studies of environmental radiation under normal or routine operating conditions need to be performed in order to comply with the regulations. Radiological assessment of the environment involves the source term released into the environment under routine operating conditions. The purpose of this study is to estimate the source term released into the environment based on postulated normal or routine operations of the RDE. The research approach starts from the assumption that there are defects and impurities in the TRISO fuel because of limitations in fabrication. The mechanism of fission-product release from the fuel to the environment was modeled based on the safety features design of the RDE. Radionuclide inventories in the reactor were calculated using ORIGEN-2, whose library has been modified for the HTGR type, and the assumed TRISO fuel defects and release fractions for each compartment of the RDE safety system used reference parameters. The results showed that the important source terms of the RDE are the noble gases (Kr and Xe), the halogens (I), Sr, Cs, H-3, and Ag. Activities of RDE source terms for routine operations do not differ significantly from HTGR source terms at the same power. Keywords: routine discharge, radionuclide, source term, RDE, HTGR

  15. Audit Database and Information Tracking System

    Data.gov (United States)

    Social Security Administration — This database contains information about the Social Security Administration's audits regarding SSA agency performance and compliance. These audits can be requested...

  16. A portable and independent edge fluctuation diagnostic. Final performance report, March 1992--March 1993

    International Nuclear Information System (INIS)

    Tsui, H.; Wootton, A.

    1994-01-01

    A compact, self-contained portable probe system has been designed and developed to diagnose the edge plasma of devices of different sizes and configurations. The system measures both the mean and the fluctuating quantities of density, temperature and potential from a standardized Langmuir probe array using a fast reciprocating probe drive. It can also be used for other fluctuation diagnostics, such as magnetic probes. The data acquisition and analysis are performed on a Macintosh IIfx, which provides a user-friendly environment. The results obtained by the signal processing routines are stored in a tabloid format to allow comparative studies. The resulting database is a core part of the portable signal analysis system. To date, measurements have been performed on the stellarator ATF, the reversed field pinch ZT40(m), and the tokamaks TEXT, Versator, Phaedrus-T and TFTR. The data are presently being analyzed and the results collected into the database for the purpose of edge turbulence and transport studies. Existing published data are also being included. The edge database, an output of this project, will provide readily available information for other experimental groups to compare their results with, and for theoretical groups to validate (or otherwise) the predictions of their models.

  17. The role of routine preoperative EUS when performed after contrast enhanced CT in the diagnostic work-up in patients suspected of pancreatic or periampullary cancer.

    Science.gov (United States)

    Cieslak, Kasia P; van Santvoort, Hjalmar C; Vleggaar, Frank P; van Leeuwen, Maarten S; ten Kate, Fibo J; Besselink, Marc G; Molenaar, I Quintus

    2014-01-01

    In patients suspected of pancreatic or periampullary cancer, abdominal contrast-enhanced computed tomography (CT) is the standard diagnostic modality. A supplementary endoscopic ultrasonography (EUS) is often performed, although there is only limited evidence of its additional diagnostic value. The aim of the study is to evaluate the additional diagnostic value of EUS over CT in deciding on exploratory laparotomy in patients suspected of pancreatic or periampullary cancer. We retrospectively analyzed 86 consecutive patients who routinely underwent CT and EUS before exploratory laparotomy with or without pancreatoduodenectomy for suspected pancreatic or periampullary carcinoma between 2007 and 2010. Primary outcomes were visibility of a mass, resectability on CT/EUS and resection with curative intent. A mass was visible on CT in 72/86 (84%) patients. In these 72 patients, EUS demonstrated a mass in 64/72 (89%) patients. Resectability was accurately predicted by CT in 65/72 (90%) and by EUS in 58/72 (81%) patients. In 14/86 (16%) patients no mass was seen on CT. EUS showed a mass in 12/14 (86%) of these patients. A malignant lesion was histological proven in 11/12 (92%) of these patients. Overall, resectability was accurately predicted by CT and EUS in 90% (77/86) and 84% (72/86), respectively. In patients with a visible mass on CT, suspected for pancreatic or periampullary cancer, EUS has no additional diagnostic value, does not influence the decision to perform laparotomy and should therefore not be performed routinely. In patients without a visible mass on CT, EUS is useful to confirm the presence of a tumor. Copyright © 2014 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  18. Yield of yearly routine physical examination in HIV-1 infected patients is limited : A retrospective cohort study in the Netherlands

    NARCIS (Netherlands)

    van Amsterdam, Marleen A.; van Assen, Sander; Sprenger, Herman G.; Wilting, Kasper R.; Stienstra, Ymkje; Bierman, Wouter F. W.

    2017-01-01

    Background Routine physical examinations might be of value in HIV-infected patients, but the yield is unknown. We determined the diagnoses that would have been missed without performing annual routine physical examinations in HIV-infected patients with stable disease. Methods Data were collected

  19. Breaking the Waves: Routines and Rituals in Entrepreneurship Education

    Science.gov (United States)

    Neergaard, Helle; Christensen, Dorthe Refslund

    2017-01-01

    Learning is related to the environment created for the learning experience. This environment is often highly routinized and involves a certain social structure, but in entrepreneurship education, such routinization and structure may actually counteract the learning goals. This article investigates how classroom routines and rituals impact on…

  20. Performance analysis of automated evaluation of Crithidia luciliae-based indirect immunofluorescence tests in a routine setting - strengths and weaknesses.

    Science.gov (United States)

    Hormann, Wymke; Hahn, Melanie; Gerlach, Stefan; Hochstrate, Nicola; Affeldt, Kai; Giesen, Joyce; Fechner, Kai; Damoiseaux, Jan G M C

    2017-11-27

    Antibodies directed against dsDNA are a highly specific diagnostic marker for the presence of systemic lupus erythematosus and of particular importance in its diagnosis. To assess anti-dsDNA antibodies, the Crithidia luciliae-based indirect immunofluorescence test (CLIFT) is one of the assays considered to be the best choice. To overcome the drawback of subjective result interpretation that is inherent to indirect immunofluorescence assays in general, automated systems have been introduced into the market in recent years. Among these systems is the EUROPattern Suite, an advanced automated fluorescence microscope equipped with different software packages, capable of automated pattern interpretation and result suggestion for ANA, ANCA and CLIFT analysis. We analyzed the performance of the EUROPattern Suite with its automated fluorescence interpretation for CLIFT in a routine setting, reflecting the everyday life of a diagnostic laboratory. Three hundred and twelve consecutive samples were collected, sent to the Central Diagnostic Laboratory of the Maastricht University Medical Centre with a request for anti-dsDNA analysis over a period of 7 months. Agreement between EUROPattern assay analysis and the visual read was 93.3%. Sensitivity and specificity were 94.1% and 93.2%, respectively. The EUROPattern Suite performed reliably and greatly supported result interpretation. Automated image acquisition is readily performed and automated image classification gives a reliable recommendation for assay evaluation to the operator. The EUROPattern Suite optimizes workflow and contributes to standardization between different operators or laboratories.

  1. Online Database Allows for Quick and Easy Monitoring and Reporting of Supplementary Feeding Program Performance: An Analysis of World Vision CMAM Programs (2006-2013)

    International Nuclear Information System (INIS)

    Emary, Colleen; Aidam, Bridget; Roberton, Tim

    2014-01-01

    Full text: Background: Despite the widespread implementation of interventions to address moderate acute malnutrition (MAM), the lack of robust monitoring systems has hindered evaluation of the effectiveness of approaches to prevent and treat MAM. Since 2006, World Vision (WV) has provided supplementary feeding to 280,518 children 6-59 months of age (U5) and 105,949 pregnant and lactating women (PLW) as part of Community-based Management of Acute Malnutrition (CMAM) programming. The Excel-based system initially used for monitoring individual site programs faced numerous challenges: it was time consuming, prone to human error, and lost data as a result of staff turnover, and hence the use of data to inform program performance was limited. In 2010, World Vision International (WVI)'s Nutrition Centre of Expertise (NCOE) established an online database to overcome these limitations. The aim of the database was to improve monitoring and reporting of WV's CMAM programs. As of December 2013, the database has been rolled out in 14 countries: Burundi, Chad, DRC, Ethiopia, Kenya, Mali, Mauritania, Niger, Sudan, Pakistan, South Sudan, Somalia, Zimbabwe and Zambia. Methods: The database includes data on admissions (mid-upper arm circumference, weight for height, oedema, referral) and discharge outcomes (recovered, died, defaulted, non-recovered, referral) for Supplementary Feeding Programs (SFPs) for children U5 as well as PLWs. A quantitative analysis of the available data sets was conducted to identify issues with data quality and to draw findings from the data itself. Variations in program performance as compared to Sphere standards were determined by country and aggregated over the 14 countries. In addition, time-trend analyses were conducted to determine significant differences and seasonality effects. Results: Most data related to program admissions from 2010 to July 2013, though some retrospective program data were available from 2006 to 2009. The countries with the largest number...

  2. An Open-Source Auto-Calibration Routine Supporting the Stormwater Management Model

    Science.gov (United States)

    Tiernan, E. D.; Hodges, B. R.

    2017-12-01

    The stormwater management model (SWMM) is a clustered model that relies on subcatchment-averaged parameter assignments to correctly capture catchment stormwater runoff behavior. Model calibration is considered a critical step for SWMM performance, an arduous task that most stormwater management designers undertake manually. This research presents an open-source, automated calibration routine that increases the efficiency and accuracy of the model calibration process. The routine makes use of a preliminary sensitivity analysis to reduce the dimensions of the parameter space, at which point a multi-objective genetic algorithm (a modified Non-dominated Sorting Genetic Algorithm II) determines the Pareto front for the objective functions within the parameter space. The solutions on this Pareto front represent the optimized parameter value sets for the catchment behavior that could not have been reasonably obtained through manual calibration.
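
    The multi-objective step can be illustrated with plain Pareto dominance. The Python sketch below scores random parameter sets against two toy error functions and keeps the non-dominated set; the actual routine uses a modified NSGA-II and evaluates its objectives by running SWMM, so the objective functions and population handling here are simplified assumptions.

        # Pareto-front extraction over two calibration-error objectives (toy).
        import random

        def objectives(params):
            # Stand-ins for two calibration errors (e.g., peak-flow error and
            # runoff-volume error) that a real routine computes from SWMM runs.
            a, b = params
            return ((a - 0.3) ** 2 + 0.1 * b, (b - 0.7) ** 2 + 0.1 * a)

        def dominates(f, g):
            # f dominates g if it is no worse on every objective, better on one.
            return all(x <= y for x, y in zip(f, g)) and any(x < y for x, y in zip(f, g))

        population = [(random.random(), random.random()) for _ in range(200)]
        scored = [(p, objectives(p)) for p in population]
        pareto = [(p, f) for p, f in scored
                  if not any(dominates(g, f) for _, g in scored)]
        print("%d non-dominated parameter sets found" % len(pareto))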

  3. Architecture of Automated Database Tuning Using SGA Parameters

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2012-05-01

    Full Text Available Business data always grows, from kilobytes to megabytes, gigabytes, terabytes, petabytes, and beyond. There is no way to avoid this increasing rate of data while the business is still running. Because of this, database tuning is a critical part of an information system. Tuning a database in a cost-effective manner is a growing challenge. The total cost of ownership (TCO) of information technology needs to be significantly reduced by minimizing people costs. In fact, mistakes in operations and administration of information systems are the single most common reason for system outage and unacceptable performance [3]. One way of addressing the challenge of total cost of ownership is by making information systems more self-managing. A particularly difficult piece of the ambitious vision of making database systems self-managing is the automation of database performance tuning. In this paper, we explain the progress made thus far on this important problem. Specifically, we propose the architecture and algorithm for this problem.
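
    The feedback loop behind such self-tuning can be sketched briefly. The Python fragment below grows or shrinks one memory parameter (in Oracle terms, an SGA component such as the buffer cache) based on an observed cache hit ratio; the thresholds, step sizes, and metric source are assumptions for illustration, not the paper's algorithm.

        # Toy self-tuning loop: observe a metric, adjust a memory parameter.
        def tune_buffer_cache(current_mb, hit_ratio, min_mb=256, max_mb=8192):
            if hit_ratio < 0.90:              # too many misses: grow the cache
                return min(max_mb, int(current_mb * 1.25))
            if hit_ratio > 0.99:              # likely oversized: reclaim memory
                return max(min_mb, int(current_mb * 0.9))
            return current_mb                 # within the target band: no change

        size = 1024
        for observed in (0.85, 0.88, 0.95, 0.995):   # simulated monitoring samples
            size = tune_buffer_cache(size, observed)
            print("hit ratio %.3f -> cache %d MB" % (observed, size))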

  4. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis supported by a novel visualization technique to better understand the daily routines of older adults and highlight their patterns of daily activities and normal variability in physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. Workflow for daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variabilities. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and timing of performing certain activities across individuals. Also, when workflow approach was applied to spatial information of activities, the analysis revealed the ability to provide meaningful data on individuals' mobility in different levels of life spaces from home to community. Results suggest that using workflows to characterize the daily activities of older adults will be helpful for clinicians and researchers in understanding their daily routines and preparing education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  6. 42 CFR 493.841 - Standard; Routine chemistry.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard; Routine chemistry. 493.841 Section 493.841 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... These Tests § 493.841 Standard; Routine chemistry. (a) Failure to attain a score of at least 80 percent...

  7. Query Processing and Interlinking of Fuzzy Object-Oriented Database

    OpenAIRE

    Shweta Dwivedi; Santosh Kumar

    2017-01-01

    Due to the many limitations and poor data handling of existing relational databases, software professionals and researchers have moved toward object-oriented databases, which have a much better capability for handling real and complex real-world data (i.e., clear and crisp data) and can also perform huge and complex queries in an effective manner. On the other hand, a new approach in databases has been introduced, named the Fuzzy Object-Oriented Database (FOOD); it has all the ...

  8. High performance technique for database applicationsusing a hybrid GPU/CPU platform

    KAUST Repository

    Zidan, Mohammed A.; Bonny, Talal; Salama, Khaled N.

    2012-01-01

    ...hybrid GPU/CPU platform. In particular, our technique solves the problem of the low efficiency resulting from running short-length sequences in a database on a GPU. To verify our technique, we applied it to the widely used Smith-Waterman algorithm.
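
    For reference, the computation being accelerated is the standard Smith-Waterman local-alignment dynamic program. The Python version below is a plain O(n*m) CPU implementation with a linear gap penalty, included for clarity; the paper's contribution is scheduling such computations across GPU and CPU, not the recurrence itself.

        # Smith-Waterman local alignment score, textbook formulation.
        def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
            rows, cols = len(a) + 1, len(b) + 1
            H = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    # Local alignment: cell scores are floored at zero.
                    H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                    best = max(best, H[i][j])
            return best  # score of the best local alignment

        print(smith_waterman("GATTACA", "GCATGCU"))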

  9. Radiation protection databases of nuclear safety regulatory authority

    International Nuclear Information System (INIS)

    Janzekovic, H.; Vokal, B.; Krizman, M.

    2003-01-01

    Radiation protection and nuclear safety of nuclear installations share a common objective: protection against ionising radiation. The operational safety of a nuclear power plant is evaluated using performance indicators such as collective radiation exposure, unit capability factor, unplanned capability loss factor, etc. As stated by WANO (World Association of Nuclear Operators), the performance indicators are 'a management tool so each operator can monitor its own performance and progress, set challenging goals for improvement and consistently compare performance with that of other plants or industry'. In order to make the analysis of performance indicators feasible for an operator as well as for regulatory authorities, a suitable database should be created from the data related to a facility or facilities. Moreover, international bodies have found that comparing radiation protection in nuclear facilities across countries is feasible only if databases with well-defined parameters are established. The article briefly describes the development of international databases on radiation protection related to nuclear facilities, and presents issues related to the possible development of efficient radiation protection control of a nuclear facility, based on the experience of the Slovenian Nuclear Safety Administration. (author)

  10. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Selection of thermodynamic data of cobalt and nickel

    International Nuclear Information System (INIS)

    Kitamura, Akira; Yui, Mikazu; Kirishima, Akira; Saito, Takumi; Shibutani, Sanae; Tochiyama, Osamu

    2009-11-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level and TRU wastes, the selection of thermodynamic data on the inorganic compounds and complexes of cobalt and nickel has been carried out. For cobalt, an extensive literature survey was performed, and all the obtained literature was carefully reviewed to select the thermodynamic data. The selection of thermodynamic data for nickel was based on a thermodynamic database published by the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA), which was carefully reviewed by the authors; thermodynamic data were then selected after surveying the latest literature. Based on the similarity of chemical properties between cobalt and nickel, complementary thermodynamic data for nickel and cobalt species expected under geological disposal conditions were selected to complete the thermodynamic data set for the performance assessment of geological disposal of radioactive wastes. (author)

  11. The Diagnostic Value of Routine Contrast Esophagram in Anastomotic Leaks After Esophagectomy.

    Science.gov (United States)

    Hu, Zhongwu; Wang, Xiaowe; An, Xush; Li, Wenjin; Feng, Yun; You, Zhenbing

    2017-08-01

    Routine contrast esophagram has been shown to be increasingly limited in diagnosing anastomotic leaks after esophagectomy. Patients undergoing esophagectomy from 2013 to 2014 at Huai'an First People's Hospital were identified. We retrospectively analyzed patients who underwent routine contrast esophagram on postoperative day 7 (range 6-10) to rule out anastomotic leaks after esophagectomy. Of 846 patients who underwent esophagectomy, a cervical anastomosis was performed in 286 and an intrathoracic anastomosis in 560. There were 57 (6.73%) cases with anastomotic leaks, comprising cervical leaks in 36 patients and intrathoracic leaks in 21. Among the cervical anastomotic leak patients, 13 were diagnosed by early local clinical symptoms and 23 underwent routine contrast esophagram, yielding 7 (30.4%) true-positive, 11 (47.8%) false-negative, and 5 (21.8%) equivocal results. Among the intrathoracic anastomotic leak patients, 4 (19%) were diagnosed by clinical symptoms, 16 (76.2%) were true positives, and 1 (4.8%) was a false negative. Aspiration occurred in five patients with cervical anastomoses and in eight patients with intrathoracic anastomoses; aspiration pneumonitis did not occur in these cases. Gastrografin and barium are safe contrast agents for post-esophagectomy contrast esophagram. Because of its low sensitivity in detecting cervical anastomotic leaks, routine contrast esophagram is not advised for cervical anastomoses. For patients with intrathoracic anastomoses, it remains an effective method for detecting anastomotic leaks.

  12. Cost-effectiveness of routine imaging of suspected appendicitis.

    Science.gov (United States)

    D'Souza, N; Marsden, M; Bottomley, S; Nagarajah, N; Scutt, F; Toh, S

    2018-01-01

    Introduction The misdiagnosis of appendicitis and consequent removal of a normal appendix occurs in one in five patients in the UK. By contrast, in healthcare systems with routine cross-sectional imaging of suspected appendicitis, the negative appendicectomy rate is around 5%. If we could reduce the rate in the UK to similar numbers, would this be cost effective? This study aimed to calculate the financial impact of negative appendicectomy at the Queen Alexandra Hospital and to explore whether a policy of routine imaging of such patients could reduce hospital costs. Materials and methods We performed a retrospective analysis of all appendicectomies over a 1-year period at our institution. Data were extracted on outcomes including appendix histology, operative time and length of stay to calculate the negative appendicectomy rate and to analyse costs. Results A total of 531 patients over 5 years of age had an appendicectomy. The negative appendicectomy rate was 22% (115/531). The additional financial cost of negative appendicectomy to the hospital during this period was £270,861. Universal imaging of all patients with right iliac fossa pain that achieved a 5% negative appendicectomy rate would cost between £67,200 and £165,600 per year but could save £33,896 (magnetic resonance imaging), £105,896 (computed tomography) or £132,296 (ultrasound), depending on the imaging modality used. Conclusions Negative appendicectomy is still too frequent and places an additional financial burden on the health service. Routine imaging of patients with suspected appendicitis would not only reduce the negative appendicectomy rate but could lead to cost savings and a better service for our patients.
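
    The quoted trade-off reduces to simple arithmetic: each per-modality saving equals an implied avoidable negative-appendicectomy cost minus that modality's annual imaging cost. The sketch below reproduces the abstract's figures; the per-modality costs and the implied avoidable cost are back-calculated assumptions consistent with the quoted £67,200-£165,600 range, not values taken from the paper.

      # Back-calculated from the abstract: every quoted saving equals an implied
      # avoidable negative-appendicectomy cost of ~GBP 199,496 minus the modality's
      # annual imaging cost. These inputs are inferences, not figures from the paper.
      IMPLIED_AVOIDABLE_COST = 199_496  # GBP, inferred from the quoted savings

      imaging_cost = {"ultrasound": 67_200, "CT": 93_600, "MRI": 165_600}  # GBP/year, inferred

      print("negative appendicectomy rate:", round(115 / 531 * 100, 1), "%")  # 21.7 ~ 22%
      for modality, cost in imaging_cost.items():
          print(f"{modality}: net saving GBP {IMPLIED_AVOIDABLE_COST - cost:,}")
      # ultrasound: 132,296   CT: 105,896   MRI: 33,896 -- matching the abstract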

  13. Database Description - SSBD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database - Database name: SSBD. Creator: Shuichi Onami, RIKEN Quantitative Biology Center, 2-2-3 Minatojima-minamimachi, Chuo-ku, Kobe 650-0047, Japan. Database classification: Other Molecular Biology Databases; Dynamic database. Organisms: Caenorhabditis elegans (Taxonomy ID: 6239), Escherichia coli (Taxonomy ID: 562). Journal: Bioinformatics, April 2015, Volume 31, Issue 7.

  14. Database Description - GETDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database - Database name: GETDB. Alternative name: Gal4 Enhancer Trap Insertion Database. DOI: 10.18908/lsdba.nbdc00236-000. Creator: Shigeo Haya... (Chuo-ku, Kobe 650-0047; Tel: +81-78-306-3185; Fax: +81-78-306-3183). Database classification: Expression; Invertebrate genome database. Organism: Drosophila melanogaster (Taxonomy ID: 7227). Database maintenance site: Drosophila Genetic Resource...

  15. Development of a biomarkers database for the National Children's Study

    Energy Technology Data Exchange (ETDEWEB)

    Lobdell, Danelle T [US Environmental Protection Agency, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Human Studies Division, Epidemiology and Biomarkers Branch, MD 58A, Research Triangle Park, NC 27711 (United States); Mendola, Pauline [US Environmental Protection Agency, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Human Studies Division, Epidemiology and Biomarkers Branch, MD 58A, Research Triangle Park, NC 27711 (United States)

    2005-08-07

    The National Children's Study (NCS) is a federally-sponsored, longitudinal study of environmental influences on the health and development of children across the United States (www.nationalchildrensstudy.gov). Current plans are to study approximately 100,000 children and their families beginning before birth up to age 21 years. To explore potential biomarkers that could be important measurements in the NCS, we compiled the relevant scientific literature to identify both routine or standardized biological markers as well as new and emerging biological markers. Although the search criteria encouraged examination of factors that influence the breadth of child health and development, attention was primarily focused on exposure, susceptibility, and outcome biomarkers associated with four important child health outcomes: autism and neurobehavioral disorders, injury, cancer, and asthma. The Biomarkers Database was designed to allow users to: (1) search the biomarker records compiled by type of marker (susceptibility, exposure or effect), sampling media (e.g., blood, urine, etc.), and specific marker name; (2) search the citations file; and (3) read the abstract evaluations relative to our search criteria. A searchable, user-friendly database of over 2000 articles was created and is publicly available at: http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=85844. PubMed was the primary source of references with some additional searches of Toxline, NTIS, and other reference databases. Our initial focus was on review articles, beginning as early as 1996, supplemented with searches of the recent primary research literature from 2001 to 2003. We anticipate this database will have applicability for the NCS as well as other studies of children's environmental health.

  16. Development of a biomarkers database for the National Children's Study

    International Nuclear Information System (INIS)

    Lobdell, Danelle T.; Mendola, Pauline

    2005-01-01

    The National Children's Study (NCS) is a federally-sponsored, longitudinal study of environmental influences on the health and development of children across the United States (www.nationalchildrensstudy.gov). Current plans are to study approximately 100,000 children and their families beginning before birth up to age 21 years. To explore potential biomarkers that could be important measurements in the NCS, we compiled the relevant scientific literature to identify both routine or standardized biological markers as well as new and emerging biological markers. Although the search criteria encouraged examination of factors that influence the breadth of child health and development, attention was primarily focused on exposure, susceptibility, and outcome biomarkers associated with four important child health outcomes: autism and neurobehavioral disorders, injury, cancer, and asthma. The Biomarkers Database was designed to allow users to: (1) search the biomarker records compiled by type of marker (susceptibility, exposure or effect), sampling media (e.g., blood, urine, etc.), and specific marker name; (2) search the citations file; and (3) read the abstract evaluations relative to our search criteria. A searchable, user-friendly database of over 2000 articles was created and is publicly available at: http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=85844. PubMed was the primary source of references with some additional searches of Toxline, NTIS, and other reference databases. Our initial focus was on review articles, beginning as early as 1996, supplemented with searches of the recent primary research literature from 2001 to 2003. We anticipate this database will have applicability for the NCS as well as other studies of children's environmental health

  17. Integrated database for rapid mass movements in Norway

    Directory of Open Access Journals (Sweden)

    C. Jaedicke

    2009-03-01

    terrain of the Norwegian west coast, but major events are recorded all over the country. Snow avalanches account for most fatalities, while large rock slides causing flood waves and huge quick clay slides are the most damaging individual events in terms of damage to infrastructure and property and for causing multiple fatalities. The quality of the data is strongly influenced by the personal engagement of local observers and varying observation routines. This database is a unique source for statistical analysis, including risk analysis and the relation between rapid mass movements and climate. The database of rapid mass movement events will also facilitate validation of national hazard and risk maps.

  18. JICST Factual DatabaseJICST Chemical Substance Safety Regulation Database

    Science.gov (United States)

    Abe, Atsushi; Sohma, Tohru

    JICST Chemical Substance Safety Regulation Database is based on the Database of Safety Laws for Chemical Compounds constructed by the Japan Chemical Industry Ecology-Toxicology & Information Center (JETOC), sponsored by the Science and Technology Agency, in 1987. JICST modified the JETOC database system, added data and started the online service through JOIS-F (JICST Online Information Service - Factual database) in January 1990. The JICST database comprises eighty-three laws and fourteen hundred compounds. The authors outline the database, data items, files and search commands. An example of an online session is presented.

  19. Performance Improvement with Web Based Database on Library Information System of Smk Yadika 5

    Directory of Open Access Journals (Sweden)

    Pualam Dipa Nusantara

    2015-12-01

    Full Text Available The difficulty of managing the book collection data in the library is a problem often faced by librarians, and it affects the quality of service. Arranging and recording the book collection in separate Word and Excel files, together with handling borrowing and returning transactions, left no integrated records. A library system can manage the book collection. Such a system can reduce the problems library staff often experience when serving students borrowing books, where it is frequently difficult to track books still out on loan. The system also records the late fees and lost-book charges incurred by students (borrowers). The conclusion of this study is that library performance can be improved with a library system using a web database.
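
    A minimal sqlite3 sketch of the kind of integrated loan record the abstract says was missing from the Word/Excel workflow; the schema, table names, and sample data are illustrative, not the system described in the paper.

      import sqlite3

      conn = sqlite3.connect(":memory:")  # illustrative; a web deployment would use a server database
      conn.executescript("""
      CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT NOT NULL);
      CREATE TABLE loan (
          id INTEGER PRIMARY KEY,
          book_id INTEGER NOT NULL REFERENCES book(id),
          borrower TEXT NOT NULL,
          borrowed_on TEXT NOT NULL,
          returned_on TEXT,               -- NULL while the book is still out
          fee INTEGER NOT NULL DEFAULT 0  -- late or lost-book fee, set on return
      );
      INSERT INTO book VALUES (1, 'Database Systems');
      INSERT INTO loan (book_id, borrower, borrowed_on) VALUES (1, 'student-042', '2015-11-02');
      """)
      # Books still out on loan -- the question the abstract says was hard to answer
      # when records lived in separate Word and Excel files:
      for title, borrower, since in conn.execute("""
              SELECT b.title, l.borrower, l.borrowed_on
              FROM loan l JOIN book b ON b.id = l.book_id
              WHERE l.returned_on IS NULL"""):
          print(title, borrower, since)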

  20. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  1. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  2. Exploration of a Vision for Actor Database Systems

    DEFF Research Database (Denmark)

    Shah, Vivek

    of these services. Existing popular approaches to building these services either use an in-memory database system or an actor runtime. We observe that these approaches have complementary strengths and weaknesses. In this dissertation, we propose the integration of actor programming models in database systems. In doing so, we lay down a vision for a new class of systems called actor database systems. To explore this vision, this dissertation crystallizes the notion of an actor database system by defining its feature set in light of current application and hardware trends. In order to explore the viability of the outlined vision, a new programming model named Reactors has been designed to enrich classic relational database programming models with logical actor programming constructs. To support the reactor programming model, a high-performance in-memory multi-core OLTP database system named REACTDB has been built...

  3. Routine Cross-Sectional Head Imaging Before Electroconvulsive Therapy: A Tertiary Center Experience.

    Science.gov (United States)

    Sajedi, Payam I; Mitchell, Jason; Herskovits, Edward H; Raghavan, Prashant

    2016-04-01

    Electroconvulsive therapy (ECT) is generally contraindicated in patients with intracranial mass lesions or in the presence of increased intracranial pressure. The purpose of this study was to determine the prevalence of incidental abnormalities on routine cross-sectional head imaging, including CT and MRI, that would preclude subsequent ECT. This retrospective study involved a review of the electronic medical records of 105 patients (totaling 108 imaging studies) between April 27, 2007, and March 20, 2015, referred for cranial CT or MRI with the primary indication of pre-ECT evaluation. The probability of occurrence of imaging findings that would preclude ECT was computed. A cost analysis was also performed on the practice of routine pre-ECT imaging. Of the 105 patients who presented with the primary indication of ECT clearance (totaling 108 scans), 1 scan (0.93%) revealed findings that precluded ECT. None of the studies demonstrated findings that indicated increased intracranial pressure. A cost analysis revealed that at least $18,662.70 and 521.97 relative value units must be expended to identify one patient with intracranial pathology precluding ECT. The findings of this study demonstrate an extremely low prevalence of findings that preclude ECT on routine cross-sectional head imaging. The costs incurred in identifying a potential contraindication are high. The authors suggest that the performance of pre-ECT neuroimaging be driven by the clinical examination. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  4. Generic Entity Resolution in Relational Databases

    Science.gov (United States)

    Sidló, Csaba István

    Entity Resolution (ER) covers the problem of identifying distinct representations of real-world entities in heterogeneous databases. We consider the generic formulation of ER problems (GER) with exact outcome. In practice, input data usually resides in relational databases and can grow to huge volumes. Yet, typical solutions described in the literature employ standalone memory resident algorithms. In this paper we utilize facilities of standard, unmodified relational database management systems (RDBMS) to enhance the efficiency of GER algorithms. We study and revise the problem formulation, and propose practical and efficient algorithms optimized for RDBMS external memory processing. We outline a real-world scenario and demonstrate the advantage of algorithms by performing experiments on insurance customer data.
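
    The core idea of pushing ER work into the RDBMS can be illustrated with a blocked self-join: candidate pairs are generated inside the database and filtered with a cheap pairwise match rule. The schema, the zip-code blocking key, and the shared-phone match rule below are illustrative stand-ins, not the paper's algorithms.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, zip TEXT, phone TEXT);
      INSERT INTO customer VALUES
        (1, 'J. Smith',   '1011', '555-0101'),
        (2, 'John Smith', '1011', '555-0101'),
        (3, 'Jane Roe',   '2022', '555-0202');
      """)
      # Candidate duplicate pairs computed inside the RDBMS: block on zip code so
      # the self-join never compares records across blocks, then apply a cheap
      # pairwise match rule (a shared phone number) to each surviving pair.
      pairs = conn.execute("""
          SELECT a.id, b.id
          FROM customer a JOIN customer b
            ON a.zip = b.zip AND a.id < b.id
          WHERE a.phone = b.phone
      """).fetchall()
      print(pairs)  # [(1, 2)]

    Blocking keeps the comparison count far below the quadratic worst case, which is what makes external-memory processing inside the database practical for large inputs.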

  5. Improvement of database on glass dissolution

    International Nuclear Information System (INIS)

    Hayashi, Maki; Sasamoto, Hiroshi; Yoshikawa, Hideki

    2008-03-01

    In a geological disposal system, high-level radioactive waste (HLW) glass is expected to retain radionuclides for the long term as the first barrier preventing radionuclide release. Advancing its performance assessment technology improves the reliability of the safety assessment of the entire geological disposal system. For this purpose, phenomenological studies that improve scientific understanding of dissolution/alteration mechanisms, and the development of robust dissolution/alteration models based on the study outcomes, are indispensable. The database on glass dissolution has been developed to support these studies. This report describes improvements to the prototype glass database. It also gives an example of applying the database to the reliability assessment of a glass dissolution model. (author)

  6. Management of virtualized infrastructure for physics databases

    International Nuclear Information System (INIS)

    Topurov, Anton; Gallerani, Luigi; Chatal, Francois; Piorkowski, Mariusz

    2012-01-01

    Demands for information storage of physics metadata are rapidly increasing, together with the requirements for its high availability. Most HEP laboratories are struggling to squeeze more from their computer centers and thus focus on virtualizing available resources. CERN started investigating database virtualization in early 2006, first by testing database performance and stability on native Xen. Since then we have been closely evaluating the constantly evolving functionality of virtualisation solutions for the database and middle tier, together with the associated management applications: Oracle's Enterprise Manager and VM Manager. This session will detail our long experience in dealing with virtualized environments, focusing on the newest Oracle OVM 3.0 for x86 and on Oracle Enterprise Manager functionality for efficiently managing a virtualized database infrastructure.

  7. Clinical Neuropathology practice news 1-2014: Pyrosequencing meets clinical and analytical performance criteria for routine testing of MGMT promoter methylation status in glioblastoma

    Science.gov (United States)

    Preusser, Matthias; Berghoff, Anna S.; Manzl, Claudia; Filipits, Martin; Weinhäusel, Andreas; Pulverer, Walter; Dieckmann, Karin; Widhalm, Georg; Wöhrer, Adelheid; Knosp, Engelbert; Marosi, Christine; Hainfellner, Johannes A.

    2014-01-01

    Testing of the MGMT promoter methylation status in glioblastoma is relevant for clinical decision making and research applications. Two recent and independent phase III therapy trials confirmed a prognostic and predictive value of the MGMT promoter methylation status in elderly glioblastoma patients. Several methods for MGMT promoter methylation testing have been proposed, but seem to be of limited test reliability. Therefore, and also due to feasibility reasons, translation of MGMT methylation testing into routine use has been protracted so far. Pyrosequencing after prior DNA bisulfite modification has emerged as a reliable, accurate, fast and easy-to-use method for MGMT promoter methylation testing in tumor tissues (including formalin-fixed and paraffin-embedded samples). We performed an intra- and inter-laboratory ring trial which demonstrates a high analytical performance of this technique. Thus, pyrosequencing-based assessment of MGMT promoter methylation status in glioblastoma meets the criteria of high analytical test performance and can be recommended for clinical application, provided that strict quality control is performed. Our article summarizes clinical indications, practical instructions and open issues for MGMT promoter methylation testing in glioblastoma using pyrosequencing. PMID:24359605

  8. Mobile object retrieval in server-based image databases

    Science.gov (United States)

    Manger, D.; Pagel, F.; Widak, H.

    2013-05-01

    The increasing number of mobile phones equipped with powerful cameras leads to huge collections of user-generated images. To utilize the information in the images on site, image retrieval systems are becoming more and more popular as a way to search for similar objects in one's own image database. As the computational performance and memory capacity of mobile devices are constantly increasing, this search can often be performed on the device itself. This is feasible, for example, if the images are represented with global image features or if the search is done using EXIF or textual metadata. However, for larger image databases, if multiple users are meant to contribute to a growing image database, or if powerful content-based image retrieval methods with local features are required, a server-based image retrieval backend is needed. In this work, we present a content-based image retrieval system with a client-server architecture working with local features. On the server side, scalability to large image databases is addressed with the popular bag-of-words model with state-of-the-art extensions. The client end of the system focuses on a lightweight user interface presenting the most similar images in the database, highlighting the visual information they have in common with the query image. Additionally, new images can be added to the database, making it a powerful and interactive tool for mobile content-based image retrieval.
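
    A minimal sketch of the server-side bag-of-words matching the abstract refers to: local descriptors are quantized against a visual vocabulary and images are ranked by histogram similarity. The vocabulary size, descriptor dimensionality, and random data below are illustrative stand-ins, not the paper's pipeline.

      import numpy as np

      rng = np.random.default_rng(1)
      vocab = rng.random((100, 32))  # visual vocabulary: 100 "words" of 32-D local features

      def bag_of_words(descriptors):
          """Quantize each local descriptor to its nearest visual word and
          return an L2-normalized word histogram for the whole image."""
          dists = np.linalg.norm(descriptors[:, None, :] - vocab[None, :, :], axis=2)
          hist = np.bincount(dists.argmin(axis=1), minlength=len(vocab)).astype(float)
          return hist / (np.linalg.norm(hist) or 1.0)

      database = [bag_of_words(rng.random((200, 32))) for _ in range(50)]  # server-side index
      query = bag_of_words(rng.random((180, 32)))                          # from the phone
      ranked = sorted(range(len(database)), key=lambda i: -(query @ database[i]))
      print(ranked[:5])  # ids of the most similar images, best first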

  9. Database Description - KAIKOcDNA | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database - Database name: KAIKOcDNA. Creator: Akiya Jouraku, National Institute of Agrobiological Sciences. Database classification: Nucleotide Sequence Databases. Organism: Bombyx mori (Taxonomy ID: 7091). Journal: G3 (Bethesda) / 2013, Sep / vol. 9.

  10. Download - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Trypanosomes Database - Download. First of all, please read the license of this database. Simple search and download; download via FTP (the FTP server is sometimes jammed; if it is, access [here]).

  11. License - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Arabidopsis Phenome Database - License to Use This Database. Last updated: 2017/02/27. This license specifies the terms regarding the use of this database and the requirements you must follow in using it. The license for this database is specified in the Creative Commons Attribution-Share Alike 4.0 International. If you use data from this database, please be sure to attribute this database.

  12. Databases for BaBar Datastream Calibrations and Prompt Reconstruction Processes

    International Nuclear Information System (INIS)

    Bartelt, John E

    1998-01-01

    We describe the design of databases used for performing datastream calibrations in the BABAR experiment, involving data accumulated on multiple processors and possibly over several blocks of events ('ConsBlocks'). The database for tracking the history and status of the ConsBlocks, along with similar databases needed by 'Prompt Reconstruction', is also described

  13. The use of the hybrid K-edge densitometer for routine analysis of safeguards verification samples of reprocessing input liquor

    International Nuclear Information System (INIS)

    Ottmar, H.; Eberle, H.

    1991-01-01

    Following successful tests of a hybrid K-edge instrument at TUI Karlsruhe and the routine use of a K-edge densitometer for safeguards verification at the same laboratory, the Euratom Safeguards Directorate of the Commission of the European Communities decided to install the first such instrument into a large industrial reprocessing plant for the routine verification of samples taken from the input accountancy tanks. This paper reports on the installation, calibration, sample handling procedure and the performance of this instrument after one year of routine operation

  14. A review of patient safety measures based on routinely collected hospital data.

    Science.gov (United States)

    Tsang, Carmen; Palmer, William; Bottle, Alex; Majeed, Azeem; Aylin, Paul

    2012-01-01

    The literature on patient safety measures derived from routinely collected hospital data was reviewed to inform indicator development. MEDLINE and Embase databases and Web sites were searched. Of 1738 citations, 124 studies describing the application, evaluation, or validation of hospital-based medical error or complication of care measures were reviewed. Studies were frequently conducted in the United States (n = 88) between 2005 and 2009 (n = 77) using Agency for Healthcare Research and Quality patient safety indicators (PSIs; n = 79). The most frequently cited indicators included "postoperative hemorrhage or hematoma" and "accidental puncture and laceration." Indicator refinement is supported by international coding algorithm translations but is hampered by data issues, including coding inconsistencies. The validity of PSIs and similar adverse event screens beyond internal measurement and the effects of organizational factors on patient harm remain uncertain. Development of PSIs in ambulatory care settings, including general practice and psychiatric care, needs consideration.

  15. A Unit-Test Framework for Database Applications

    DEFF Research Database (Denmark)

    Christensen, Claus Abildgaard; Gundersborg, Steen; de Linde, Kristian

    The outcome of a test of an application that stores data in a database naturally depends on the state of the database. It is therefore important that test developers are able to set up and tear down database states in a simple and efficient manner. In existing unit-test frameworks, setting up ... test can be minimized. In addition, the reuse between unit tests can speed up the execution of test suites. A performance test on a medium-size project shows a 40% speed-up and an estimated 25% reduction in the number of lines of test code.
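
    The setup/teardown discipline the abstract targets can be illustrated with Python's built-in unittest and an in-memory SQLite database; this is a generic sketch of the pattern, not the framework described in the paper.

      import sqlite3
      import unittest

      class AccountTest(unittest.TestCase):
          def setUp(self):
              # Build the known database state this test depends on.
              self.conn = sqlite3.connect(":memory:")
              self.conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
              self.conn.execute("INSERT INTO account VALUES (1, 100)")

          def tearDown(self):
              # Tear the state down so tests stay independent of each other.
              self.conn.close()

          def test_withdrawal(self):
              self.conn.execute("UPDATE account SET balance = balance - 40 WHERE id = 1")
              balance, = self.conn.execute("SELECT balance FROM account WHERE id = 1").fetchone()
              self.assertEqual(balance, 60)

      if __name__ == "__main__":
          unittest.main()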

  16. Influence of the centrifuge time of primary plasma tubes on routine coagulation testing.

    Science.gov (United States)

    Lippi, Giuseppe; Salvagno, Gian Luca; Montagnana, Martina; Manzato, Franco; Guidi, Gian Cesare

    2007-07-01

    Preparation of blood specimens is a major bottleneck in the laboratory throughput. Reliable strategies for reducing the time required for specimen processing without affecting quality should be acknowledged, especially for laboratories performing stat analyses. The present investigation was planned to establish a minimal suitable centrifuge time for primary samples collected for routine coagulation testing. Five sequential primary vacuum tubes containing 0.109 mol/l buffered trisodium citrate were collected from 10 volunteers and were immediately centrifuged on a conventional centrifuge at 1500 x g, at room temperature for 1, 2, 5, 10 and 15 min, respectively. Hematological and routine coagulation testing, including prothrombin time, activated partial thromboplastin time and fibrinogen, were performed. The centrifugation time was inversely associated with residual blood cell elements in plasma, especially platelets. Statistically significant variations from the reference 15-min centrifuge specimens were observed for fibrinogen in samples centrifuged for 5 min at most and for the activated partial thromboplastin time in samples centrifuged for 2 min at most. Meaningful biases related to the desirable bias were observed for fibrinogen in samples centrifuged for 2 min at most, and for the activated partial thromboplastin time in samples centrifuged for 1 min at most. According to our experimental conditions, a 5-10 min centrifuge time at 1500 x g may be suitable for primary tubes collected for routine coagulation testing.

  17. Acute and Time-Course Effects of Traditional and Dynamic Warm-Up Routines in Young Elite Junior Tennis Players.

    Directory of Open Access Journals (Sweden)

    Francisco Ayala

    Full Text Available Despite the large number of studies that have examined the acute effects of different warm-up modalities (WU) on physical performance, none of them have documented the time course of potential performance recovery in tennis players. The aim of this study was twofold: (a) to analyze and compare the acute effects of two different WU modalities (traditional WU [TWU] and dynamic WU [DWU]) on physical performance (i.e., CMJ, sprint, serve speed and accuracy) in elite junior players, and (b) to monitor the time course of any WU-induced changes after 30 and 60 min of simulated match play. Twelve junior elite players completed both WU modalities (TWU and DWU) in a counterbalanced order on separate days. In each experimental session, countermovement jump (CMJ), 20-m sprint, and tennis serve speed and accuracy tests were performed before (immediately after TWU or DWU), during (30 min), and after 60 min of simulated match play. Measures were compared via four factorial (WU intervention x time) repeated-measures ANOVAs. There were main effects of WU (TWU and DWU) across time for all the variables analysed. The results indicate that the DWU routine led to significantly faster 20-m sprint times and higher CMJs, as well as faster and more accurate tennis serves, at both the post-warm-up and 30-min match-play testing points in comparison with the scores reported for the TWU routine (p 75-99%). No significant intergroup differences were found at the 60-min match-play testing point in any variable (except the 20-m sprint). Therefore, the findings of this study recommend that, for optimal performance in these elite tennis players, DWU routines should be performed prior to formal training and competition rather than TWU routines.

  18. Developing of database on nuclear power engineering and purchase of ORACLE system

    International Nuclear Information System (INIS)

    Liu Renkang

    1996-01-01

    This paper presents a point of view on the development of a database for nuclear power engineering and on the performance of the ORACLE database management system. The ORACLE system is a practical database system and a sound purchase.

  19. Proposal for a High Energy Nuclear Database

    International Nuclear Information System (INIS)

    Brown, David A.; Vogt, Ramona

    2005-01-01

    We propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from Bevalac and AGS to RHIC to CERN-LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, we propose periodically performing evaluations of the data and summarizing the results in topical reviews

  20. Data Cleaning and Semantic Improvement in Biological Databases

    Directory of Open Access Journals (Sweden)

    Apiletti Daniele

    2006-12-01

    Full Text Available Public genomic and proteomic databases can be affected by a variety of errors. These errors may involve either the description or the meaning of data (namely, syntactic or semantic errors. We focus our analysis on the detection of semantic errors, in order to verify the accuracy of the stored information. In particular, we address the issue of data constraints and functional dependencies among attributes in a given relational database. Constraints and dependencies show semantics among attributes in a database schema and their knowledge may be exploited to improve data quality and integration in database design, and to perform query optimization and dimensional reduction.
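
    The functional-dependency checking described above can be sketched directly in SQL: a candidate dependency A -> B holds only if no value of A co-occurs with more than one distinct value of B. The schema and the deliberately inconsistent row below are illustrative, not data from the paper.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE protein (accession TEXT, organism TEXT, taxonomy_id INTEGER);
      INSERT INTO protein VALUES
        ('P001', 'E. coli', 562),
        ('P002', 'E. coli', 562),
        ('P003', 'E. coli', 561);   -- violates organism -> taxonomy_id
      """)
      # The dependency organism -> taxonomy_id holds iff no organism value
      # is paired with more than one distinct taxonomy_id.
      violations = conn.execute("""
          SELECT organism, COUNT(DISTINCT taxonomy_id)
          FROM protein GROUP BY organism
          HAVING COUNT(DISTINCT taxonomy_id) > 1
      """).fetchall()
      print(violations)  # [('E. coli', 2)] -- a semantic-error candidate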

  1. Database Description - AcEST | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database - Database name: AcEST. Contact: Hachioji, Tokyo 192-0397; Tel: +81-42-677-1111 (ext. 3654). Organism: Adiantum capillus-veneris (Taxonomy ID: 13818). Database description: a database of EST sequences of Adiantum capillus-veneris. Database maintenance site: Plant Environmental Res...

  2. The NAGRA/PSI thermochemical database: new developments

    International Nuclear Information System (INIS)

    Hummel, W.; Berner, U.; Thoenen, T.; Pearson, F.J.Jr.

    2000-01-01

    The development of a high quality thermochemical database for performance assessment is a scientifically fascinating and demanding task, and is not simply collecting and recording numbers. The final product can be visualised as a complex building with different storeys representing different levels of complexity. The present status report illustrates the various building blocks which we believe are integral to such a database structure. (authors)

  3. The NAGRA/PSI thermochemical database: new developments

    Energy Technology Data Exchange (ETDEWEB)

    Hummel, W.; Berner, U.; Thoenen, T. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Pearson, F.J.Jr. [Ground-Water Geochemistry, New Bern, NC (United States)

    2000-07-01

    The development of a high quality thermochemical database for performance assessment is a scientifically fascinating and demanding task, and is not simply collecting and recording numbers. The final product can be visualised as a complex building with different storeys representing different levels of complexity. The present status report illustrates the various building blocks which we believe are integral to such a database structure. (authors)

  4. Ethnographic study of ICT-supported collaborative work routines in general practice

    Science.gov (United States)

    2010-01-01

    Background Health informatics research has traditionally been dominated by experimental and quasi-experimental designs. An emerging area of study in organisational sociology is routinisation (how collaborative work practices become business-as-usual). There is growing interest in the use of ethnography and other in-depth qualitative approaches to explore how collaborative work routines are enacted and develop over time, and how electronic patient records (EPRs) are used to support collaborative work practices within organisations. Methods/design Following Feldman and Pentland, we will use 'the organisational routine' as our unit of analysis. In a sample of four UK general practices, we will collect narratives, ethnographic observations, multi-modal (video and screen capture) data, documents and other artefacts, and analyse these to map and compare the different understandings and enactments of three common routines (repeat prescribing, coding and summarising, and chronic disease surveillance) which span clinical and administrative spaces and which, though 'mundane', have an important bearing on quality and safety of care. In a detailed qualitative analysis informed by sociological theory, we aim to generate insights about how complex collaborative work is achieved through the process of routinisation in healthcare organisations. Discussion Our study offers the potential not only to identify potential quality failures (poor performance, errors, failures of coordination) in collaborative work routines but also to reveal the hidden work and workarounds by front-line staff which bridge the model-reality gap in EPR technologies and via which "automated" safety features have an impact in practice. PMID:21190583

  5. License - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available SKIP Stemcell Database - License to Use This Database. Last updated: 2017/03/13. This license specifies the terms regarding the use of this database and the requirements you must follow in using it. The license for this database is specified in the Creative Commons Attribution-Share Alike 4.0 International. If you use data from this database, please be sure to attribute this database. The summary of the Creative Commons Attribution-Share Alike 4.0 International is found here.

  6. Distortion-Free Watermarking Approach for Relational Database Integrity Checking

    Directory of Open Access Journals (Sweden)

    Lancine Camara

    2014-01-01

    Full Text Available Nowadays, the internet is becoming a common way of accessing databases. Such data are exposed to various types of attack aimed at confusing ownership proofing or defeating content protection. In this paper, we propose a new approach based on fragile zero watermarking for the authentication of numeric relational data. Contrary to some previous database watermarking techniques, which introduce distortions into the original database and may not preserve data usability constraints, our approach simply generates the watermark from the original database. First, the adopted method partitions the database relation into independent square matrix groups. Then, group-based watermarks are securely generated and registered with a trusted third party. Integrity verification is performed by computing the determinant and the diagonal's minor for each group. As a result, tampering can be localized down to the attribute group level. Theoretical and experimental results demonstrate that the proposed technique is resilient against tuple insertion, tuple deletion, and attribute value modification attacks. Furthermore, comparison with a recent related effort shows that our scheme performs better in detecting multifaceted attacks.
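
    A minimal numpy sketch of the verification step described above: numeric attribute values are folded into square groups, and each group's determinant and leading diagonal minor serve as its fragile watermark, so any change flips the signature of exactly that group. The group size, sample data, and rounding are illustrative simplifications of the paper's scheme.

      import numpy as np

      GROUP = 3  # illustrative group size: each group is a GROUP x GROUP matrix

      def group_signatures(values):
          """Fold numeric attribute values into square groups and return, per group,
          the (determinant, leading diagonal minor) pair used as a fragile watermark."""
          usable = len(values) - len(values) % (GROUP * GROUP)
          blocks = np.asarray(values[:usable], dtype=float).reshape(-1, GROUP, GROUP)
          return [(round(np.linalg.det(m), 6), round(np.linalg.det(m[:-1, :-1]), 6))
                  for m in blocks]

      original = list(range(1, 19))            # stand-in for numeric tuple data
      registered = group_signatures(original)  # stored with the trusted third party

      tampered = original[:]
      tampered[4] += 1                         # someone modifies one attribute value
      for g, (ref, check) in enumerate(zip(registered, group_signatures(tampered))):
          if ref != check:
              print(f"tampering localized to group {g}")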

  7. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Refinement of thermodynamic data for trivalent actinoids and samarium

    International Nuclear Information System (INIS)

    Kitamura, Akira; Fujiwara, Kenso; Yui, Mikazu

    2010-01-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level radioactive and TRU wastes, the refinement of the thermodynamic data for the inorganic compounds and complexes of trivalent actinoids (actinium(III), plutonium(III), americium(III) and curium(III)) and samarium(III) was carried out. Refinement of thermodynamic data for these elements was based on the thermodynamic database for americium published by the Nuclear Energy Agency in the Organisation for Economic Co-operation and Development (OECD/NEA). Based on the similarity of chemical properties among trivalent actinoids and samarium, complementary thermodynamic data for their species expected under the geological disposal conditions were selected to complete the thermodynamic data set for the performance assessment of geological disposal of radioactive wastes. (author)

  8. Design and implementation of the ITPA confinement profile database

    Energy Technology Data Exchange (ETDEWEB)

    Walters, Malcolm E-mail: malcolm.walters@ukaea.org.uk; Roach, Colin

    2004-06-01

    One key goal of the fusion program is to improve the accuracy of physics models in describing existing experiments, so as to make better predictions of the performance of future fusion devices. To support this goal, databases of experimental results from multiple machines have been assembled to facilitate the testing of physics models over a wide range of operating conditions and plasma parameters. One such database was the International Multi-Tokamak Profile Database. This database has more recently been substantially revamped to exploit newer technologies, and is now known as the ITPA confinement profile database http://www.tokamak-profiledb.ukaea.org.uk. The overall design of the updated system will be outlined and the implementation of the relational database part will be described in detail.

  9. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R&D programs. IOC is a linkage control system between sub-projects for sharing and integrating research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage the documents and reports produced over the course of the project.

  10. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R&D programs. IOC is a linkage control system between sub-projects for sharing and integrating research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage the documents and reports produced over the course of the project.

  11. Using relational databases to collect and store discrete-event simulation results

    DEFF Research Database (Denmark)

    Poderys, Justas; Soler, José

    2016-01-01

    , export the results to a data carrier file and then process the results stored in a file using the data processing software. In this work, we propose to save the simulation results directly from a simulation tool to a computer database. We implemented a link between the discrete-event simulation tool... and the database and performed a performance evaluation of three different open-source database systems. We show that, with the right choice of database system, simulation results can be collected and exported up to 2.67 times faster, and use 1.78 times less disk space, when compared to using the simulation software built...
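
    The direct tool-to-database link can be approximated with a thin buffered writer that batches result rows into the database instead of appending them to a results file. The schema, batch size, and SQLite backend below are illustrative; the paper evaluates three different open-source database systems.

      import sqlite3

      class ResultWriter:
          """Collects (time, metric, value) samples from a running simulation and
          flushes them to the database in batches instead of writing a results file."""
          def __init__(self, conn, batch_size=1000):
              self.conn, self.batch_size, self.buffer = conn, batch_size, []
              conn.execute("CREATE TABLE IF NOT EXISTS result (t REAL, metric TEXT, value REAL)")

          def record(self, t, metric, value):
              self.buffer.append((t, metric, value))
              if len(self.buffer) >= self.batch_size:
                  self.flush()

          def flush(self):
              self.conn.executemany("INSERT INTO result VALUES (?, ?, ?)", self.buffer)
              self.conn.commit()
              self.buffer.clear()

      writer = ResultWriter(sqlite3.connect(":memory:"))
      for step in range(10_000):                 # stand-in for the simulation loop
          writer.record(step * 0.01, "queue_len", step % 7)
      writer.flush()

    Batched inserts amortize the per-statement and per-commit overhead, which is where most of the reported speed-up over file export plus post-processing would come from.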

  12. Comparison of performance indicators of different types of reactors based on ISOE database

    International Nuclear Information System (INIS)

    Janzekovic, H.; Krizman, M.

    2005-01-01

    The optimisation of the operation of a nuclear power plant (NPP) is a challenging issue because, besides general management issues, the risk associated with nuclear facilities must be taken into account. In order to optimise the radiation protection programmes in around 440 reactors in operation, with more than 500 000 monitored workers each year, the international exchange of performance indicators (PI) related to radiation protection issues seems essential. These indicators are a function of the type of reactor as well as the age and the quality of the management of the reactor. In general, three main types of radiation protection PI can be recognised: occupational exposure of workers, public exposure, and management of PI related to radioactive waste. Occupational exposure can be studied efficiently using the ISOE database. The dependence of occupational exposure on different types of reactors, e.g. PWR and BWR, is given, analysed and compared. (authors)

  13. Effects of reviewing routine practices on learning outcomes in continuing education.

    Science.gov (United States)

    Mamede, Silvia; Loyens, Sofie; Ezequiel, Oscarina; Tibiriçá, Sandra; Penaforte, Júlio; Schmidt, Henk

    2013-07-01

    Conventional continuing medical education (CME) has been shown to have modest effects on doctor performance. New educational approaches based on the review of routine practices have brought better results. Little is known about factors that affect the outcomes of these approaches, especially in middle-income countries. This study aimed to investigate factors that influence the learning and quality of clinical performance in CME based on reflection upon experiences. A questionnaire and a clinical performance test were administered to 165 general practitioners engaged in a CME programme in Brazil. The questionnaire assessed behaviours related to four input variables (individual reflection on practices, peer review of experiences, self-regulated learning and learning skills) and two mediating variables (identification of learning needs and engagement in learning activities, the latter consisting of self-study of scientific literature, consultations about patient problems, and attendance at courses). Structural equation modelling was used to test a hypothesised model of relationships between these variables and the outcome variable of clinical performance, measured by the clinical performance test. After minor adjustments, the hypothesised model fit the empirical data well. Individual reflection fostered identification of learning needs, but also directly positively influenced the quality of clinical performance. Peer review did not affect identification of learning needs, but directly positively affected clinical performance. Learning skills and self-regulation did not help in identifying learning needs, but self-regulation enhanced study of the scientific literature, the learning activity that most positively influenced clinical performance. Consultation with colleagues, the activity most frequently triggered by the identification of learning needs, did not affect performance, and attendance of courses had only limited effect. This study shed light on the factors

  14. Database Description - RPSD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database - Database name: RPSD. Alternative name: Rice Protein Structure Database. DOI: 10.18908/lsdba.nbdc00749-000. Creator: Toshimasa Yamazaki, National Institute of Agrobiological Sciences, Ibaraki 305-8602, Japan. Database classification: Structure Databases - Protein structure. Organism: Or... Database maintenance site: National Institu...

  15. Managing Rock and Paleomagnetic Data Flow with the MagIC Database: from Measurement and Analysis to Comprehensive Archive and Visualization

    Science.gov (United States)

    Koppers, A. A.; Minnett, R. C.; Tauxe, L.; Constable, C.; Donadini, F.

    2008-12-01

    The Magnetics Information Consortium (MagIC) is commissioned to implement and maintain an online portal to a relational database populated by rock and paleomagnetic data. The goal of MagIC is to archive all measurements and derived properties for studies of paleomagnetic directions (inclination, declination) and intensities, and for rock magnetic experiments (hysteresis, remanence, susceptibility, anisotropy). Organizing data for presentation in peer-reviewed publications or for ingestion into databases is a time-consuming task, and to facilitate these activities, three tightly integrated tools have been developed: MagIC-PY, the MagIC Console Software, and the MagIC Online Database. A suite of Python scripts is available to help users port their data into the MagIC data format. They allow the user to add important metadata, perform basic interpretations, and average results at the specimen, sample and site levels. These scripts have been validated for use as Open Source software under the UNIX, Linux, PC and Macintosh© operating systems. We have also developed the MagIC Console Software program to assist in collating rock and paleomagnetic data for upload to the MagIC database. The program runs in Microsoft Excel© on both Macintosh© computers and PCs. It performs routine consistency checks on data entries, and assists users in preparing data for uploading into the online MagIC database. The MagIC website is hosted under EarthRef.org at http://earthref.org/MAGIC/ and has two search nodes, one for paleomagnetism and one for rock magnetism. Both nodes provide query building based on location, reference, methods applied, material type and geological age, as well as a visual FlashMap interface to browse and select locations. Users can also browse the database by data type (inclination, intensity, VGP, hysteresis, susceptibility) or by data compilation to view all contributions associated with previous databases, such as PINT, GMPDB or TAFI or other user
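
    The per-level averaging performed by the MagIC-PY scripts can be illustrated with the standard unit-vector mean of paleomagnetic directions (the first step of a Fisher mean). This is a generic sketch under that assumption, not code from the MagIC tools; the sample directions are made up.

      import numpy as np

      def mean_direction(decs, incs):
          """Average paleomagnetic directions (declination/inclination, in degrees)
          by summing unit vectors -- the standard first step of a Fisher mean."""
          d, i = np.radians(decs), np.radians(incs)
          xyz = np.column_stack((np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)))
          x, y, z = xyz.sum(axis=0)
          r = np.sqrt(x * x + y * y + z * z)
          return np.degrees(np.arctan2(y, x)) % 360, np.degrees(np.arcsin(z / r))

      # e.g. averaging specimen directions up to the sample level:
      print(mean_direction([350.0, 5.0, 12.0], [45.0, 50.0, 47.0]))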

  16. Database Description - FANTOM5 | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database - Database name: FANTOM5. Organisms include: Rattus norvegicus (Taxonomy ID: 10116), Macaca mulatta (Taxonomy ID: 9544). Database maintenance site: RIKEN Center for Life Science Technologies. Web services: not available. Need for user registration: not available.

  17. Advanced technologies for scalable ATLAS conditions database access on the grid

    International Nuclear Information System (INIS)

    Basset, R; Canali, L; Girone, M; Hawkings, R; Valassi, A; Viegas, F; Dimitrov, G; Nevski, P; Vaniachine, A; Walker, R; Wong, A

    2010-01-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic work-flow, ATLAS database scalability tests provided feedback for Conditions Db software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing characterized by peak loads, which can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent jobs rates. This has been achieved through coordinated database stress tests performed in series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that the server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions Db data access is limited by the disk I/O throughput. An unacceptable side-effect of the disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions Db data at all ten ATLAS Tier-1 sites using the technology of Oracle Streams. To avoid such bottlenecks we prototyped and tested a novel approach for database peak load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library sends to the database server a pilot query first.
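
    A sketch of the pilot-query idea in the last sentence: probe the server with a trivial query first, and submit the real workload only when the probe's latency indicates spare capacity. The latency threshold, backoff policy, and SQLite stand-in connection are illustrative assumptions; the ATLAS utility library's actual interface is not described in the abstract.

      import random
      import sqlite3
      import time

      PILOT_LATENCY_LIMIT = 0.5  # seconds; illustrative overload threshold

      def pilot_latency(connect):
          """Time a trivial query; its latency is a cheap proxy for server load."""
          start = time.monotonic()
          connect().execute("SELECT 1").fetchone()
          return time.monotonic() - start

      def query_with_peak_avoidance(connect, run_real_query, max_tries=5):
          for attempt in range(max_tries):
              if pilot_latency(connect) < PILOT_LATENCY_LIMIT:
                  return run_real_query()  # spare capacity: send the real workload
              # Overloaded: jittered exponential backoff before probing again.
              time.sleep(2 ** attempt + random.random())
          raise RuntimeError("conditions database stayed overloaded")

      print(query_with_peak_avoidance(lambda: sqlite3.connect(":memory:"),
                                      lambda: "real query result"))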

  18. NoSQL databases

    OpenAIRE

    Mrozek, Jakub

    2012-01-01

    This thesis deals with database systems referred to as NoSQL databases. In the second chapter, I explain basic terms and the theory of database systems. A short explanation is dedicated to database systems based on the relational data model and the SQL standardized query language. Chapter Three explains the concept and history of the NoSQL databases, and also presents database models, major features and the use of NoSQL databases in comparison with traditional database systems. In the fourth ...

  19. Utility of multispectral imaging for nuclear classification of routine clinical histopathology imagery

    Directory of Open Access Journals (Sweden)

    Harvey Neal R

    2007-07-01

    Full Text Available Abstract Background We present an analysis of the utility of multispectral versus standard RGB imagery for routine H&E stained histopathology images, in particular for pixel-level classification of nuclei. Our multispectral imagery has 29 spectral bands, spaced 10 nm within the visual range of 420-700 nm. It has been hypothesized that the additional spectral bands contain further information useful for classification compared to the 3 standard bands of RGB imagery. We present analyses of our data designed to test this hypothesis. Results For classification using all available image bands, we find the best performance (equal tradeoff between detection rate and false alarm rate) is obtained from either the multispectral or our "ccd" RGB imagery, with an overall increase in performance of 0.79% compared to the next best performing image type. For classification using single image bands, the single best multispectral band (in the red portion of the spectrum) gave a performance increase of 0.57% compared to the performance of the single best RGB band (red). Additionally, red bands had the highest coefficients/preference in our classifiers. Principal components analysis of the multispectral imagery indicates only two significant image bands, which is not surprising given the presence of two stains. Conclusion Our results indicate that multispectral imagery for routine H&E stained histopathology provides minimal additional spectral information for a pixel-level nuclear classification task beyond what standard RGB imagery provides.
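
    The PCA observation (two significant components for a two-stain preparation) is easy to reproduce on synthetic data. The sketch below simulates 29-band pixels driven by two latent stain concentrations and shows the variance collapsing onto two components; all data here are synthetic stand-ins, not the paper's imagery.

      import numpy as np

      rng = np.random.default_rng(0)
      n_pixels, n_bands = 5000, 29
      # Simulate H&E-like pixels: two latent stain concentrations mixed through
      # two fixed spectral signatures, plus a little sensor noise (all synthetic).
      stains = rng.random((n_pixels, 2))
      signatures = rng.random((2, n_bands))
      pixels = stains @ signatures + 0.01 * rng.standard_normal((n_pixels, n_bands))

      centered = pixels - pixels.mean(axis=0)
      variances = np.linalg.svd(centered, compute_uv=False) ** 2
      explained = variances / variances.sum()
      print(explained[:4])  # nearly all variance in the first two components, echoing the paper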

  20. The role of routine post-natal abdominal ultrasound for newborns in a resource-poor setting: a longitudinal study

    Directory of Open Access Journals (Sweden)

    Omokhodion Samuel I

    2011-07-01

    Full Text Available Abstract Background- Neonatal abdominal ultrasound is usually performed in Nigeria to investigate neonatal symptoms rather than as a follow-up to evaluate fetal abnormalities detected on prenatal ultrasound. The role of routine obstetric ultrasonography in the monitoring of pregnancy and identification of fetal malformations has partly contributed to the lowering of fetal mortality rates. In Nigeria, which has a high maternal and fetal mortality rate, many pregnant women do not have ante-natal care; not infrequently, women also deliver their babies at home and only bring the newborns to the clinics for immunization. Even when performed, most routine obstetric scans are not targeted towards the detection of fetal abnormalities. The aim of the present study is to evaluate the benefit of routinely performing abdominal scans on newborns with a view to detecting possible abnormalities which may have been missed ante-natally. Methods- This was a longitudinal study of 202 consecutive, apparently normal newborns. Routine clinical examination and abdominal ultrasound scans were performed on the babies by their mother's bedside, before discharge. Neonates with abnormal initial scans had follow-up scans. Results- There were 108 males and 94 females. There were 12 (5.9%) abnormal scans, seen in five male and seven female neonates. Eleven of the twelve abnormalities were in the kidneys, six on the left and five on the right. Three of the four major renal anomalies (absent kidney, ectopic/pelvic kidney and two cases of severe hydronephrosis) were, however, on the left side. There was one suprarenal abnormality on the right, suspected to be a possible infected adrenal haemorrhage. Nine of the abnormal cases reported for follow-up and of these, two cases had persistent severe abnormalities. Conclusions- This study demonstrated a 5.9% incidence of genitourinary anomalies on routine neonatal abdominal ultrasound in this small population. Routine obstetric USS

  1. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  2. Development of database management system for monitoring of radiation workers for actinides

    International Nuclear Information System (INIS)

    Kalyane, G.N.; Mishra, L.; Nadar, M.Y.; Singh, I.S.; Rao, D.D.

    2012-01-01

    Annually, around 500 radiation workers are monitored for estimation of lung activities and internal dose due to Pu/Am and U from various divisions of Bhabha Atomic Research Centre (Trombay) and from PREFRE and A3F facilities (Tarapur) in the lung counting laboratory located at Bhabha Atomic Research Centre hospital, under Routine and Special monitoring programs. A 20 cm diameter phoswich and an array of HPGe detectors were used for this purpose. In case of positive contamination, workers are followed up and monitored using both detection systems in different geometries. Management of this huge volume of data becomes difficult, and therefore an easily retrievable database system containing all the relevant data of the monitored radiation workers was developed. Materials and methods: The database management system comprises three main modules integrated together: 1) Apache server installed on a Windows (XP) platform (Apache version 2.2.17); 2) MySQL database management system (MySQL version 5.5.8); 3) PHP (PHP: Hypertext Preprocessor) programming language (PHP version 5.3.5). All 3 modules work together seamlessly as a single software program. The front-end user interaction is through a user-friendly and interactive local web page for which no internet connection is required. This front page has hyperlinks to many other pages, which have different utilities for the user. The user has to log in using a username and password. Results and Conclusions: The database management system is used for entering, updating and managing lung monitoring data of radiation workers. The program has the following utilities: bio-data entry of new subjects, editing of bio-data of old subjects (only one subject at a time), entry of counting data of that day's lung monitoring, retrieval of old records based on a number of parameters and filters like date of counting, employee number, division, counts fulfilling a given criterion, etc., and calculation of MEQ CWT (Muscle Equivalent Chest Wall Thickness), energy
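
    A sketch of the filtered-retrieval utility described above. The system itself uses PHP and MySQL; for consistency with the other examples here this is Python with SQLite as a stand-in, and the table and column names are invented:

    ```python
    import sqlite3  # stand-in for the MySQL back-end described in the abstract

    def fetch_counting_records(db_path, division=None, date_from=None, date_to=None):
        """Retrieve lung-counting records with optional filters (hypothetical schema)."""
        sql = "SELECT emp_no, count_date, counts FROM lung_counts WHERE 1=1"
        params = []
        if division:
            sql += " AND division = ?"
            params.append(division)
        if date_from:
            sql += " AND count_date >= ?"
            params.append(date_from)
        if date_to:
            sql += " AND count_date <= ?"
            params.append(date_to)
        with sqlite3.connect(db_path) as conn:
            return conn.execute(sql, params).fetchall()
    ```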

  3. CMS experience with online and offline Databases

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The CMS experiment is made of many detectors which in total sum up to more than 75 million channels. The online database stores the configuration data used to configure the various parts of the detector and bring it into all possible running states. The database also stores the conditions data: detector monitoring parameters of all channels (temperatures, voltages), detector quality information, beam conditions, etc. These quantities are used by the experts to monitor the detector performance in detail; because they occupy a very large space in the online database, they cannot be used as-is for offline data reconstruction. For this, a "condensed" set of the full information, the "conditions data", is created and copied to a separate database used in the offline reconstruction. The offline conditions database contains the alignment and calibration data for the various detectors. Conditions data sets are accessed by a tag and an interval of validity through the offline reconstruction program CMSSW, written in C++. Pe...
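
    The tag-plus-interval-of-validity access pattern can be pictured with a toy lookup structure (a sketch of the pattern only, not the CMSSW C++ API):

    ```python
    import bisect

    class ConditionsTag:
        """Each payload is valid from 'since' (inclusive) to 'till' (exclusive)."""
        def __init__(self):
            self.sinces, self.tills, self.payloads = [], [], []

        def add(self, since, till, payload):
            i = bisect.bisect_left(self.sinces, since)
            self.sinces.insert(i, since)
            self.tills.insert(i, till)
            self.payloads.insert(i, payload)

        def get(self, run):
            # Find the last interval of validity starting at or before this run
            i = bisect.bisect_right(self.sinces, run) - 1
            if i >= 0 and run < self.tills[i]:
                return self.payloads[i]
            raise KeyError(f"no payload valid for run {run}")

    tag = ConditionsTag()
    tag.add(1, 100, "alignment_v1")
    tag.add(100, 500, "alignment_v2")
    print(tag.get(250))   # -> alignment_v2
    ```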

  4. Scale out databases for CERN use cases

    CERN Document Server

    Baranowski, Zbigniew; Canali, Luca; Garcia, Daniel Lanza; Surdy, Kacper

    2015-01-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log dat...
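
    A query-timing harness of the kind used in such tests might look like the sketch below, assuming the impyla client package and a reachable Impala daemon (host, port and SQL are placeholders, not named in the abstract):

    ```python
    import time
    from impala.dbapi import connect   # impyla client; an assumption, not cited in the abstract

    def time_query(sql, host="impala.example.org", port=21050):
        """Run a query against Impala and report row count and wall-clock time."""
        cur = connect(host=host, port=port).cursor()
        t0 = time.time()
        cur.execute(sql)
        rows = cur.fetchall()
        print(f"{len(rows)} rows in {time.time() - t0:.1f} s")
        return rows

    time_query("SELECT device, avg(value) FROM accelerator_log GROUP BY device")
    ```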

  5. Routine imaging for diffuse large B-cell lymphoma in first remission is not associated with better survival

    DEFF Research Database (Denmark)

    El-Galaly, Tarec; Jakobsen, Lasse Hjort; Hutchings, Martin

    2015-01-01

    Background: Routine surveillance imaging plays a limited role in detecting recurrent diffuse large B-cell lymphoma (DLBCL), and the value of routine imaging is controversial. The present population-based study compares the post-remission survival of Danish and Swedish DLBCL patients-two neighbour... are fully publicly funded. Follow-up (FU) for Swedish patients included symptom assessment, clinical examinations, and blood tests with 3-month intervals for 2 years and with longer intervals later in follow-up. Imaging was only performed in response to suspected relapse. FU for Danish patients was equivalent but included additional routine surveillance imaging (usually half-yearly CT for 2 years as a minimum). Clinico-pathological features were retrieved from the national lymphoma registries, and vital status was updated using the civil registries. OS was defined as the time from end of treatment...

  6. Biogas composition and engine performance, including database and biogas property model

    NARCIS (Netherlands)

    Bruijstens, A.J.; Beuman, W.P.H.; Molen, M. van der; Rijke, J. de; Cloudt, R.P.M.; Kadijk, G.; Camp, O.M.G.C. op den; Bleuanus, W.A.J.

    2008-01-01

    In order to enable evaluation of the current biogas quality situation in the EU, results are presented in a biogas database. Furthermore, the key gas parameter Sonic Bievo Index (influence on open-loop A/F-ratio) is defined, and other key gas parameters like the Methane Number (knock resistance)

  7. Diffusivity database (DDB) for major rocks. Database for the second progress report

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Haruo

    1999-10-01

    A diffusivity database for the setting of effective diffusion coefficients in rock matrices in the second progress report was developed. In this database, 3 kinds of diffusion coefficients were treated: the effective diffusion coefficient (De), the apparent diffusion coefficient (Da) and the free water diffusion coefficient (Do). The database, based on literature published between 1980 and 1998, was developed considering the following points. (1) Since the Japanese geological environment is the focus of the second progress report, diffusion data were collected with emphasis on Japanese major rocks. (2) Although 22 elements are considered to be important in performance assessment for geological disposal, all elements and aquatic tracers are treated in this database development for general-purpose use. (3) Since limestone, which belongs to sedimentary rock, can become a natural resource and is inappropriate as a host rock, it is omitted in this database development. Rock was categorized into 4 kinds: acid crystalline rock, alkaline crystalline rock, sedimentary rock (argillaceous/tuffaceous rock) and sedimentary rock (psammitic rock/sandy stone), from the viewpoint of geology and mass transport. In addition, crystalline rocks of near-neutral chemistry were categorized as alkaline crystalline rock in this database. The database is composed of sub-databases for the 4 kinds of rocks. Furthermore, the sub-databases for the 4 kinds of rocks are composed of databases for individual elements, into which, in total, 24 items are entered, such as species, rock name, diffusion coefficients (De, Da, Do), the conditions under which they were obtained (method, porewater, pH, Eh, temperature, atmosphere, etc.), etc. As a result of the literature survey, for De values for acid crystalline rock, in total 207 data for 18 elements and one tracer (hydrocarbon) have been reported, and all data were for granitic rocks such as granite, granodiorite and biotitic granite. For alkaline crystalline rock, totally, 32
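
    For orientation, the three coefficients are linked through the pore structure and sorption of the rock. A commonly used form of the relations (the report's own conventions may differ) is:

    ```latex
    D_e = \frac{\epsilon\,\delta}{\tau^2}\, D_0 , \qquad
    D_a = \frac{D_e}{\epsilon + \rho\, K_d}
    ```

    where \epsilon is the porosity, \delta the constrictivity, \tau the tortuosity, \rho the dry bulk density and K_d the sorption distribution coefficient; for a non-sorbing species (K_d = 0) the second relation reduces to D_a = D_e / \epsilon.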

  8. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization operation provides the ability to perform operations, such as querying and visualization, on many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, providing the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.
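
    The loader-layer heartbeat can be pictured with a small sketch (names and timeout are illustrative, not TSDSystem's actual implementation):

    ```python
    import time

    class HeartbeatMonitor:
        """Loaders report periodically; loaders that fall silent are flagged."""
        def __init__(self, timeout_s=120):
            self.timeout_s = timeout_s
            self.last_seen = {}

        def beat(self, loader_id):
            self.last_seen[loader_id] = time.time()

        def stale_loaders(self):
            now = time.time()
            return [lid for lid, t in self.last_seen.items()
                    if now - t > self.timeout_s]
    ```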

  9. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI): A Completed Reference Database of Lung Nodules on CT Scans

    International Nuclear Information System (INIS)

    2011-01-01

    Purpose: The development of computer-aided diagnostic (CAD) methods for lung nodule detection, classification, and quantitative assessment can be facilitated through a well-characterized repository of computed tomography (CT) scans. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) completed such a database, establishing a publicly available reference for the medical imaging research community. Initiated by the National Cancer Institute (NCI), further advanced by the Foundation for the National Institutes of Health (FNIH), and accompanied by the Food and Drug Administration (FDA) through active participation, this public-private partnership demonstrates the success of a consortium founded on a consensus-based process. Methods: Seven academic centers and eight medical imaging companies collaborated to identify, address, and resolve challenging organizational, technical, and clinical issues to provide a solid foundation for a robust database. The LIDC/IDRI Database contains 1018 cases, each of which includes images from a clinical thoracic CT scan and an associated XML file that records the results of a two-phase image annotation process performed by four experienced thoracic radiologists. In the initial blinded-read phase, each radiologist independently reviewed each CT scan and marked lesions belonging to one of three categories ("nodule ≥3 mm," "nodule <3 mm," and "non-nodule ≥3 mm"). In the subsequent unblinded-read phase, each radiologist independently reviewed their own marks along with the anonymized marks of the three other radiologists to render a final opinion. The goal of this process was to identify as completely as possible all lung nodules in each CT scan without requiring forced consensus. Results: The Database contains 7371 lesions marked "nodule" by at least one radiologist. 2669 of these lesions were marked "nodule ≥3 mm" by at least one radiologist, of which 928 (34.7%) received such marks from all

  10. Adoption of routine telemedicine in Norway: the current picture

    Science.gov (United States)

    Zanaboni, Paolo; Knarvik, Undine; Wootton, Richard

    2014-01-01

    Background Telemedicine appears to be ready for wider adoption. Although existing research evidence is useful, the adoption of routine telemedicine in healthcare systems has been slow. Objective We conducted a study to explore the current use of routine telemedicine in Norway, at national, regional, and local levels, to provide objective and up-to-date information and to estimate the potential for wider adoption of telemedicine. Design A top-down approach was used to collect official data on the national use of telemedicine from the Norwegian Patient Register. A bottom-up approach was used to collect complementary information on the routine use of telemedicine through a survey conducted at the five largest publicly funded hospitals. Results Results show that routine telemedicine has been adopted in all health regions in Norway and in 68% of hospitals. Despite being widely adopted, the current level of use of telemedicine is low compared to the number of face-to-face visits. Examples of routine telemedicine can be found in several clinical specialties. Most services connect different hospitals in secondary care, and they are mostly delivered as teleconsultations via videoconference. Conclusions Routine telemedicine in Norway has been widely adopted, probably for geographical reasons, as in other settings. However, the level of use of telemedicine in Norway is rather low, and it has significant potential for further development as an alternative to face-to-face outpatient visits. This study is a first attempt to map routine telemedicine at regional, institutional, and clinical levels, and it provides useful information to understand the adoption of telemedicine in routine healthcare and to measure change in future updates. PMID:24433942

  11. Understanding teachers’ routines to inform classroom technology design

    NARCIS (Netherlands)

    An, P.; Bakker, S.; Eggen, J.H.

    2017-01-01

    Secondary school teachers have quite busy and complex routines in their classrooms. However, present classroom technologies usually require focused attention from teachers while being interacted with, which restricts their use in teachers’ daily routines. Peripheral interaction is a human-computer

  12. Opportunistic screening for osteoporosis on routine computed tomography? An external validation study

    Energy Technology Data Exchange (ETDEWEB)

    Buckens, Constantinus F. [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Universitair Medisch Centrum Utrecht, Department of Radiology, Utrecht (Netherlands); Dijkhuis, Gawein; Jong, Pim A. de [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Keizer, Bart de [University Medical Center Utrecht, Department of Nuclear Medicine, Utrecht (Netherlands); Verhaar, Harald J. [University Medical Center Utrecht, Department of Geriatric Medicine, Utrecht (Netherlands)

    2015-07-15

    Opportunistic screening for osteoporosis using computed tomography (CT) examinations that happen to visualise the spine can be used to identify patients with osteoporosis. We sought to verify the diagnostic performance of vertebral Hounsfield unit (HU) measurements on routine CT examinations for diagnosing osteoporosis in a separate, external population. Consecutive patients who underwent a CT examination of the chest or abdomen and had also received a dual-energy X-ray absorptiometry (DXA) test were retrospectively included. CTs were evaluated for vertebral fractures and vertebral attenuation (density) values were measured. Diagnostic performance measures and the area under the receiver operator characteristics curve (AUC) for diagnosing osteoporosis were calculated. Three hundred and two patients with a mean age of 57.9 years were included, of which 82 (27 %) had osteoporosis according to DXA and 65 (22 %) had vertebral fractures. The diagnostic performance for vertebral HU measurements was modest, with a maximal AUC of 0.74 (0.68 - 0.80). At that optimal threshold the sensitivity was 62 % (51 - 72 %) and the specificity was 79 % (74 - 84 %). We confirmed that simple trabecular vertebral density measurements on routine CT contain diagnostic information related to bone mineral density as measured by DXA, albeit with substantially lower diagnostic accuracy than previously reported. (orig.)
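
    The reported figures can be reproduced in form (not in value) with a standard ROC computation; a sketch assuming scikit-learn, with illustrative variable names:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    def hu_screening_performance(hu_values, has_osteoporosis):
        """AUC, sensitivity, specificity and HU cutoff for vertebral-HU screening."""
        scores = -np.asarray(hu_values, dtype=float)   # lower HU suggests osteoporosis
        y = np.asarray(has_osteoporosis)
        auc = roc_auc_score(y, scores)
        fpr, tpr, thr = roc_curve(y, scores)
        i = int(np.argmax(tpr - fpr))                  # threshold maximizing Youden's J
        return auc, tpr[i], 1 - fpr[i], -thr[i]
    ```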

  13. Opportunistic screening for osteoporosis on routine computed tomography? An external validation study

    International Nuclear Information System (INIS)

    Buckens, Constantinus F.; Dijkhuis, Gawein; Jong, Pim A. de; Keizer, Bart de; Verhaar, Harald J.

    2015-01-01

    Opportunistic screening for osteoporosis using computed tomography (CT) examinations that happen to visualise the spine can be used to identify patients with osteoporosis. We sought to verify the diagnostic performance of vertebral Hounsfield unit (HU) measurements on routine CT examinations for diagnosing osteoporosis in a separate, external population. Consecutive patients who underwent a CT examination of the chest or abdomen and had also received a dual-energy X-ray absorptiometry (DXA) test were retrospectively included. CTs were evaluated for vertebral fractures and vertebral attenuation (density) values were measured. Diagnostic performance measures and the area under the receiver operator characteristics curve (AUC) for diagnosing osteoporosis were calculated. Three hundred and two patients with a mean age of 57.9 years were included, of which 82 (27 %) had osteoporosis according to DXA and 65 (22 %) had vertebral fractures. The diagnostic performance for vertebral HU measurements was modest, with a maximal AUC of 0.74 (0.68 - 0.80). At that optimal threshold the sensitivity was 62 % (51 - 72 %) and the specificity was 79 % (74 - 84 %). We confirmed that simple trabecular vertebral density measurements on routine CT contain diagnostic information related to bone mineral density as measured by DXA, albeit with substantially lower diagnostic accuracy than previously reported. (orig.)

  14. Database Description - DMPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: DMPD; Alternative name: Dynamic Macrophage Pathway CSML Database; DOI: 10.18908/lsdba.nbdc00558-000; Creator name: Masao Naga...; Affiliation: ...ty of Tokyo, 4-6-1 Shirokanedai, Minato-ku, Tokyo 108-8639; Tel: +81-3-5449-5615; FAX: +83-3-5449-5442; Taxonomy name: Mammalia (Taxonomy ID: 40674); Database description: DMPD collects...

  15. Database Dump - fRNAdb | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available fRNAdb Database Dump data detail: Data name: Database Dump; DOI: 10.18908/lsdba.nbdc00452-002; Data format: tab-separated text; File name: Database_Dump; File URL: ftp://ftp....biosciencedbc.jp/archive/frnadb/LATEST/Database_Dump; File size: 673 MB; Number of data entries: 4 files.

  16. KaBOB: ontology-based semantic integration of biomedical databases.

    Science.gov (United States)

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for
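
    Querying such a store in terms of biomedical concepts might look like the sketch below, using rdflib with a placeholder graph file and predicate IRI (KaBOB's actual vocabulary is grounded in OBO identifiers, not shown here):

    ```python
    import rdflib

    # Load a small slice of an integrated RDF knowledge base
    g = rdflib.Graph()
    g.parse("kabob_slice.nt", format="nt")   # hypothetical export file

    # Ask a concept-level question rather than a schema-level one
    q = """
    SELECT ?gene ?process WHERE {
        ?gene <http://example.org/participatesIn> ?process .
    }
    LIMIT 10
    """
    for gene, process in g.query(q):
        print(gene, process)
    ```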

  17. The emergence and change of management accounting routines

    NARCIS (Netherlands)

    van der Steen, M.P.

    2011-01-01

    Purpose - The purpose of this paper is to explore the dynamics involved in the emergence and change of management accounting routines. It seeks to provide an understanding of the ways in which these complex routines foster stability and change in management accounting practices.

  18. MetReS, an Efficient Database for Genomic Applications.

    Science.gov (United States)

    Vilaplana, Jordi; Alves, Rui; Solsona, Francesc; Mateo, Jordi; Teixidó, Ivan; Pifarré, Marc

    2018-02-01

    MetReS (Metabolic Reconstruction Server) is a genomic database that is shared between two software applications that address important biological problems. Biblio-MetReS is a data-mining tool that enables the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the processes of interest and their function. The main goal of this work was to identify the areas where the performance of the MetReS database could be improved and to test whether this improvement would scale to larger datasets and more complex types of analysis. The study was started with a relational database, MySQL, which is the current database server used by the applications. We also tested the performance of an alternative data-handling framework, Apache Hadoop. Hadoop is currently used for large-scale data processing. We found that this data-handling framework is likely to greatly improve the efficiency of the MetReS applications as the dataset and the processing needs increase by several orders of magnitude, as expected to happen in the near future.

  19. YMDB: the Yeast Metabolome Database

    Science.gov (United States)

    Jewison, Timothy; Knox, Craig; Neveu, Vanessa; Djoumbou, Yannick; Guo, An Chi; Lee, Jacqueline; Liu, Philip; Mandal, Rupasri; Krishnamurthy, Ram; Sinelnikov, Igor; Wilson, Michael; Wishart, David S.

    2012-01-01

    The Yeast Metabolome Database (YMDB, http://www.ymdb.ca) is a richly annotated ‘metabolomic’ database containing detailed information about the metabolome of Saccharomyces cerevisiae. Modeled closely after the Human Metabolome Database, the YMDB contains >2000 metabolites with links to 995 different genes/proteins, including enzymes and transporters. The information in YMDB has been gathered from hundreds of books, journal articles and electronic databases. In addition to its comprehensive literature-derived data, the YMDB also contains an extensive collection of experimental intracellular and extracellular metabolite concentration data compiled from detailed Mass Spectrometry (MS) and Nuclear Magnetic Resonance (NMR) metabolomic analyses performed in our lab. This is further supplemented with thousands of NMR and MS spectra collected on pure, reference yeast metabolites. Each metabolite entry in the YMDB contains an average of 80 separate data fields including comprehensive compound description, names and synonyms, structural information, physico-chemical data, reference NMR and MS spectra, intracellular/extracellular concentrations, growth conditions and substrates, pathway information, enzyme data, gene/protein sequence data, as well as numerous hyperlinks to images, references and other public databases. Extensive searching, relational querying and data browsing tools are also provided that support text, chemical structure, spectral, molecular weight and gene/protein sequence queries. Because of S. cerevisiae's importance as a model organism for biologists and as a biofactory for industry, we believe this kind of database could have considerable appeal not only to metabolomics researchers, but also to yeast biologists, systems biologists, the industrial fermentation industry, as well as the beer, wine and spirit industry. PMID:22064855

  20. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Refinement of thermodynamic data for tetravalent thorium, uranium, neptunium and plutonium

    International Nuclear Information System (INIS)

    Fujiwara, Kenso; Kitamura, Akira; Yui, Mikazu

    2010-03-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level and TRU radioactive wastes, refinement of the thermodynamic data for the inorganic compounds and complexes of thorium(IV), uranium(IV), neptunium(IV) and plutonium(IV) was carried out. Refinement of the thermodynamic data for these elements was performed on the basis of the thermodynamic database for actinides published by the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). Additionally, the latest data published after the OECD/NEA thermodynamic data were re-evaluated to determine whether they should be included in the JAEA-TDB. (author)

  1. Database of Low-e Storm Window Energy Performance across U.S. Climate Zones

    Energy Technology Data Exchange (ETDEWEB)

    Culp, Thomas D.; Cort, Katherine A.

    2014-09-04

    This is an update of a report that describes the process, assumptions, and modeling results produced under the effort Create a Database of U.S. Climate-Based Analysis for Low-E Storm Windows. The scope of the overall effort is to develop a database of energy savings and cost effectiveness of low-E storm windows in residential homes across a broad range of U.S. climates using the National Energy Audit Tool (NEAT) and RESFEN model calculations. This report includes a summary of the results; NEAT and RESFEN background, methodology, and input assumptions; and an appendix with detailed results and assumptions by climate zone.

  2. Art as a Means to Disrupt Routine Use of Space

    OpenAIRE

    Martin, K; Dalton, B; Nikolopoulou, M

    2013-01-01

    This paper examines the publicly visible aspects of counter-terrorism activity in pedestrian spaces as mechanisms of disruption. We discuss the objectives of counter-terrorism in terms of disruption of routine for both hostile actors and general users of public spaces, categorising the desired effects as 1) triangulation of attention; 2) creation of unexpected performance; and 3) choreographing of crowd flow. We review the potential effects of these existing forms of disr...

  3. Routine organic air emissions at the Radioactive Waste Management Complex Waste Storage Facilities fiscal year 1995 report

    International Nuclear Information System (INIS)

    Galloway, K.J.; Jolley, J.G.

    1995-12-01

    This report presents the data and results of the routine organic air emissions monitoring performed in the Radioactive Waste Management Complex Waste Storage Facility, WMF-628, from January 4, 1995 to September 3, 1995. The task objectives were to systematically identify and measure volatile organic compound (VOC) concentrations within WMF-628 that could be emitted into the environment. These routine measurements implemented a dual-method approach using Open-Path Fourier Transform Infrared Spectroscopy (OP-FTIR) monitoring and the Environmental Protection Agency (EPA) analytical method TO-14, Summa® canister sampling. The data collected from the routine monitoring of WMF-628 will assist in estimating the total VOC emissions from WMF-628

  4. DOT Online Database

    Science.gov (United States)

    Document database website providing full-text web search of Advisory Circulars (2092 records), including data collection and distribution policies; website provided by MicroSearch.

  5. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  6. Proposal for a high-energy nuclear database

    International Nuclear Information System (INIS)

    Brown, D.A.; Vogt, R.

    2006-01-01

    We propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from Bevalac, AGS and SPS to RHIC and LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, we propose periodically performing evaluations of the data and summarizing the results in topical reviews. (author)

  7. Proposal for a High Energy Nuclear Database

    International Nuclear Information System (INIS)

    Brown, D A; Vogt, R

    2005-01-01

    The authors propose to develop a high-energy heavy-ion experimental database and make it accessible to the scientific community through an on-line interface. This database will be searchable and cross-indexed with relevant publications, including published detector descriptions. Since this database will be a community resource, it requires the high-energy nuclear physics community's financial and manpower support. This database should eventually contain all published data from Bevalac, AGS and SPS to RHIC and CERN-LHC energies, proton-proton to nucleus-nucleus collisions as well as other relevant systems, and all measured observables. Such a database would have tremendous scientific payoff as it makes systematic studies easier and allows simpler benchmarking of theoretical models to a broad range of old and new experiments. Furthermore, there is a growing need for compilations of high-energy nuclear data for applications including stockpile stewardship, technology development for inertial confinement fusion and target and source development for upcoming facilities such as the Next Linear Collider. To enhance the utility of this database, they propose periodically performing evaluations of the data and summarizing the results in topical reviews

  8. SU-E-T-255: Development of a Michigan Quality Assurance (MQA) Database for Clinical Machine Operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, D [University of Michigan Hospital, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: A unified database system was developed to allow accumulation, review and analysis of quality assurance (QA) data for measurement, treatment, imaging and simulation equipment in our department. Recording these data in a database allows a unified and structured approach to review and analysis of data gathered using commercial database tools. Methods: A clinical database was developed to track records of quality assurance operations on linear accelerators, a computed tomography (CT) scanner, a high dose rate (HDR) afterloader and imaging systems such as on-board imaging (OBI) and Calypso in our department. The database was developed using a Microsoft Access database and the Visual Basic for Applications (VBA) programming interface. Separate modules were written for accumulation, review and analysis of daily, monthly and annual QA data. All modules were designed to use structured query language (SQL) as the basis of data accumulation and review. The SQL strings are dynamically re-written at run time. The database also features embedded documentation, storage of documents produced during QA activities and the ability to annotate all data within the database. Tests are defined in a set of tables that define test type, specific value, and schedule. Results: Daily, monthly and annual QA data have been taken in parallel with established procedures to test MQA. The database has been used to aggregate data across machines to examine the consistency of machine parameters and operations within the clinic for several months. Conclusion: The MQA application has been developed as an interface to a commercially available SQL engine (JET 5.0) and a standard database back-end. The MQA system has been used for several months for routine data collection. The system is robust, relatively simple to extend and can be migrated to a commercial SQL server.
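
    The run-time SQL rewriting can be sketched as follows (Python rather than the VBA of the original, with hypothetical table and column names):

    ```python
    def build_qa_query(table, filters):
        """Rebuild the SELECT at run time from the user's chosen filters."""
        clauses, params = [], []
        for column, value in filters.items():
            clauses.append(f"{column} = ?")
            params.append(value)
        where = " WHERE " + " AND ".join(clauses) if clauses else ""
        return f"SELECT * FROM {table}{where}", params

    sql, params = build_qa_query("daily_qa", {"machine": "linac2", "test_type": "output"})
    # -> "SELECT * FROM daily_qa WHERE machine = ? AND test_type = ?", ["linac2", "output"]
    ```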

  9. SU-E-T-255: Development of a Michigan Quality Assurance (MQA) Database for Clinical Machine Operations

    International Nuclear Information System (INIS)

    Roberts, D

    2015-01-01

    Purpose: A unified database system was developed to allow accumulation, review and analysis of quality assurance (QA) data for measurement, treatment, imaging and simulation equipment in our department. Recording these data in a database allows a unified and structured approach to review and analysis of data gathered using commercial database tools. Methods: A clinical database was developed to track records of quality assurance operations on linear accelerators, a computed tomography (CT) scanner, a high dose rate (HDR) afterloader and imaging systems such as on-board imaging (OBI) and Calypso in our department. The database was developed using a Microsoft Access database and the Visual Basic for Applications (VBA) programming interface. Separate modules were written for accumulation, review and analysis of daily, monthly and annual QA data. All modules were designed to use structured query language (SQL) as the basis of data accumulation and review. The SQL strings are dynamically re-written at run time. The database also features embedded documentation, storage of documents produced during QA activities and the ability to annotate all data within the database. Tests are defined in a set of tables that define test type, specific value, and schedule. Results: Daily, monthly and annual QA data have been taken in parallel with established procedures to test MQA. The database has been used to aggregate data across machines to examine the consistency of machine parameters and operations within the clinic for several months. Conclusion: The MQA application has been developed as an interface to a commercially available SQL engine (JET 5.0) and a standard database back-end. The MQA system has been used for several months for routine data collection. The system is robust, relatively simple to extend and can be migrated to a commercial SQL server.

  10. Understanding Teachers' Routines to Inform Classroom Technology Design

    Science.gov (United States)

    An, Pengcheng; Bakker, Saskia; Eggen, Berry

    2017-01-01

    Secondary school teachers have quite busy and complex routines in their classrooms. However, present classroom technologies usually require focused attention from teachers while being interacted with, which restricts their use in teachers' daily routines. Peripheral interaction is a human-computer interaction style that aims to enable interaction…

  11. Database Description - eSOL | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: eSOL; Creator affiliation: The Research and Development of Biological Databases Project, National Institute of Genet...; Address: ...nology, 4259 Nagatsuta-cho, Midori-ku, Yokohama, Kanagawa 226-8501, Japan; Tel.: +81-45-924-5785; Database classification: Protein sequence databases - Protein properties; Organism: Escherichia coli (Taxonomy ID: 562); Reference: ...i U S A. 2009 Mar 17;106(11):4201-6.

  12. The effect of routine early amniotomy on spontaneous labor: a meta-analysis.

    Science.gov (United States)

    Brisson-Carroll, G; Fraser, W; Bréart, G; Krauss, I; Thornton, J

    1996-05-01

    To obtain estimates of the effects of amniotomy on the risk of cesarean delivery and on other indicators of maternal and neonatal morbidity (Apgar score less than 7 at 5 minutes, admission to neonatal intensive care unit [NICU]). Published studies were identified through manual and computerized searches using Medline and the Cochrane Collaboration Pregnancy and Childbirth Database. Our search identified ten trials, all published in peer-reviewed journals. Trials were assigned a methodological quality score based on a standardized rating system. Three trials were excluded from the analysis for methodological limitations. Data were abstracted by two trained reviewers. Typical odds ratios (OR) were calculated. Amniotomy was associated with a reduction in labor duration varying from 0.8-2.3 hours. There was a nonstatistically significant increase in the risk of cesarean delivery; OR 1.2, 95% confidence interval (CI) 0.9-1.6. The risk of a 5-minute Apgar score less than 7 was reduced in association with early amniotomy (OR 0.5, 95% CI 0.3-0.9). Groups were similar with respect to other indicators of neonatal status (arterial cord pH, NICU admissions). Routine early amniotomy is associated with both benefits and risks. Benefits include a reduction in labor duration and a possible reduction in abnormal 5-minute Apgar scores. This meta-analysis provides no support for the hypothesis that routine early amniotomy reduces the risk of cesarean delivery. An association between early amniotomy and cesarean delivery for fetal distress was noted in one large trial, suggesting that amniotomy should be reserved for patients with abnormal labor progress.
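
    For reference, the odds ratio of a single trial with a events and b non-events in the amniotomy arm and c events and d non-events in the control arm, together with its 95% confidence interval, follows the standard formulas below; the pooled "typical OR" additionally weights the trials, e.g. by the Mantel-Haenszel method.

    ```latex
    \mathrm{OR} = \frac{a\,d}{b\,c} , \qquad
    \mathrm{CI}_{95\%} = \exp\!\left( \ln \mathrm{OR} \pm 1.96 \sqrt{\frac{1}{a} + \frac{1}{b} + \frac{1}{c} + \frac{1}{d}} \right)
    ```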

  13. Evaluation of Eigenvalue Routines for Large Scale Applications

    Directory of Open Access Journals (Sweden)

    V.A. Tischler

    1994-01-01

    Full Text Available The NASA structural analysis (NASTRAN*) program is one of the most extensively used engineering applications software packages in the world. It contains a wealth of matrix operations and numerical solution techniques, and they were used to construct efficient eigenvalue routines. The purpose of this article is to examine the current eigenvalue routines in NASTRAN and to make efficiency comparisons with a more recent implementation of the block Lanczos algorithm. This eigenvalue routine is now available in several mathematics libraries as well as in several commercial versions of NASTRAN. In addition, the CRAY library maintains a modified version of this routine on their network. Several example problems, with a varying number of degrees of freedom, were selected primarily for efficiency benchmarking. Accuracy is not an issue, because they all gave comparable results. The block Lanczos algorithm was found to be extremely efficient, particularly for very large problems.
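
    A present-day equivalent of the benchmarked solvers is available in open-source libraries; SciPy, for example, exposes a Lanczos-type sparse eigensolver through ARPACK. A minimal sketch on a large sparse test matrix:

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import eigsh   # Lanczos-type (ARPACK) eigensolver

    # Large sparse symmetric test matrix, a stand-in for a structural stiffness matrix
    n = 100_000
    K = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

    # Six eigenvalues nearest zero via shift-invert, the usual mode for the
    # lowest structural modes
    vals, vecs = eigsh(K, k=6, sigma=0.0)
    print(np.sort(vals))
    ```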

  14. Mathematics for Databases

    NARCIS (Netherlands)

    ir. Sander van Laar

    2007-01-01

    A formal description of a database consists of the description of the relations (tables) of the database together with the constraints that must hold on the database. Furthermore the contents of a database can be retrieved using queries. These constraints and queries for databases can very well be

  15. Status of the solid breeder materials database

    International Nuclear Information System (INIS)

    Billone, M.C.; Dienst, W.; Lorenzetto, P.; Noda, K.; Roux, N.

    1995-01-01

    The databases for solid breeder ceramics (Li2O, Li4SiO4, Li2ZrO3, and LiAlO2) and beryllium multiplier material were critically reviewed and evaluated as part of the ITER/CDA design effort (1988-1990). The results have been documented in a detailed technical report. Emphasis was placed on the physical, thermal, mechanical, chemical stability/compatibility, tritium retention/release, and radiation stability properties which are needed to assess the performance of these materials in a fusion reactor environment. Materials properties correlations were selected for use in design analysis, and ranges for input parameters (e.g., temperature, porosity, etc.) were established. Also, areas for future research and development in blanket materials technology were highlighted and prioritized. For Li2O, the most significant increase in the database has come in the area of tritium retention as a function of operating temperature and purge flow composition. The database for postirradiation inventory from purged in-reactor samples has increased from four points to 20 points. These new data have allowed an improvement in understanding and modeling, as well as better interpretation of the results of laboratory annealing studies on unirradiated and irradiated material. In the case of Li2ZrO3, relatively little data were available on the sensitivity of the mechanical properties of this ternary ceramic to microstructure and moisture content. The increase in the database for this material has allowed not only better characterization of its properties, but also optimization of fabrication parameters to improve its performance. Some additional data are also available for the other two ternary ceramics to aid in the characterization of their performance. In particular, the thermal performance of these materials, as well as beryllium, in packed-bed form has been measured and characterized

  16. ACR Appropriateness Criteria® Routine Chest Radiography.

    Science.gov (United States)

    McComb, Barbara L; Chung, Jonathan H; Crabtree, Traves D; Heitkamp, Darel E; Iannettoni, Mark D; Jokerst, Clinton; Saleh, Anthony G; Shah, Rakesh D; Steiner, Robert M; Mohammed, Tan-Lucien H; Ravenel, James G

    2016-03-01

    Chest radiographs are sometimes taken before surgeries and interventional procedures, on hospital admission, and in outpatients. This manuscript summarizes the American College of Radiology review of the literature and its recommendations on routinely performed chest radiography in these settings. The American College of Radiology Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed every 3 years by a multidisciplinary expert panel. The guideline development and review include an extensive analysis of current medical literature from peer-reviewed journals and the application of a well-established consensus methodology (modified Delphi) to rate the appropriateness of imaging and treatment procedures by the panel. In those instances in which evidence is lacking or not definitive, expert opinion may be used to recommend imaging or treatment.

  17. User manual for two simple postscript output FORTRAN plotting routines

    Science.gov (United States)

    Nguyen, T. X.

    1991-01-01

    Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of a simple plotting routine is sufficient. With the Postscript language becoming popular, there are more and more Postscript laser printers now available. Simple, versatile, low cost plotting routines that can generate output on high quality laser printers are needed and standard FORTRAN language plotting routines using output in Postscript language seems logical. The purpose here is to explain two simple FORTRAN plotting routines that generate output in Postscript language.
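
    In the same spirit, a few lines suffice to emit a valid PostScript line plot directly; a minimal Python sketch (no axes, labels or error handling):

    ```python
    def write_line_plot(path, xs, ys, size=400.0):
        """Scale the data into a square region and emit moveto/lineto commands."""
        sx = lambda x: 50 + size * (x - min(xs)) / (max(xs) - min(xs))
        sy = lambda y: 50 + size * (y - min(ys)) / (max(ys) - min(ys))
        with open(path, "w") as f:
            f.write("%!PS-Adobe-3.0\n")
            f.write(f"{sx(xs[0]):.1f} {sy(ys[0]):.1f} moveto\n")
            for x, y in zip(xs[1:], ys[1:]):
                f.write(f"{sx(x):.1f} {sy(y):.1f} lineto\n")
            f.write("stroke\nshowpage\n")

    write_line_plot("plot.ps", [0, 1, 2, 3], [0.0, 1.0, 0.5, 2.0])
    ```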

  18. ITER solid breeder blanket materials database

    International Nuclear Information System (INIS)

    Billone, M.C.; Dienst, W.; Noda, K.; Roux, N.

    1993-11-01

    The databases for solid breeder ceramics (Li2O, Li4SiO4, Li2ZrO3 and LiAlO2) and beryllium multiplier material are critically reviewed and evaluated. Emphasis is placed on physical, thermal, mechanical, chemical stability/compatibility, tritium, and radiation stability properties which are needed to assess the performance of these materials in a fusion reactor environment. Correlations are selected for design analysis and compared to the database. Areas for future research and development in blanket materials technology are highlighted and prioritized

  19. Thinking Routines: Replicating Classroom Practices within Museum Settings

    Science.gov (United States)

    Wolberg, Rochelle Ibanez; Goff, Allison

    2012-01-01

    This article describes thinking routines as tools to guide and support young children's thinking. These learning strategies, developed by Harvard University's Project Zero Classroom, actively engage students in constructing meaning while also understanding their own thinking process. The authors discuss how thinking routines can be used in both…

  20. Quality of routine health data collected by health workers using smartphone at primary health care in Ethiopia.

    Science.gov (United States)

    Medhanyie, Araya Abrha; Spigt, Mark; Yebyo, Henock; Little, Alex; Tadesse, Kidane; Dinant, Geert-Jan; Blanco, Roman

    2017-05-01

    Mobile phone-based applications are considered by many as potentially useful for addressing challenges and improving the quality of data collection in developing countries. Yet very little evidence is available supporting or refuting the potential and widely perceived benefits of using electronic forms on smartphones for routine patient data collection by health workers at primary health care facilities. A facility-based cross-sectional study using a structured paper checklist was prepared to assess the completeness and accuracy of 408 electronic records completed and submitted to a central database server using electronic forms on smartphones by 25 health workers. The 408 electronic records were selected randomly out of a total of 1772 maternal health records submitted by the health workers to the central database over a period of six months. Descriptive frequencies and percentages of data completeness and error rates were calculated. When compared to paper records, the use of electronic forms significantly improved data completeness by 209 (8%) entries. Of a total of 2622 entries checked for completeness, 2602 (99.2%) electronic record entries were complete, while 2393 (91.3%) paper record entries were complete. A very small percentage of errors, which were easily identifiable, occurred in both electronic and paper forms, although the error rate in the electronic records was more than double that of the paper records (2.8% vs. 1.1%). More than half of the entry errors in the electronic records related to entering a text value. With minimal training and supervision, and no incentives, health care workers were able to use electronic forms for patient assessment and routine data collection appropriately and accurately, with a very small error rate. Minimising the number of questions requiring text responses in electronic forms would help minimise data errors. Copyright © 2017 Elsevier B.V. All rights reserved.