WorldWideScience

Sample records for subscription databases performed

  1. Subscriptions

    African Journals Online (AJOL)

    £36.30 less 10% to Subscription Agent (£32.67 + £9.33 Forex charges = £42.00 to ISEA, Rhodes University). $57.48 less 10% to Subscription Agent ($51.73 + $18.27 Forex charges = $70.00 to ISEA, Rhodes University). Airmail: Please add a further £9/$16. Africa: Individuals: R230.00 less 10% to Agent - (R 207.00 incl.

  2. Subscriptions

    African Journals Online (AJOL)

    Subscriptions Contact. Rachel Rege Kenya Agricultural Research Institute, P O Box 57811, Nairobi, KENYA Email: rrege@kari.org. East Africa Kssh 4000; Sterling 40; USA $62; including postage by surface mail. Postage by air mail can be arranged at cost on request. Payments to be made to the Editor, East African ...

  3. Subscriptions

    African Journals Online (AJOL)

    Subscriptions Contact. Professor Dele Braimoh P.O. Box 392. UNISA 0003. Pretoria South Africa Email: dbraimoh@yahoo.com. The AJCPSF requires a token payment of N5000 (excluding Review fee) from Nigerian and $150 ( International) contributors for maintaining the initial cost of editorial works on submitted articles ...

  4. Subscriptions

    African Journals Online (AJOL)

    Subscriptions Contact. Dr Basil C Ezeanolue Department of Otorhinolaryngology College of Medicine University of Nigeria Teaching Hospital Enugu 400 001. Nigeria Email: editornjorl@yahoo.com. Individuals Nigeria - 600.00 Naira Africa - US$15.00. Other Countries - US$25.00. Institutions Nigeria - 1200.00 Naira

  5. Subscriptions

    African Journals Online (AJOL)

    US$80 per volume (Institutional) US$45 per volume (Individuals) Single copies: US$35. Southern African Subscriptions: R70 per volume (Institutional) R50 per volume (Individuals) Single copies: R30 European sales from Intervention Press, Castenschioldsvej 7, DK 8270 Hojbjerg, Denmark. E-mail: interven@inet.uni2.dk.

  6. Google Scholar Out-Performs Many Subscription Databases when Keyword Searching. A Review of: Walters, W. H. (2009). Google Scholar search performance: Comparative recall and precision. portal: Libraries and the Academy, 9(1), 5-24.

    Directory of Open Access Journals (Sweden)

    Giovanna Badia

    2010-09-01

    title search in Google Scholar using the same keywords, elderly and migration. Compared to the standard search on the same topic, there was almost no difference in recall or precision when a title search was performed and the first 50 results were viewed. Conclusion – Database search performance differs significantly from one field to another, so a comparative study using a different search topic might produce different results from those summarized above. Nevertheless, Google Scholar out-performs many subscription databases – in terms of recall and precision – when using keyword searches for some topics, as was the case for the multidisciplinary topic of later-life migration. Google Scholar’s recall and precision rates were high within the first 10 to 100 search results examined. According to the author, “these findings suggest that a searcher who is unwilling to search multiple databases or to adopt a sophisticated search strategy is likely to achieve better than average recall and precision by using Google Scholar” (p. 16). The author concludes the paper by discussing the relevancy of search results obtained by undergraduate students. All of the 155 relevant journal articles on the topic of later-life migration were pre-selected based on an expert critique of the complete articles, rather than by looking at only the titles or abstracts of references as most searchers do. Instructors and librarians may wish to support the use of databases that increase students’ contact with high-quality research documents (i.e., documents that are authoritative, well written, contain a strong analysis, or demonstrate quality in other ways). The study’s findings indicate that Google Scholar is an example of one such database, since it obtained a large number of references to the relevant papers on the topic searched.

  7. Optimization Performance of a CO[subscript 2] Pulsed Tuneable Laser

    Science.gov (United States)

    Ribeiro, J. H. F.; Lobo, R. F. M.

    2009-01-01

    In this paper, a procedure is presented that will allow (i) the power and (ii) the energy of a pulsed and tuneable TEA CO[subscript 2] laser to be optimized. This type of laser represents a significant improvement in performance and portability. Combining a pulse mode with a grating tuning facility, it enables us to scan the working wavelength…

  8. Database principles programming performance

    CERN Document Server

    O'Neil, Patrick

    2014-01-01

    Database: Principles Programming Performance provides an introduction to the fundamental principles of database systems. This book focuses on database programming and the relationships between principles, programming, and performance.Organized into 10 chapters, this book begins with an overview of database design principles and presents a comprehensive introduction to the concepts used by a DBA. This text then provides grounding in many abstract concepts of the relational model. Other chapters introduce SQL, describing its capabilities and covering the statements and functions of the programmi

  9. Human Performance Event Database

    International Nuclear Information System (INIS)

    Trager, E. A.

    1998-01-01

    The purpose of this paper is to describe several aspects of a Human Performance Event Database (HPED) that is being developed by the Nuclear Regulatory Commission. These include the background, the database structure and basis for the structure, the process for coding and entering event records, the results of preliminary analyses of information in the database, and plans for the future. In 1992, the Office for Analysis and Evaluation of Operational Data (AEOD) within the NRC decided to develop a database for information on human performance during operating events. The database was needed to help classify and categorize the information to help feedback operating experience information to licensees and others. An NRC interoffice working group prepared a list of human performance information that should be reported for events and the list was based on the Human Performance Investigation Process (HPIP) that had been developed by the NRC as an aid in investigating events. The structure of the HPED was based on that list. The HPED currently includes data on events described in augmented inspection team (AIT) and incident investigation team (IIT) reports from 1990 through 1996, AEOD human performance studies from 1990 through 1993, recent NRR special team inspections, and licensee event reports (LERs) that were prepared for the events. (author)

  10. subscription information

    Indian Academy of Sciences (India)

    Administrator

    Subscription information for the Indian Academy of Sciences journals, including Sadhana (Proceedings in Engineering Sciences), the Journal of Science Education, the Journal of Astrophysics and Astronomy, the Journal of Biosciences, the Journal of Chemical Sciences, and the Journal of Earth System Science.

  11. 75 FR 3666 - Digital Performance Right in Sound Recordings and Ephemeral Recordings for a New Subscription...

    Science.gov (United States)

    2010-01-22

    ... additions to Sec. 383.3 read as follows: Sec. 383.3 Royalty fees for public performances of sound recordings.... 383.4 to read as follows: Sec. 383.4 Terms for making payment of royalty fees. (a) Terms in general... Collective, late fees, statements of account, audit and verification of royalty payments and distributions...

  12. 75 FR 14074 - Digital Performance Right in Sound Recordings and Ephemeral Recordings for a New Subscription...

    Science.gov (United States)

    2010-03-24

    ...). The additions to Sec. 383.3 read as follows: Sec. 383.3 Royalty fees for public performances of sound... Sec. 383.4 to read as follows: Sec. 383.4 Terms for making payment of royalty fees. (a) Terms in... payments to the Collective, late fees, statements of account, audit and verification of royalty payments...

  13. PostgreSQL database performance optimization

    OpenAIRE

    Wang, Qiang

    2011-01-01

    The thesis was requested by Marlevo Software Oy for a general description of the PostgreSQL database and its performance optimization techniques. Its purpose was to help new PostgreSQL users to quickly understand the system and to assist DBAs in improving database performance. The thesis was divided into two parts. The first part described PostgreSQL database optimization techniques in theory. In addition, popular tools were also introduced. This part was based on PostgreSQL documentation, r...

  14. Performance Enhancements for Advanced Database Management Systems

    OpenAIRE

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  15. Power Subscription Strategy

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1998-12-21

    This document lays out the Bonneville Power Administration's Power Subscription Strategy, a process that will enable the people of the Pacific Northwest to share the benefits of the Federal Columbia River Power System after 2001 while retaining those benefits within the region for future generations. The strategy also addresses how those who receive the benefits of the region's low-cost federal power should share a corresponding measure of the risks. This strategy seeks to implement the subscription concept created by the Comprehensive Review in 1996 through contracts for the sale of power and the distribution of federal power benefits in the deregulated wholesale electricity market. The success of the subscription process is fundamental to BPA's overall business purpose to provide public benefits to the Northwest through commercially successful businesses.

  16. OAS :: Email subscriptions

    Science.gov (United States)

  17. Basic database performance tuning - developer's perspective

    CERN Document Server

    Kwiatek, Michal

    2008-01-01

    This lecture discusses selected database performance issues from the developer's point of view: connection overhead, bind variables and SQL injection, making most of the optimizer with up-to-date statistics, reading execution plans. Prior knowledge of SQL is expected.
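    As a minimal sketch of the bind-variable point made in the lecture summary above, the example below uses Python's sqlite3 module as a stand-in database (the lecture itself is not tied to any particular language or driver): a parameterized query keeps the statement text constant, so the server can reuse a cached execution plan, and it closes the SQL-injection hole that string concatenation opens.

```python
import sqlite3

conn = sqlite3.connect(":memory:")          # stand-in for the real database
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

user_input = "alice' OR '1'='1"             # hostile input from a web form

# Unsafe: concatenating the value into the SQL text invites injection and
# gives the optimizer a brand-new statement to parse for every value.
unsafe_sql = "SELECT id FROM users WHERE name = '" + user_input + "'"

# Safer: a bind variable keeps the statement text constant (the cached plan
# can be reused) and the value is treated strictly as data, never as SQL.
rows = conn.execute("SELECT id FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)                                  # [] -- the injection attempt matches nothing
```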

  18. A trending database for human performance events

    International Nuclear Information System (INIS)

    Harrison, D.

    1993-01-01

    An effective Operations Experience program includes a standardized methodology for the investigation of unplanned events and a tool capable of retaining investigation data for the purpose of trending analysis. A database used in conjunction with a formalized investigation procedure for the purpose of trending unplanned event data is described. The database follows the structure of INPO's Human Performance Enhancement System (HPES) for investigations; its data-entry screens duplicate the HPES evaluation forms on-line, and all information pertaining to investigations is collected, retained and entered into the database using these forms. The database will be used for trending analysis to determine if any significant patterns exist, for tracking progress over time both within AECL and against industry standards, and for evaluating the success of corrective actions. Trending information will be used to help prevent similar occurrences.

  19. Specialist Bibliographic Databases

    OpenAIRE

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A.; Trukhachev, Vladimir I.; Kostyukova, Elena I.; Gerasimov, Alexey N.; Kitas, George D.

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and d...

  20. CERN GSM SUBSCRIPTIONS

    CERN Multimedia

    Labo Telecom

    2002-01-01

    AS Division has created a new EDH document for handling all GSM subscription requests and amendments. This procedure will enter force immediately and from now on the Labo Telecom stores will no longer be able to deal with requests submitted on paper forms. Detailed information on the subject can be found here and the Labo Telecom stores will continue to open every day between 11.00 a.m. and 12.00 midday. IT-CS-TEL, Labo Telecom

  1. Benchmarking database performance for genomic data.

    Science.gov (United States)

    Khushi, Matloob

    2015-06-01

    Genomic regions represent features such as gene annotations, transcription factor binding sites and epigenetic modifications. Performing various genomic operations, such as identifying overlapping/non-overlapping regions or nearest gene annotations, is a common research need. The data can be saved in a database system for easy management; however, there is no comprehensive built-in database algorithm at present to identify overlapping regions. Therefore I have developed a novel region-mapping (RegMap) SQL-based algorithm to perform genomic operations and have benchmarked the performance of different databases. Benchmarking identified that PostgreSQL extracts overlapping regions much faster than MySQL. Insertion and data uploads in PostgreSQL were also better, although the general searching capability of both databases was almost equivalent. In addition, using the algorithm, pair-wise overlaps of >1000 datasets of transcription factor binding sites and histone marks, collected from previous publications, were reported, and it was found that HNF4G significantly co-locates with cohesin subunit STAG1 (SA1). © 2015 Wiley Periodicals, Inc.
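    To illustrate the kind of genomic operation being benchmarked, here is a minimal interval-overlap query written against Python's sqlite3 module; the table and column names are invented for the example, and this is not the author's RegMap algorithm or schema.

```python
import sqlite3

# Two hypothetical tables of genomic regions (chromosome, start, stop).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE peaks (chrom TEXT, start INTEGER, stop INTEGER);
    CREATE TABLE genes (chrom TEXT, start INTEGER, stop INTEGER, name TEXT);
    INSERT INTO peaks VALUES ('chr1', 100, 200), ('chr1', 500, 600);
    INSERT INTO genes VALUES ('chr1', 150, 400, 'GENE_A'), ('chr1', 700, 900, 'GENE_B');
""")

# Two half-open intervals on the same chromosome overlap when each one
# starts before the other one stops.
overlaps = conn.execute("""
    SELECT p.chrom, p.start, p.stop, g.name
    FROM peaks AS p
    JOIN genes AS g
      ON g.chrom = p.chrom
     AND p.start < g.stop
     AND g.start < p.stop
""").fetchall()
print(overlaps)   # [('chr1', 100, 200, 'GENE_A')]
```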

  2. Further Analysis of Boiling Points of Small Molecules, CH[subscript w]F[subscript x]Cl[subscript y]Br[subscript z]

    Science.gov (United States)

    Beauchamp, Guy

    2005-01-01

    A study to present specific hypotheses that satisfactorily explain the boiling points of a number of molecules, CH[subscript w]F[subscript x]Cl[subscript y]Br[subscript z], having similar structure, and then analyze the model with the help of multiple linear regression (MLR), a data analysis tool. The MLR analysis was useful in selecting the…

  3. Evaluation of Maximal O[subscript 2] Uptake with Undergraduate Students at the University of La Reunion

    Science.gov (United States)

    Tarnus, Evelyne; Catan, Aurelie; Verkindt, Chantal; Bourdon, Emmanuel

    2011-01-01

    The maximal rate of O[subscript 2] consumption (VO[subscript 2max]) constitutes one of the oldest fitness indexes established for the measure of cardiorespiratory fitness and aerobic performance. Procedures have been developed in which VO[subscript 2max] is estimated from physiological responses during submaximal exercise. Generally, VO[subscript…

  4. Improved Information Retrieval Performance on SQL Database Using Data Adapter

    Science.gov (United States)

    Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.

    2018-02-01

    NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications grows. Most systems still use relational databases (RDBs), but as data volumes increase each year, systems handle big data with NoSQL databases to analyze and access data more quickly. NoSQL emerged as a result of the exponential growth of the internet and the development of web applications. Because the query syntax of a NoSQL database differs from that of an SQL database, switching normally requires code changes in the application. A data adapter allows applications to keep their SQL query syntax unchanged: it provides methods that synchronize the SQL database with the NoSQL database, as well as an interface through which the application can run SQL queries. Hence, this research applied a data adapter to synchronize data between a MySQL database and Apache HBase using a direct-access query approach, in which the system continues to accept application queries while the synchronization process is in progress. The tests show that the data adapter can synchronize the SQL database, MySQL, with the NoSQL database, Apache HBase. The system's memory usage ranged from 40% to 60%, and its processor usage ranged from 10% to 90%. The tests also showed the NoSQL database performing better than the SQL database.
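    A rough sketch of the data-adapter idea described above, with plain Python dictionaries standing in for the MySQL and HBase back ends; the class and method names are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: the application keeps issuing SQL-style calls while
# the adapter mirrors every write to a second (NoSQL-like) store and still
# answers queries during a bulk synchronization.
class DataAdapter:
    def __init__(self):
        self.sql_rows = {}      # stand-in for the MySQL table (key -> row)
        self.nosql_rows = {}    # stand-in for the HBase table

    def execute(self, op, key, row=None):
        if op == "INSERT":
            self.sql_rows[key] = row
            self.nosql_rows[key] = row   # keep both stores in step
        elif op == "SELECT":
            # Direct-access query: served even while a bulk sync runs.
            return self.sql_rows.get(key)

    def bulk_sync(self):
        # Copy anything missing from the NoSQL side (e.g. after downtime).
        for key, row in self.sql_rows.items():
            self.nosql_rows.setdefault(key, row)

adapter = DataAdapter()
adapter.execute("INSERT", "user:1", {"name": "alice"})
print(adapter.execute("SELECT", "user:1"))   # {'name': 'alice'}
```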

  5. Designing a database for performance assessment: Lessons learned from WIPP

    International Nuclear Information System (INIS)

    Martell, M.A.; Schenker, A.

    1997-01-01

    The Waste Isolation Pilot Plant (WIPP) Compliance Certification Application (CCA) Performance Assessment (PA) used a relational database that was originally designed only to supply the input parameters required for implementation of the PA codes. Reviewers used the database as a point of entry to audit quality assurance measures for control, traceability, and retrievability of input information used for analysis, and output/work products. During these audits it became apparent that modifications to the architecture and scope of the database would benefit the EPA regulator and other stakeholders when reviewing the recertification application. This paper contains a discussion of the WIPP PA CCA database and lessons learned for designing a database.

  6. Distributed MDSplus database performance with Linux clusters

    International Nuclear Information System (INIS)

    Minor, D.H.; Burruss, J.R.

    2006-01-01

    The staff at the DIII-D National Fusion Facility, operated for the USDOE by General Atomics, are investigating the use of grid computing and Linux technology to improve performance in our core data management services. We are in the process of converting much of our functionality to cluster-based and grid-enabled software. One of the most important pieces is a new distributed version of the MDSplus scientific data management system that is presently used to support fusion research in over 30 countries worldwide. To improve data handling performance, the staff is investigating the use of Linux clusters for both data clients and servers. The new distributed capability will result in better load balancing between these clients and servers, and more efficient use of network resources resulting in improved support of the data analysis needs of the scientific staff

  7. A performance evaluation of in-memory databases

    Directory of Open Access Journals (Sweden)

    Abdullah Talha Kabakus

    2017-10-01

    Full Text Available The popularity of NoSQL databases has increased due to the need for (1) processing vast amounts of data faster than relational database management systems by taking advantage of a highly scalable architecture, (2) a flexible (schema-free) data structure, and (3) low latency and high performance. Although memory usage is not a major criterion for evaluating the performance of algorithms, these databases serve data from memory, so their memory usage is also measured alongside the time taken to complete each operation, to reveal which one uses memory most efficiently. There currently exist over 225 NoSQL databases that provide different features and characteristics, so it is necessary to reveal which one provides better performance for different data operations. In this paper, we test the widely used in-memory databases to measure their performance in terms of (1) the time taken to complete operations, and (2) how efficiently they use memory during operations. As per the results reported in this paper, no single database provides the best performance for all data operations. The results also show that even though an RDBMS stores its data in memory, its overall performance is worse than that of the NoSQL databases.
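    A minimal illustration of the two measurements described above (time per operation and memory used), here with Python's time and tracemalloc modules and an in-memory SQLite table as the stand-in database; this is not the authors' benchmark harness, and tracemalloc only sees Python-level allocations, not the database engine's own memory.

```python
import sqlite3
import time
import tracemalloc

def measure(operation):
    """Return (elapsed_seconds, peak_traced_bytes) for a single callable."""
    tracemalloc.start()
    t0 = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()   # peak Python allocations
    tracemalloc.stop()
    return elapsed, peak

conn = sqlite3.connect(":memory:")   # stand-in for an in-memory database
conn.execute("CREATE TABLE kv (k INTEGER PRIMARY KEY, v TEXT)")

def insert_10k():
    conn.executemany("INSERT INTO kv VALUES (?, ?)",
                     ((i, f"value-{i}") for i in range(10_000)))

seconds, peak = measure(insert_10k)
print(f"insert: {seconds:.4f} s, peak traced memory: {peak / 1024:.0f} KiB")
```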

  8. Anon-Pass: Practical Anonymous Subscriptions.

    Science.gov (United States)

    Lee, Michael Z; Dunn, Alan M; Katz, Jonathan; Waters, Brent; Witchel, Emmett

    2013-12-31

    We present the design, security proof, and implementation of an anonymous subscription service. Users register for the service by providing some form of identity, which might or might not be linked to a real-world identity such as a credit card, a web login, or a public key. A user logs on to the system by presenting a credential derived from information received at registration. Each credential allows only a single login in any authentication window, or epoch . Logins are anonymous in the sense that the service cannot distinguish which user is logging in any better than random guessing. This implies unlinkability of a user across different logins. We find that a central tension in an anonymous subscription service is the service provider's desire for a long epoch (to reduce server-side computation) versus users' desire for a short epoch (so they can repeatedly "re-anonymize" their sessions). We balance this tension by having short epochs, but adding an efficient operation for clients who do not need unlinkability to cheaply re-authenticate themselves for the next time period. We measure performance of a research prototype of our protocol that allows an independent service to offer anonymous access to existing services. We implement a music service, an Android-based subway-pass application, and a web proxy, and show that adding anonymity adds minimal client latency and only requires 33 KB of server memory per active user.

  9. Subscriptions

    African Journals Online (AJOL)

    Dr. I. I. Olatunji-Bello Nigerian Journal of Health and Biomedical Sciences Department of Physiology, College of Medicine of the University of Lagos P. M. B. 12003, Lagos NIGERIA Email: yemibello@lycos.com. Individual Nigeria - N1,000 per annum U. K. - 50 per annum U.S.A - $100 per annum. Canada - $80 per annum

  10. Subscriptions

    African Journals Online (AJOL)

    School of Philosophy and Ethics University Of KwaZulu-Natal Durban 4041. South Africa. E-Mail: Clare@ukzn.ac.za. Members of the Philosophical Society of Southern Africa receive the Journal free of charge. ISSN: 0258-0136.

  11. Subscriptions

    African Journals Online (AJOL)

    the International Council on Archives is published once a year with the technical support of the National Archives and Records Service of South Africa. However, the views expressed here are those of the authors. Articles appearing in the Journal are annotated, indexed or abstracted in African Journals OnLine (AJOL) at: ...

  12. Comparison of Cloud backup performance and costs in Oracle database

    OpenAIRE

    Aljaž Zrnec; Dejan Lavbič

    2011-01-01

    Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Oracle database u...

  13. Performance analysis of different database in new internet mapping system

    Science.gov (United States)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. In order to better handle large volumes of mapping-entry update and query requests, the Mapping System of the New Internet must use a high-performance database. In this paper, we focus on the performance of three typical databases, Redis, SQLite, and MySQL, and the results show that a Mapping System based on different databases can be adapted to different needs according to the actual situation.
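    As a concrete illustration of the mapping operations listed above (store, add, update, delete, query), the sketch below keeps AID-to-RID entries in a SQLite table; the identifiers are invented, and the paper's actual comparison covers Redis, SQLite, and MySQL rather than this toy schema.

```python
import sqlite3

# Minimal AID -> RID mapping table, illustrating the basic operations the
# mapping system must support (add, update, query, delete).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT)")

def add(aid, rid):
    conn.execute("INSERT INTO mapping VALUES (?, ?)", (aid, rid))

def update(aid, rid):
    conn.execute("UPDATE mapping SET rid = ? WHERE aid = ?", (rid, aid))

def query(aid):
    row = conn.execute("SELECT rid FROM mapping WHERE aid = ?", (aid,)).fetchone()
    return row[0] if row else None

def delete(aid):
    conn.execute("DELETE FROM mapping WHERE aid = ?", (aid,))

add("aid-001", "rid-42")
update("aid-001", "rid-43")
print(query("aid-001"))   # rid-43
delete("aid-001")
```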

  14. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  15. Managing Distributed Systems with Smart Subscriptions

    Science.gov (United States)

    Filman, Robert E.; Lee, Diana D.; Swanson, Keith (Technical Monitor)

    2000-01-01

    We describe an event-based, publish-and-subscribe mechanism based on using 'smart subscriptions' to recognize weakly-structured events. We present a hierarchy of subscription languages (propositional, predicate, temporal and agent) and algorithms for efficiently recognizing event matches. This mechanism has been applied to the management of distributed applications.

  16. Data Preparation Process for the Buildings Performance Database

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Travis; Dunn, Laurel; Mercado, Andrea; Brown, Richard E.; Mathew, Paul

    2014-06-30

    The Buildings Performance Database (BPD) includes empirically measured data from a variety of data sources with varying degrees of data quality and data availability. The purpose of the data preparation process is to maintain data quality within the database and to ensure that all database entries have sufficient data for meaningful analysis and for the database API. Data preparation is a systematic process of mapping data into the Building Energy Data Exchange Specification (BEDES), cleansing data using a set of criteria and rules of thumb, and deriving values such as energy totals and dominant asset types. The data preparation process takes the most effort and time; therefore, most of the cleansing process has been automated. The process also needs to adapt as more data is contributed to the BPD and as building technologies change over time. The data preparation process is an essential step between data contributed by providers and data published to the public in the BPD.
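    A hypothetical sketch of what one cleansing rule plus one derived value might look like in such a pipeline; the field names, threshold, and fuel list are illustrative assumptions, not the actual BEDES mapping or BPD criteria.

```python
# Hypothetical sketch of one cleansing step and one derived value in a
# building-data preparation pipeline; field names and thresholds are
# illustrative only, not the actual BEDES mapping or BPD rules.
def prepare(record):
    cleaned = dict(record)

    # Cleansing rule of thumb: drop records with implausible floor area.
    area = cleaned.get("floor_area_sqft")
    if area is None or not (100 <= area <= 10_000_000):
        return None

    # Derived value: total site energy from the individual fuel fields.
    fuels = ("electricity_kbtu", "natural_gas_kbtu", "district_steam_kbtu")
    cleaned["site_energy_total_kbtu"] = sum(cleaned.get(f, 0) for f in fuels)
    return cleaned

print(prepare({"floor_area_sqft": 52_000,
               "electricity_kbtu": 1_200_000,
               "natural_gas_kbtu": 300_000}))
```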

  17. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  18. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, who states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the secure

  19. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.

  20. Downsizing a database platform for increased performance and decreased costs

    Energy Technology Data Exchange (ETDEWEB)

    Miller, M.M.; Tolendino, L.F.

    1993-06-01

    Technological advances in the world of microcomputers have brought forth affordable systems and powerful software that can compete with the more traditional world of minicomputers. This paper describes an effort at Sandia National Laboratories to decrease operational and maintenance costs and increase performance by moving a database system from a minicomputer to a microcomputer.

  1. Sorption, Diffusion and Solubility Databases for Performance Assessment

    International Nuclear Information System (INIS)

    Garcia Gutierrez, M.

    2000-01-01

    This report presents deterministic and probabilistic databases for application in the Performance Assessment of high-level radioactive waste disposal. The work includes a theoretical description of the sorption, diffusion and solubility phenomena of radionuclides in geological media. The report presents and compares the databases of different nuclear waste management agencies, describes the materials in the Spanish reference system, and gives the results for sorption, diffusion and solubility in this system, with both deterministic and probabilistic approaches. The probabilistic approach is presented in the form of probability density functions (pdf). (Author) 52 refs

  2. Understanding, modeling, and improving main-memory database performance

    OpenAIRE

    Manegold, S.

    2002-01-01

    textabstractDuring the last two decades, computer hardware has experienced remarkable developments. Especially CPU (clock-)speed has been following Moore's Law, i.e., doubling every 18 months; and there is no indication that this trend will change in the foreseeable future. Recent research has revealed that database performance, even with main-memory based systems, can hardly benefit from the ever increasing CPU power. The reason for this is that the performance of other hardware components h...

  3. Utilization of office computer for journal subscription

    International Nuclear Information System (INIS)

    Yonezawa, Minoru; Shimizu, Tokiyo

    1993-01-01

    An integrated library automation system has been operating in the Japan Atomic Energy Research Institute library, consisting of three subsystems for serials control, book acquisition and circulation control. Functions of the serials control subsystem have been improved to reduce the workload of journal subscription work. Subscription price (both in foreign currency and Japanese yen), conversion factor, and foreign currency exchange rate are newly introduced as data elements in a master file for automatic calculation and totalization in the system, e.g. conversion of the subscription price from foreign currency into Japanese yen. Some kinds of journal lists are also printed out, such as a journal subscription list, a journal distribution list for each laboratory, etc. (author)

  4. Comparison of Cloud backup performance and costs in Oracle database

    Directory of Open Access Journals (Sweden)

    Aljaž Zrnec

    2011-06-01

    Full Text Available Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Oracle database using Amazon S3 and EC2 cloud services. The costs estimation was performed on the basis of the prices published on Amazon S3 and Amazon EC2 sites.

  5. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  6. Vibrational Spectroscopy of the CCl[subscript 4] ν[subscript 1] Mode: Effect of Thermally Populated Vibrational States

    Science.gov (United States)

    Gaynor, James D.; Wetterer, Anna M.; Cochran, Rea M.; Valente, Edward J.; Mayer, Steven G.

    2015-01-01

    In our previous article on CCl[subscript 4] in this "Journal," we presented an investigation of the fine structure of the symmetric stretch of carbon tetrachloride (CCl[subscript 4]) due to isotopic variations of chlorine in C[superscript 35]Cl[subscript x][superscript 37]Cl[subscript 4-x]. In this paper, we present an investigation of…

  7. Development of comprehensive material performance database for nuclear applications

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime

    1993-01-01

    This paper introduces the present status of the comprehensive material performance database for nuclear applications, named the JAERI Material Performance Database (JMPD), and gives examples of its utilization. The JMPD has been developed since 1986 in JAERI with a view to utilizing various kinds of characteristic data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. In order to improve the user-friendliness of the retrieval system, menu-selection type procedures have been developed in which knowledge of the system or the data structures is not required of end-users. As to utilization of the JMPD, two types of data analyses are described: (1) a series of statistical analyses was performed in order to estimate the design values of both the yield strength (Sy) and the tensile strength (Su) for aluminum alloys, which are widely used as structural materials for research reactors; (2) statistical analyses were performed using the cyclic crack growth rate data for nuclear pressure vessel steels, and comparisons were made of the variability and/or reproducibility of the data obtained by ΔK-increasing and ΔK-constant type tests. (author)

  8. JAERI Material Performance Database (JMPD); outline of the system

    International Nuclear Information System (INIS)

    Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime.

    1991-01-01

    The JAERI Material Performance Database (JMPD) has been developed since 1986 in JAERI with a view to utilizing the various kinds of characteristic data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. JMPD currently serves the following data: (1) data yielded from the research activities of JAERI, including fatigue crack growth data of LWR pressure vessel materials as well as creep and fatigue data of the alloy developed for the High Temperature Gas-cooled Reactor (HTGR), Hastelloy XR; (2) data on environmentally assisted cracking of LWR materials arranged by the Electric Power Research Institute (EPRI), including fatigue crack growth data (3000 tests), stress corrosion data (500 tests) and Slow Strain Rate Technique (SSRT) data (1000 tests). In order to improve the user-friendliness of the retrieval system, menu-selection type procedures have been developed in which knowledge of the system and data structures is not required of end-users. In addition, retrieval via database commands in Structured Query Language (SQL) is supported by the relational database management system. In JMPD the retrieved data can be processed readily through supporting systems for graphical and statistical analyses. The present report outlines JMPD and describes procedures for data retrieval and analyses utilizing JMPD. (author)

  9. Open access versus subscription journals: a comparison of scientific impact.

    Science.gov (United States)

    Björk, Bo-Christer; Solomon, David

    2012-07-17

    In the past few years there has been an ongoing debate as to whether the proliferation of open access (OA) publishing would damage the peer review system and put the quality of scientific journal publishing at risk. Our aim was to inform this debate by comparing the scientific impact of OA journals with subscription journals, controlling for journal age, the country of the publisher, discipline and (for OA publishers) their business model. The 2-year impact factors (the average number of citations to the articles in a journal) were used as a proxy for scientific impact. The Directory of Open Access Journals (DOAJ) was used to identify OA journals as well as their business model. Journal age and discipline were obtained from the Ulrich's periodicals directory. Comparisons were performed on the journal level as well as on the article level where the results were weighted by the number of articles published in a journal. A total of 610 OA journals were compared with 7,609 subscription journals using Web of Science citation data while an overlapping set of 1,327 OA journals were compared with 11,124 subscription journals using Scopus data. Overall, average citation rates, both unweighted and weighted for the number of articles per journal, were about 30% higher for subscription journals. However, after controlling for discipline (medicine and health versus other), age of the journal (three time periods) and the location of the publisher (four largest publishing countries versus other countries) the differences largely disappeared in most subcategories except for journals that had been launched prior to 1996. OA journals that fund publishing with article processing charges (APCs) are on average cited more than other OA journals. In medicine and health, OA journals founded in the last 10 years are receiving about as many citations as subscription journals launched during the same period. Our results indicate that OA journals indexed in Web of Science and/or Scopus are
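    To make the article-level weighting described above concrete, the sketch below computes an article-weighted average citation rate for two groups of journals; the numbers are invented purely for illustration and are not taken from the study.

```python
# Article-weighted average citation rate for a group of journals, as in
# the comparison described above; the figures are made up to illustrate
# the weighting, not taken from the study.
def weighted_citation_rate(journals):
    """journals: list of (impact_factor, articles_per_year) pairs."""
    total_citations = sum(ifactor * n for ifactor, n in journals)
    total_articles = sum(n for _, n in journals)
    return total_citations / total_articles

oa_group = [(1.2, 300), (2.5, 80)]
subscription_group = [(1.8, 150), (3.0, 400)]
print(round(weighted_citation_rate(oa_group), 2))            # 1.47
print(round(weighted_citation_rate(subscription_group), 2))  # 2.67
```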

  10. Nitration of Phenols Using Cu(NO[subscript 3])[subscript 2]: Green Chemistry Laboratory Experiment

    Science.gov (United States)

    Yadav, Urvashi; Mande, Hemant; Ghalsasi, Prasanna

    2012-01-01

    An easy-to-complete, microwave-assisted, green chemistry, electrophilic nitration method for phenol using Cu(NO[subscript 3])[subscript 2] in acetic acid is discussed. With this experiment, students clearly understand the mechanism underlying the nitration reaction in one laboratory session. (Contains 4 schemes.)

  11. Oracle Database 11gR2 Performance Tuning Cookbook

    CERN Document Server

    Fiorillo, Ciro

    2012-01-01

    In this book you will find both examples and theoretical concepts covered. Every recipe is based on a script/procedure explained step-by-step, with screenshots, while theoretical concepts are explained in the context of the recipe, to explain why a solution performs better than another. This book is aimed at software developers, software and data architects, and DBAs who are using or are planning to use the Oracle Database, who have some experience and want to solve performance problems faster and in a rigorous way. If you are an architect who wants to design better applications, a DBA who is

  12. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  13. Internet Access from CERN GSM subscriptions

    CERN Multimedia

    IT Department

    2008-01-01

    The data service on GSM subscriptions has been improved, allowing CERN users to access the Internet directly. A CERN GSM subscription with data option now allows you to connect to the Internet from a mobile phone or a PC equipped with a GSM modem. The previous access (CERN intranet) still exists. To get access to the new service, you will find all the information on configurations at: http://cern.ch/gprs The use of this service on the Sunrise network is charged on a flat-rate basis (no extra charge related to the volume of downloaded data). Depending on your CERN subscription type (standard or master), you can also connect to foreign GSM data networks (roaming), but this is strongly discouraged, except where absolutely necessary, due to international roaming charges. Telecom Section, IT/CS

  14. YUCSA: A CLIPS expert database system to monitor academic performance

    Science.gov (United States)

    Toptsis, Anestis A.; Ho, Frankie; Leindekar, Milton; Foon, Debra Low; Carbonaro, Mike

    1991-01-01

    The York University CLIPS Student Administrator (YUCSA), an expert database system implemented in C Language Integrated Processing System (CLIPS), for monitoring the academic performance of undergraduate students at York University, is discussed. The expert system component in the system has already been implemented for two major departments, and it is under testing and enhancement for more departments. Also, more elaborate user interfaces are under development. We describe the design and implementation of the system, problems encountered, and immediate future plans. The system has excellent maintainability and it is very efficient, taking less than one minute to complete an assessment of one student.

  15. The SACADA database for human reliability and human performance

    International Nuclear Information System (INIS)

    James Chang, Y.; Bley, Dennis; Criscione, Lawrence; Kirwan, Barry; Mosleh, Ali; Madary, Todd; Nowell, Rodney; Richards, Robert; Roth, Emilie M.; Sieben, Scott; Zoulis, Antonios

    2014-01-01

    Lack of appropriate and sufficient human performance data has been identified as a key factor affecting human reliability analysis (HRA) quality especially in the estimation of human error probability (HEP). The Scenario Authoring, Characterization, and Debriefing Application (SACADA) database was developed by the U.S. Nuclear Regulatory Commission (NRC) to address this data need. An agreement between NRC and the South Texas Project Nuclear Operating Company (STPNOC) was established to support the SACADA development with aims to make the SACADA tool suitable for implementation in the nuclear power plants' operator training program to collect operator performance information. The collected data would support the STPNOC's operator training program and be shared with the NRC for improving HRA quality. This paper discusses the SACADA data taxonomy, the theoretical foundation, the prospective data to be generated from the SACADA raw data to inform human reliability and human performance, and the considerations on the use of simulator data for HRA. Each SACADA data point consists of two information segments: context and performance results. Context is a characterization of the performance challenges to task success. The performance results are the results of performing the task. The data taxonomy uses a macrocognitive functions model for the framework. At a high level, information is classified according to the macrocognitive functions of detecting the plant abnormality, understanding the abnormality, deciding the response plan, executing the response plan, and team related aspects (i.e., communication, teamwork, and supervision). The data are expected to be useful for analyzing the relations between context, error modes and error causes in human performance

  16. Pursuit of a scalable high performance multi-petabyte database

    CERN Document Server

    Hanushevsky, A

    1999-01-01

    When the BaBar experiment at the Stanford Linear Accelerator Center starts in April 1999, it will generate approximately 200 TB/year of data at a rate of 10 MB/sec for 10 years. A mere six years later, CERN, the European Laboratory for Particle Physics, will start an experiment whose data storage requirements are two orders of magnitude larger. In both experiments, all of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). The quantity and rate at which the data is produced requires the use of a high performance hierarchical mass storage system in place of a standard Unix file system. Furthermore, the distributed nature of the experiment, involving scientists from 80 Institutions in 10 countries, also requires an extended security infrastructure not commonly found in standard Unix file systems. The combination of challenges that must be overcome in order to effectively deal with a multi-petabyte object oriented database is substantial. Our particular approach...

  17. Providing Availability, Performance, and Scalability By Using Cloud Database

    OpenAIRE

    Prof. Dr. Alaa Hussein Al-Hamami; Rafal Adeeb Al-Khashab

    2014-01-01

    With the development of the internet, new technologies and concepts have drawn the attention of all internet users, especially in information technology; one such concept is the cloud. Cloud computing includes different components, of which the cloud database has become an important one. A cloud database is a distributed database that delivers computing as a service, or in the form of a virtual machine image instead of a product, via the internet; its advantage is that the database can...

  18. 47 CFR 73.641 - Subscription TV definitions.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Subscription TV definitions. 73.641 Section 73.641 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES RADIO BROADCAST SERVICES Television Broadcast Stations § 73.641 Subscription TV definitions. (a) Subscription...

  19. Using a Subscription Agent for E-Journal Management

    Science.gov (United States)

    Grogg, Jill E.

    2010-01-01

    Subscription agents have had to reinvent themselves over the past 15 years as the numbers of print subscriptions have dramatically dwindled. Many libraries have chosen to bypass the subscription agent and its extra fees in favor of dealing directly with the publisher for e-journal and e-journal package procurement and management. Especially in…

  20. Sorption databases for increasing confidence in performance assessment - 16053

    International Nuclear Information System (INIS)

    Richter, Anke; Brendler, Vinzenz; Nebelung, Cordula; Payne, Timothy E.; Brasser, Thomas

    2009-01-01

    requires that all mineral constituents of the solid phase are characterized. Another issue is the large number of required parameters combined with time-consuming iterations. Addressing both approaches, we present two sorption databases, developed mainly by or under participation of the Forschungszentrum Dresden-Rossendorf (FZD). Both databases are implemented as relational databases, assist identification of critical data gaps and the evaluation of existing parameter sets, provide web-based data search and analyses, and permit the comparison of SCM predictions with Kd values. RES3T (Rossendorf Expert System for Surface and Sorption Thermodynamics) is a digitized thermodynamic sorption database (see www.fzd.de/db/RES3T.login) and is free of charge. It is mineral-specific and can therefore also be used for additive models of more complex solid phases. ISDA (Integrated Sorption Database System) connects SCM with the Kd concept but focuses on conventional Kd. The integrated datasets are accessible through a unified user interface. An application case, Kd values in Performance Assessment, is given. (authors)

  1. Database in Artificial Intelligence.

    Science.gov (United States)

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., annual fee entitles user to unlimited access to database, document provision, and printed awareness…

  2. The high-performance database archiver for the LHC experiments

    CERN Document Server

    González-Berges, M

    2007-01-01

    Each of the Large Hadron Collider (LHC) experiments will be controlled by a large distributed system built with the Supervisory Control and Data Acquisition (SCADA) tool Prozeßvisualisierungs- und Steuerungsystem (PVSS). There will be in the order of 150 computers and one million input/output parameters per experiment. The values read from the hardware, the alarms generated and the user actions will be archived for the later physics analysis, the operation and the debugging of the control system itself. Although the original PVSS implementation of a database archiver was appropriate for standard industrial use, the performance was not sufficient for the experiments. A collaboration was setup between CERN and ETM, the company that develops PVSS. Changes in the architecture and several optimizations were made and tested in a system of a comparable size to the final ones. As a result, we have been able to improve the performance by more than one order of magnitude, and what is more important, we now have a scal...

  3. ISSUES IN MOBILE DISTRIBUTED REAL TIME DATABASES: PERFORMANCE AND REVIEW

    OpenAIRE

    VISHNU SWAROOP; Gyanendra Kumar Gupta; UDAI SHANKER

    2011-01-01

    The increase in small, handy electronic devices in computing has made computing more popular and useful in business. Tremendous advances in wireless networks and portable computing devices have led to the development of mobile computing. Support for real-time database systems depending upon timing constraints, the availability of distributed database data, and ubiquitous computing have together pulled the mobile database concept into a new form of technology: mobile distributed ...

  4. Performance Assessment of Dynaspeak Speech Recognition System on Inflight Databases

    National Research Council Canada - National Science Library

    Barry, Timothy

    2004-01-01

    .... To aid in the assessment of various commercially available speech recognition systems, several aircraft speech databases have been developed at the Air Force Research Laboratory's Human Effectiveness Directorate...

  5. Database usage and performance for the Fermilab Run II experiments

    International Nuclear Information System (INIS)

    Bonham, D.; Box, D.; Gallas, E.; Guo, Y.; Jetton, R.; Kovich, S.; Kowalkowski, J.; Kumar, A.; Litvintsev, D.; Lueking, L.; Stanfield, N.; Trumbo, J.; Vittone-Wiersma, M.; White, S.P.; Wicklund, E.; Yasuda, T.; Maksimovic, P.

    2004-01-01

    The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has represented major challenges to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described.

  6. Specialist Bibliographic Databases.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Trukhachev, Vladimir I; Kostyukova, Elena I; Gerasimov, Alexey N; Kitas, George D

    2016-05-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing oneself with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and may apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.

  7. Specialist Bibliographic Databases

    Science.gov (United States)

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Familiarizing oneself with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and may apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls. PMID:27134485

  8. Performance Evaluation of Cloud Database and Traditional Database in terms of Response Time while Retrieving the Data

    OpenAIRE

    Donkena, Kaushik; Gannamani, Subbarayudu

    2012-01-01

    Context: There has been an exponential growth in the size of databases in recent times, and the same amount of growth is expected in the future. There has been a firm drop in the storage cost followed by a rapid increase in the storage capacity. The entry of Cloud computing in recent times has changed the equation. The performance of the database plays a vital role in the competition. In this research, an attempt has been made to evaluate and compare the performance of the traditional data...

  9. On the use of databases about research performance

    NARCIS (Netherlands)

    Rodela, Romina

    2016-01-01

    The accuracy of interdisciplinarity measurements depends on how well the data is used for this purpose and whether it can meaningfully inform about work that crosses disciplinary domains. At present, there are no ad hoc databases compiling information only and exclusively about interdisciplinary

  10. Managing XML Data to optimize Performance into Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    Iuliana BOTHA

    2011-06-01

    Full Text Available This paper proposes some approaches to managing XML data in order to optimize performance in object-relational databases. It details the possibility of storing XML data in such databases, using an Oracle database for exemplification, and tests several techniques for optimizing queries over XMLType tables, such as indexing and partitioning.
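
    As an illustration of the general idea behind such optimizations (not the paper's Oracle-specific XMLType indexing or partitioning), the sketch below shreds one frequently queried XML element into an indexed relational column at insert time; sqlite3 and all table, column, and element names are stand-ins chosen for this example.

```python
# Minimal sketch (not the paper's Oracle setup): store XML documents in a
# relational table, shred one frequently queried value into its own indexed
# column, and query via that column. Table and element names are hypothetical.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, doc TEXT, customer TEXT)")

docs = [f"<order><customer>C{i % 100}</customer><total>{i}</total></order>"
        for i in range(10_000)]
for i, doc in enumerate(docs):
    customer = ET.fromstring(doc).findtext("customer")   # shred on insert
    cur.execute("INSERT INTO orders (id, doc, customer) VALUES (?, ?, ?)",
                (i, doc, customer))

# Index the shredded column so lookups avoid parsing XML at query time.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
conn.commit()

# Indexed relational predicate instead of scanning and parsing every document.
cur.execute("SELECT COUNT(*) FROM orders WHERE customer = ?", ("C42",))
print(cur.fetchone()[0])
```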

  11. Measurement of Levitation Forces of High-"T[subscript c]" Superconductors

    Science.gov (United States)

    Becker, M.; Koblischka, M. R.; Hartmann, U.

    2010-01-01

    We show the construction of a so-called levitation balance which is capable of measuring the levitation forces between a permanent magnet and a superconducting high-T[subscript c] thin film sample. The underlying theoretical basis is discussed in detail. The experiment is performed as an introductory physics experiment for school students as well…

  12. New types of subscriptions for CERN GSM

    CERN Multimedia

    IT Department

    2010-01-01

    A recent renegotiation of our commercial conditions with our mobile telephony operator allows us today to deploy new GSM mobile services, reduce communication costs, as well as put in place a new subscription system. First of all, the "email to SMS" service has already been extended to all Swiss numbers. This service allows you to send SMS messages (Short Message Service) to any Swiss mobile telephone from your CERN e-mail account. For further details, please refer to the web site http://cern.ch/sms. The sending of MMS messages (Multi-media Message Service) will be activated by default on all CERN subscriptions by the end of March 2010. This service allows users to attach to a text message an image, a video or an audio recording. All the necessary details for configuring this new service on CERN mobile phones will be published on the web site http://cern.ch/mms. Concerning mobile service costs, new rates have been put in place since 1st January 2010. All tariffs have dramatically decrea...

  13. 41 CFR 101-25.108 - Multiyear subscriptions for publications.

    Science.gov (United States)

    2010-07-01

    41 CFR 101-25.108, Public Contracts and Property Management, General Policies: Multiyear subscriptions for publications. Subscriptions for periodicals, newspapers, and other publications for which it is known in advance that a continuing requirement...

  14. 46 CFR 201.42 - Subscription, authentication of documents.

    Science.gov (United States)

    2010-10-01

    46 CFR 201.42, Shipping, Maritime Administration, Department of Transportation, Policy, Practice and...: Subscription, authentication of documents. (a) Documents filed shall be subscribed: (1) By the person or...

  15. L[subscript 1] and L[subscript 2] Spoken Word Processing: Evidence from Divided Attention Paradigm

    Science.gov (United States)

    Shafiee Nahrkhalaji, Saeedeh; Lotfi, Ahmad Reza; Koosha, Mansour

    2016-01-01

    The present study aims to reveal some facts concerning first language (L[subscript 1]) and second language (L[subscript 2]) spoken-word processing in unbalanced proficient bilinguals using behavioral measures. The intention here is to examine the effects of auditory repetition word priming and semantic priming in first and second languages of…

  16. Performance assessment of EMR systems based on post-relational database.

    Science.gov (United States)

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all the system users to access data, with a fast response time, anywhere and at any time. Performance tests of databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the identical database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed perfect performance in the real-time EMR system.

  17. Discrete Optimization of Internal Part Structure via SLM Unit Structure-Performance Database

    Directory of Open Access Journals (Sweden)

    Li Tang

    2018-01-01

    Full Text Available The structural optimization of the internal structure of parts based on three-dimensional (3D) printing has been recognized as being important in the field of mechanical design. The purpose of this paper is to present the creation of a unit structure-performance database based on selective laser melting (SLM), which contains various structural units with different functions and records their structure and performance characteristics so that the internal structure of parts can be optimized directly according to the database. The method of creating the unit structure-performance database is introduced in this paper, and several structural units of the database are presented. The bow structure unit is used as an example to show how to create the structure-performance data for a unit. Some samples of the bow structure unit were designed and manufactured by SLM. These samples were tested in a WDW-100 compression testing machine to obtain their performance characteristics. After this, all data regarding unit structure parameters, weight, performance characteristics, and other properties were collected to establish a complete data set for the bow structure unit in the unit structure-performance database. Furthermore, an aircraft part was conveniently reconstructed to be more lightweight according to the unit structure-performance database. Its weight was reduced by 36.8% when compared with the original structure, while the strength far exceeded the requirements.

  18. GUC100 multisensor fingerprint database for in-house (semipublic) performance test

    OpenAIRE

    Gafurov D.; Bours P.; Yang B.; Busch C.

    2010-01-01

    For evaluating the biometric performance of biometric components and systems, the availability of independent databases and, desirably, independent evaluators is important. Both databases of significant size and independent testing institutions provide the preconditions for fair and unbiased benchmarking. In order to show the generalization capabilities of the system under test, it is essential that algorithm developers do not have access to the testing database, and thus the risk of tuned algorithms...

  19. Performance Evaluation of a Database System in a Multiple Backend Configurations,

    Science.gov (United States)

    1984-10-01

    leaving a system process, the internal performance measurements of MMSD have been carried out. Methodologies for constructing test databases... access directory data via the AT, EDIT, and CDT. In designing the test database, one of the key concepts is the choice of the directory attributes in... internal timing. These requests are selected since they retrieve the smallest portion of the test database and the processing time for each request is

  20. Frontier: High Performance Database Access Using Standard Web Components in a Scalable Multi-Tier Architecture

    International Nuclear Information System (INIS)

    Kosyakov, S.; Kowalkowski, J.; Litvintsev, D.; Lueking, L.; Paterno, M.; White, S.P.; Autio, Lauri; Blumenfeld, B.; Maksimovic, P.; Mathis, M.

    2004-01-01

    A high performance system has been assembled using standard web components to deliver database information to a large number of broadly distributed clients. The CDF Experiment at Fermilab is establishing processing centers around the world, imposing a high demand on their database repository. For delivering read-only data, such as calibrations, trigger information, and run conditions data, we have abstracted the interface that clients use to retrieve data objects. A middle tier is deployed that translates client requests into database-specific queries and returns the data to the client as XML datagrams. The database connection management, request translation, and data encoding are accomplished in servlets running under Tomcat. Squid proxy caching layers are deployed near the Tomcat servers, as well as close to the clients, to significantly reduce the load on the database and provide a scalable deployment model. Details of the system's construction and use are presented, including its architecture, design, interfaces, administration, performance measurements, and deployment plan.
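
    The scalability argument rests on answering repeated read-only requests from caches instead of the database. The toy sketch below illustrates only that read-through caching pattern; it is not Frontier's servlet/Squid implementation, and the backend function, class name, and latencies are invented for the example.

```python
# Toy read-through cache in the spirit of a multi-tier setup: the "proxy"
# answers repeated read-only queries from a local cache instead of hitting
# the backend database each time. Names and the fake backend are hypothetical.
import time

def backend_query(sql: str) -> str:
    """Stand-in for the database tier: slow, returns an XML-ish datagram."""
    time.sleep(0.05)                      # simulate query latency
    return f"<result query={sql!r} rows='...'/>"

class CachingProxy:
    def __init__(self):
        self._cache: dict = {}

    def query(self, sql: str) -> str:
        if sql not in self._cache:        # cache miss: go to the database tier
            self._cache[sql] = backend_query(sql)
        return self._cache[sql]           # cache hit: no database load

proxy = CachingProxy()
for _ in range(3):                        # identical read-only requests
    start = time.perf_counter()
    proxy.query("SELECT * FROM calibrations WHERE run = 1234")
    print(f"{(time.perf_counter() - start) * 1000:.1f} ms")
```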

  1. OPERA-a human performance database under simulated emergencies of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea

    2007-01-01

    In complex systems such as the nuclear and chemical industries, the importance of human performance related problems is well recognized. Thus a lot of effort has been spent in this area, and one of the main approaches to unraveling human performance related problems is the execution of HRA. Unfortunately, a lack of prerequisite information has been pointed out as the most critical problem in conducting HRA. From this necessity, the OPERA database, which can provide operators' performance data obtained under simulated emergencies, has been developed. In this study, typical operators' performance data that are available from the OPERA database are briefly explained. After that, in order to ensure the appropriateness of the OPERA database, operators' performance data from OPERA are compared with those of other studies and real events. As a result, it is believed that operators' performance data of the OPERA database are fairly comparable to those of other studies and real events. Therefore it is meaningful to expect that the OPERA database can be used as a serviceable data source for scrutinizing human performance related problems, including HRA.

  2. An Integrated Database of Unit Training Performance: Description an Lessons Learned

    National Research Council Canada - National Science Library

    Leibrecht, Bruce

    1997-01-01

    The Army Research Institute (ARI) has developed a prototype relational database for processing and archiving unit performance data from home station, training area, simulation based, and Combat Training Center training exercises...

  3. High performance technique for database applicationsusing a hybrid GPU/CPU platform

    KAUST Repository

    Zidan, Mohammed A.

    2012-07-28

    Many database applications, such as sequence comparison, sequence searching, and sequence matching, process large database sequences. We introduce a novel and efficient technique to improve the performance of database applications by using a hybrid GPU/CPU platform. In particular, our technique solves the problem of the low efficiency resulting from running short-length sequences in a database on a GPU. To verify our technique, we applied it to the widely used Smith-Waterman algorithm. The experimental results show that our hybrid GPU/CPU technique improves the average performance by a factor of 2.2, and improves the peak performance by a factor of 2.8 when compared to earlier implementations. Copyright © 2011 by ASME.
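
    For reference, a plain CPU implementation of the Smith-Waterman local-alignment score (the kernel the paper accelerates) is sketched below; the hybrid GPU/CPU scheduling that produces the reported speed-ups is not reproduced, and the match/mismatch/gap scores are arbitrary assumptions.

```python
# Plain-Python Smith-Waterman local-alignment score (linear gap penalty).
# This is only the CPU reference kernel; how short and long sequences are
# scheduled across GPU and CPU, which is the paper's contribution, is not
# reproduced here. Match/mismatch/gap scores are arbitrary assumptions.
def smith_waterman_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
    rows, cols = len(a) + 1, len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, rows):
        curr = [0] * cols
        for j in range(1, cols):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

print(smith_waterman_score("GGTTGACTA", "TGTTACGG"))  # small example query
```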

  4. Sustainable funding for biocuration: The Arabidopsis Information Resource (TAIR) as a case study of a subscription-based funding model.

    Science.gov (United States)

    Reiser, Leonore; Berardini, Tanya Z; Li, Donghui; Muller, Robert; Strait, Emily M; Li, Qian; Mezheritsky, Yarik; Vetushko, Andrey; Huala, Eva

    2016-01-01

    Databases and data repositories provide essential functions for the research community by integrating, curating, archiving and otherwise packaging data to facilitate discovery and reuse. Despite their importance, funding for maintenance of these resources is increasingly hard to obtain. Fueled by a desire to find long term, sustainable solutions to database funding, staff from the Arabidopsis Information Resource (TAIR) founded the nonprofit organization Phoenix Bioinformatics, using TAIR as a test case for user-based funding. Subscription-based funding has been proposed as an alternative to grant funding but its application has been very limited within the nonprofit sector. Our testing of this model indicates that it is a viable option, at least for some databases, and that it is possible to strike a balance that maximizes access while still incentivizing subscriptions. One year after transitioning to subscription support, TAIR is self-sustaining and Phoenix is poised to expand and support additional resources that wish to incorporate user-based funding strategies. Database URL: www.arabidopsis.org. © The Author(s) 2016. Published by Oxford University Press.

  5. The IPE Database: providing information on plant design, core damage frequency and containment performance

    International Nuclear Information System (INIS)

    Lehner, J.R.; Lin, C.C.; Pratt, W.T.; Su, T.; Danziger, L.

    1996-01-01

    A database, called the IPE Database, has been developed that stores data obtained from the Individual Plant Examinations (IPEs) which licensees of nuclear power plants have conducted in response to the Nuclear Regulatory Commission's (NRC) Generic Letter GL88-20. The IPE Database is a collection of linked files which store information about plant design, core damage frequency (CDF), and containment performance in a uniform, structured way. The information contained in the various files is based on data contained in the IPE submittals. The information extracted from the submittals and entered into the IPE Database can be manipulated so that queries regarding individual or groups of plants can be answered using the IPE Database.

  6. A high performance, ad-hoc, fuzzy query processing system for relational databases

    Science.gov (United States)

    Mansfield, William H., Jr.; Fleischman, Robert M.

    1992-01-01

    Database queries involving imprecise or fuzzy predicates are currently an evolving area of academic and industrial research. Such queries place severe stress on the indexing and I/O subsystems of conventional database environments since they involve the search of large numbers of records. The Datacycle architecture and research prototype is a database environment that uses filtering technology to perform an efficient, exhaustive search of an entire database. It has recently been modified to include fuzzy predicates in its query processing. The approach obviates the need for complex index structures, provides unlimited query throughput, permits the use of ad-hoc fuzzy membership functions, and provides a deterministic response time largely independent of query complexity and load. This paper describes the Datacycle prototype implementation of fuzzy queries and some recent performance results.
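
    A minimal sketch of the querying style described above follows: an ad-hoc fuzzy predicate is evaluated exhaustively over every record, each row receives a membership grade, and rows above a threshold are returned ranked by grade. The data, the triangular membership function, and the threshold are invented; this is not the Datacycle filtering hardware.

```python
# Minimal illustration of an ad-hoc fuzzy predicate evaluated by exhaustive
# scan: every record gets a membership grade in [0, 1] and rows above a
# threshold are returned, ranked by grade. Data and the "about 20" predicate
# are made up for the example.
def about(value: float, target: float, width: float) -> float:
    """Triangular membership function centred on `target`."""
    return max(0.0, 1.0 - abs(value - target) / width)

records = [{"id": i, "price": p} for i, p in enumerate([5, 14, 18, 20, 23, 31, 40])]

def fuzzy_select(rows, predicate, alpha=0.5):
    graded = [(predicate(r), r) for r in rows]            # exhaustive scan
    hits = [(g, r) for g, r in graded if g >= alpha]      # alpha-cut
    return sorted(hits, key=lambda gr: gr[0], reverse=True)

for grade, row in fuzzy_select(records, lambda r: about(r["price"], 20, 10)):
    print(f"{row['id']}: price={row['price']} membership={grade:.2f}")
```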

  7. Federated or cached searches: providing expected performance from multiple invasive species databases

    Science.gov (United States)

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-01-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches are being proposed to allow users to search "deep" web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods, and show that federated searches will not provide the performance and flexibility required by users and that a central cache of the data is required to improve performance.
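
    The trade-off can be seen in a toy timing comparison: a federated search pays a per-source round trip on every request, while a cached search hits one local index built ahead of time. The sources, species names, and latencies below are invented for illustration.

```python
# Toy timing comparison of the two integration styles discussed above:
# a federated search queries every remote source per request, while a
# cached search hits a single local index built ahead of time.
import time

SOURCES = {                                   # pretend remote databases
    "db_a": ["kudzu", "zebra mussel"],
    "db_b": ["cane toad", "kudzu"],
    "db_c": ["water hyacinth"],
}

def federated_search(term):
    hits = []
    for name, records in SOURCES.items():
        time.sleep(0.05)                      # per-source network round trip
        hits += [(name, r) for r in records if term in r]
    return hits

CACHE = [(name, r) for name, records in SOURCES.items() for r in records]

def cached_search(term):
    return [(name, r) for name, r in CACHE if term in r]  # local only

for fn in (federated_search, cached_search):
    start = time.perf_counter()
    result = fn("kudzu")
    print(fn.__name__, result, f"{(time.perf_counter() - start) * 1000:.0f} ms")
```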

  8. A performance study on the synchronisation of heterogeneous Grid databases using CONStanza

    CERN Document Server

    Pucciani, G; Domenici, Andrea; Stockinger, Heinz

    2010-01-01

    In Grid environments, several heterogeneous database management systems are used in various administrative domains. However, data exchange and synchronisation need to be available across different sites and different database systems. In this article we present our data consistency service CONStanza and give details on how we achieve relaxed update synchronisation between different database implementations. The integration in existing Grid environments is one of the major goals of the system. Performance tests have been executed following a factorial approach. Detailed experimental results and a statistical analysis are presented to evaluate the system components and drive future developments. (C) 2010 Elsevier B.V. All rights reserved.

  9. A database for human performance under simulated emergencies of nuclear power plants

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2005-01-01

    Reliable human performance is a prerequisite in securing the safety of complicated process systems such as nuclear power plants. However, the amount of available knowledge that can explain why operators deviate from an expected performance level is very limited because of the infrequency of real accidents. Therefore, in this study, a database that contains a set of useful information extracted from simulated emergencies was developed in order to provide important clues for understanding the change of operators' performance under stressful conditions (i.e., real accidents). The database was developed under the Microsoft Windows™ environment using Microsoft Access 97™ and Microsoft Visual Basic 6.0™. In the database, operators' performance data obtained from the analysis of over 100 audio-visual records for simulated emergencies were stored using twenty kinds of distinctive data fields. A total of ten kinds of operators' performance data are available from the developed database. Although it is still difficult to predict operators' performance under stressful conditions based on the results of simulated emergencies, simulation studies remain the most feasible way to scrutinize performance. Accordingly, it is expected that the performance data of this study will provide a concrete foundation for understanding the change of operators' performance in emergency situations.

  10. Oracle database 12c release 2 in-memory tips and techniques for maximum performance

    CERN Document Server

    Banerjee, Joyjeet

    2017-01-01

    This Oracle Press guide shows, step-by-step, how to optimize database performance and cut transaction processing time using Oracle Database 12c Release 2 In-Memory. Oracle Database 12c Release 2 In-Memory: Tips and Techniques for Maximum Performance features hands-on instructions, best practices, and expert tips from an Oracle enterprise architect. You will learn how to deploy the software, use In-Memory Advisor, build queries, and interoperate with Oracle RAC and Multitenant. A complete chapter of case studies illustrates real-world applications. • Configure Oracle Database 12c and construct In-Memory enabled databases • Edit and control In-Memory options from the graphical interface • Implement In-Memory with Oracle Real Application Clusters • Use the In-Memory Advisor to determine what objects to keep In-Memory • Optimize In-Memory queries using groups, expressions, and aggregations • Maximize performance using Oracle Exadata Database Machine and In-Memory option • Use Swingbench to create d...

  11. Interlocking Boards and Firm Performance: Evidence from a New Panel Database

    NARCIS (Netherlands)

    M.C. Non (Marielle); Ph.H.B.F. Franses (Philip Hans)

    2007-01-01

    An interlock between two firms occurs if the firms share one or more directors in their boards of directors. We explore the effect of interlocks on firm performance for 101 large Dutch firms using a large and new panel database. We use five different performance measures, and for each

  12. Capacity subscription and its market design impact

    International Nuclear Information System (INIS)

    Doorman, Gerard; Solem, Gerd

    2005-04-01

    Capacity Subscription (CS) implies that consumers buy (subscribe to) a certain amount of capacity. Their demand is limited to this capacity when the total power system is short of capacity and the System Operator activates controllable Load Limiting Devices (LLDs). The objective is to maintain system security by avoiding involuntary load shedding. The report describes a market design with CS. As a case study, an analysis is made of the changes in the market design of the Nordic system that would be necessary to implement CS. First the present Nordic market design is described. Focus is on the various market participants, their roles within various time horizons and their interactions. It is then described how CS works, why it works and what is necessary to make it work. Subsequently the necessary changes in the Nordic market structure are described. The major changes are the installation of the LLDs, the establishment of the necessary infrastructure to control the LLDs and the rules governing their control, and the establishment of a capacity market. The major rule is that the System Operator announces LLD activation when a shortage situation is expected. In the capacity market, generators offer available capacity during system peak conditions, while consumers bid their need for capacity. Market participants are the same as on the spot market, while small consumers buy through retailers. Generators are obliged to offer the capacity sold on the capacity market on the spot market during LLD activation. Failure to do so results in a penalty payment. The report further discusses issues like the need for verification procedures, import and export, generation pooling, the handling of small consumers, reserves and a possible implementation path for CS. With respect to transmission constraints it is argued that market splitting can be a viable option. It is concluded that CS can be a possible solution to maintain generation adequacy, but there are some serious challenges. The…

  13. Food advertising on children's popular subscription television channels in Australia.

    Science.gov (United States)

    Hebden, Lana; King, Lesley; Chau, Josephine; Kelly, Bridget

    2011-04-01

    Trends on Australian free-to-air television show children continue to be exposed to a disproportionate amount of unhealthy food advertising. This study describes the nature and extent of food marketing on the Australian subscription television channels most popular with children. Advertisements broadcast on the six subscription television channels most popular with children were recorded over four days in February 2009. Advertised foods were coded as core/healthy, non-core/unhealthy or miscellaneous/other, and for persuasive marketing techniques (promotional characters, premium offers and nutrition claims). The majority of foods advertised were non-core (72%), with a mean rate of 0.7 non-core food advertisements broadcast per hour, per channel. The frequency of non-core food advertisements differed significantly across channels. Persuasive techniques were used to advertise non-core foods less frequently than core and miscellaneous foods. Non-core foods make up the majority of foods advertised on children's popular subscription channels. However, Australian children currently view less non-core food advertising on subscription television compared with free-to-air. Unlike free-to-air television, subscription services have the unique opportunity to limit inappropriate food marketing to children, given they are less reliant on advertising revenue. © 2011 The Authors. ANZJPH © 2011 Public Health Association of Australia.

  14. Power Subscription Strategy: Administrator's Record of Decision.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration

    1998-12-01

    The Bonneville Power Administration (BPA) has decided to adopt a Power Subscription Strategy for entering into new power sales contracts with its Pacific Northwest customers. The Strategy equitably distributes the electric power generated by the Federal Columbia River Power System (FCRPS) within the framework of existing law. The Power Subscription Strategy addresses the availability of power; describes power products; lays out strategies for pricing, including risk management; and discusses contract elements. In proceeding with this Subscription Strategy, BPA is guided by and committed to the Fish and Wildlife Funding Principles for BPA announced by the Vice President of the US in September 1998. This Record of Decision (ROD) addresses the issues raised by commenters who responded to BPA's Power Subscription Strategy Proposal during and after the comment period that began with the release of the Proposal on September 18, 1998. The ROD is organized in approximately the same way as the Proposal and the Power Subscription Strategy that BPA developed based on the comments received. Abbreviations of party names used in citations appear in the section just preceding this introduction; a list of all the commenters follows the text of the ROD.

  15. Locational Prices in Capacity Subscription Market Considering Transmission Limitations

    Directory of Open Access Journals (Sweden)

    S. Babaeinejad Sarookolaee

    2013-06-01

    Full Text Available This study focuses on one of the most effective types of capacity market, the Capacity Subscription (CS) market, which is predicted to be widely used in upcoming smart grids. Despite the various studies of the mechanism and structure of capacity markets, their performance has rarely been tested in the presence of network constraints. Considering this gap, we propose a new method, named Local capacity Prices (LP), to determine capacity prices in the network while accounting for transmission line flow limitations. This method is new and has not been tried in other similar research. The philosophy of the proposed method is to determine capacity prices considering each consumer's share of total peak demand. The first advantage of LP is that consumers who benefit from the transmission facilities and are responsible for transmission congestion pay higher capacity prices than those whose electricity is supplied locally. The second advantage of LP is that consumers connected to the same bus do not have to pay the same capacity price, owing to their different shares of total peak demand. For further clarification, two other methods, named Branches Flow limit as a Global Limit (BFGL) and Locational Capacity Prices (LCP), are introduced and compared to the LP method in order to show its efficiency. The numerical results obtained from case studies show that the LP method follows a fairer market procedure, which results in more efficient capacity prices in comparison to the BFGL and LCP methods.

  16. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes a design for a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, nor is its data collection framework easy to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enable the total number of conductions to be estimated from the instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study on simulated experiments was performed and analyzed using graphical user interfaces developed in this study.

  17. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung Won Dea

    2016-01-01

    This paper describes a design for a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. An estimation method that obtains the total number of task conductions by direct counting is not easy to realize, nor is its data collection framework easy to maintain. To resolve this problem, this paper suggests an indirect method and a database structure using generic tasks that enable the total number of conductions to be estimated from the instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the errors made by operators in the power plant should be systematically collected and examined. The Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework to establish a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Estimating the number of task conductions from the operating procedures for HEP estimation was enabled through the generic task database and framework. To verify the applicability of the framework, a case study on simulated experiments was performed and analyzed using graphical user interfaces developed in this study.
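
    A rough sketch of the indirect-counting idea follows, assuming a simplified layout in which procedure steps are mapped to generic tasks so that conduction counts can be derived from procedure execution counts; the table and column names are illustrative assumptions, not the actual KAERI schema.

```python
# Rough sqlite3 sketch of the indirect-counting idea: procedure steps are
# mapped to generic tasks, so the number of task conductions can be derived
# from how often each procedure was executed. Table and column names are
# assumptions for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE generic_task (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE procedure_step (
    procedure_name TEXT, step_no INTEGER, generic_task_id INTEGER,
    FOREIGN KEY (generic_task_id) REFERENCES generic_task(id));
CREATE TABLE procedure_execution (procedure_name TEXT, executions INTEGER);
""")
conn.executemany("INSERT INTO generic_task VALUES (?, ?)",
                 [(1, "verify alarm"), (2, "manipulate valve")])
conn.executemany("INSERT INTO procedure_step VALUES (?, ?, ?)",
                 [("EOP-01", 1, 1), ("EOP-01", 2, 2), ("EOP-02", 1, 1)])
conn.executemany("INSERT INTO procedure_execution VALUES (?, ?)",
                 [("EOP-01", 40), ("EOP-02", 25)])

# Estimated conductions per generic task = sum over procedures of
# (steps mapped to the task) x (times the procedure was executed).
for name, total in conn.execute("""
    SELECT gt.name, SUM(pe.executions)
    FROM generic_task gt
    JOIN procedure_step ps ON ps.generic_task_id = gt.id
    JOIN procedure_execution pe ON pe.procedure_name = ps.procedure_name
    GROUP BY gt.name"""):
    print(name, total)
```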

  18. Health sciences libraries' subscriptions to journals: expectations of general practice departments and collection-based analysis.

    Science.gov (United States)

    Barreau, David; Bouton, Céline; Renard, Vincent; Fournier, Jean-Pascal

    2018-04-01

    The aims of this study were to (i) assess the expectations of general practice departments regarding health sciences libraries' subscriptions to journals and (ii) describe the current general practice journal collections of health sciences libraries. A cross-sectional survey was distributed electronically to the thirty-five university general practice departments in France. General practice departments were asked to list ten journals to which they expected access via the subscriptions of their health sciences libraries. A ranked reference list of journals was then developed. Access to these journals was assessed through a survey sent to all health sciences libraries in France. Adequacy ratios (access/need) were calculated for each journal. All general practice departments completed the survey. The total reference list included 44 journals. This list was heterogeneous in terms of indexation/impact factor, language of publication, and scope (e.g., patient care, research, or medical education). Among the first 10 journals listed, La Revue Prescrire (96.6%), La Revue du Praticien-Médecine Générale (90.9%), the British Medical Journal (85.0%), Pédagogie Médicale (70.0%), Exercer (69.7%), and the Cochrane Database of Systematic Reviews (62.5%) had the highest adequacy ratios, whereas Family Practice (4.2%), the British Journal of General Practice (16.7%), Médecine (29.4%), and the European Journal of General Practice (33.3%) had the lowest adequacy ratios. General practice departments have heterogeneous expectations in terms of health sciences libraries' subscriptions to journals. It is important for librarians to understand the heterogeneity of these expectations, as well as local priorities, so that journal access meets users' needs.
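
    A small worked example of the adequacy ratio (access/need) follows; the exact denominator used in the study is not spelled out above, so it is taken here as the number of libraries whose general practice department expects access, and all counts are invented.

```python
# Illustrative computation of an adequacy ratio (access/need) per journal.
# The study's exact definition of "need" is an assumption here, and the
# counts below are invented rather than taken from the survey.
journals = {
    # journal: (libraries providing access, libraries where access is expected)
    "La Revue Prescrire": (28, 29),
    "British Medical Journal": (17, 20),
    "Family Practice": (1, 24),
}

for title, (access, need) in sorted(journals.items(),
                                    key=lambda kv: kv[1][0] / kv[1][1],
                                    reverse=True):
    print(f"{title}: adequacy {access / need:.1%}")
```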

  19. Comparison of Cloud vs. Tape Backup Performance and Costs with Oracle Database

    OpenAIRE

    Zrnec, Aljaž; Lavbič, Dejan

    2011-01-01

    Current practice of backing up data is based on using backup tapes and remote locations for storing data. Nowadays, with the advent of cloud computing a new concept of database backup emerges. The paper presents the possibility of making backup copies of data in the cloud. We are mainly focused on performance and economic issues of making backups in the cloud in comparison to traditional backups. We tested the performance and overall costs of making backup copies of data in Ora...

  20. Compilation and comparison of radionuclide sorption databases used in recent performance assessments

    International Nuclear Information System (INIS)

    McKinley, I.G.; Scholtis, A.

    1992-01-01

    The aim of this paper is to review the radionuclide sorption databases which have been used in performance assessments published within the last decade. It was hoped that such a review would allow areas of consistency to be identified, possibly indicating nuclide/rock/water systems which are now well characterised. Inconsistencies, on the other hand, might indicate areas in which further work is required. This study followed on from a prior review of the various databases which had been used in Swiss performance assessments. The latter was, however, considerably simplified by the fact that the authors had been heavily involved in sorption database definition for these assessments. The first phase of the current study was based entirely on available literature and it was quickly evident that the analyses would be much more complex (and time consuming) than initially envisaged. While some assessments clearly list all sorption data used, others depend on secondary literature (which may or may not be clearly referenced) or present sorption data which have been transmogrified into another form (e.g., into a retardation factor; cf. the following section). This study focused on databases used (or intended for use) in performance assessments published within the last 10 years or so. 45 refs., 12 tabs., 1 fig

  1. The shortest path algorithm performance comparison in graph and relational database on a transportation network

    Directory of Open Access Journals (Sweden)

    Mario Miler

    2014-02-01

    Full Text Available In the field of geoinformation and transportation science, the shortest path is calculated on graph data mostly found in road and transportation networks. This data is often stored in various database systems. Many applications dealing with transportation networks require calculation of the shortest path. The objective of this research is to compare the performance of Dijkstra shortest path calculation in PostgreSQL (with pgRouting) and the Neo4j graph database, for the purpose of determining whether there is any difference in the speed of the calculation. Benchmarking was done on commodity hardware using the OpenStreetMap road network. The first assumption is that the Neo4j graph database would be well suited for shortest path calculation on transportation networks, but this does not come without some cost. Memory proved to be an issue in the Neo4j setup when dealing with larger transportation networks.
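
    For reference, the computation both systems perform is ordinary Dijkstra over a weighted graph; a heap-based version on a toy in-memory adjacency list is sketched below (in pgRouting the equivalent is typically a single pgr_dijkstra SQL call). The graph and edge costs stand in for the OpenStreetMap network.

```python
# Reference Dijkstra over an in-memory adjacency list (heapq-based), i.e. the
# computation that both pgRouting and Neo4j perform on the road network. The
# toy graph replaces the OpenStreetMap data; edge costs are arbitrary.
import heapq

def dijkstra(graph, source, target):
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, float("inf")):
            continue                       # stale queue entry
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(heap, (nd, neighbour))
    path, node = [], target
    while node in prev:
        path.append(node)
        node = prev[node]
    return dist.get(target, float("inf")), [source] + path[::-1]

graph = {"A": [("B", 2.0), ("C", 5.0)], "B": [("C", 1.0), ("D", 4.0)],
         "C": [("D", 1.0)], "D": []}
print(dijkstra(graph, "A", "D"))           # (4.0, ['A', 'B', 'C', 'D'])
```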

  2. Extending the Mertonian Norms: Scientists' Subscription to Norms of Research

    Science.gov (United States)

    Anderson, Melissa S.; Ronning, Emily A.; De Vries, Raymond; Martinson, Brian C.

    2010-01-01

    This analysis, based on focus groups and a national survey, assesses scientists' subscription to the Mertonian norms of science and associated counternorms. It also supports extension of these norms to governance (as opposed to administration), as a norm of decision-making, and quality (as opposed to quantity), as an evaluative norm. (Contains 1…

  3. price list 2015.pdf | Subscription | Journals | Resources | public ...

    Indian Academy of Sciences (India)

  4. 29 CFR 1905.7 - Form of documents; subscription; copies.

    Science.gov (United States)

    2010-07-01

    29 CFR 1905.7, Labor, Occupational Safety and Health Administration, rules of practice under the Williams-Steiger Occupational Safety and Health Act of 1970, General: Form of documents; subscription; copies...

  5. GRID[subscript C] Renewable Energy Data Streaming into Classrooms

    Science.gov (United States)

    DeLuca, V. William; Carpenter, Pam; Lari, Nasim

    2010-01-01

    For years, researchers have shown the value of using real-world data to enhance instruction in mathematics, science, and social studies. In an effort to help develop students' higher-order thinking skills in a data-rich learning environment, Green Research for Incorporating Data in the Classroom (GRID[subscript C]), a National Science…

  6. 47 CFR 73.642 - Subscription TV service.

    Science.gov (United States)

    2010-10-01

    ... expressed or implied, that: (1) Prevents or hinders it from rejecting or refusing any subscription TV..., service may be terminated. (ii) Charges, terms and conditions of service to subscribers must be applied... impositions of different sets of terms and conditions may be applied to subscribers in different...

  7. How Einstein Discovered "E[subscript 0] = mc[squared]"

    Science.gov (United States)

    Hecht, Eugene

    2012-01-01

    This paper traces Einstein's discovery of "the equivalence of mass [m] and energy ["E[subscript 0]"]." He came to that splendid insight in 1905 while employed by the Bern Patent Office, at which time he was not an especially ardent reader of physics journals. How then did the young savant, working outside of academia in semi-isolation, realize…

  8. Effect of Uncertainties in CO2 Property Databases on the S-CO2 Compressor Performance

    International Nuclear Information System (INIS)

    Lee, Je Kyoung; Lee, Jeong Ik; Ahn, Yoonhan; Kim, Seong Gu; Cha, Je Eun

    2013-01-01

    Various S-CO2 Brayton cycle experiment facilities are under construction or in operation for demonstration of the technology. During data analysis, S-CO2 property databases are widely used to predict the performance and characteristics of the S-CO2 Brayton cycle. Thus, a reliable property database is very important before any experimental data analysis or calculation. In this paper, the deviation between two different property databases that are widely used for data analysis is identified using three selected properties for comparison: Cp, density and enthalpy. Furthermore, the effect of this deviation on the analysis of test data is briefly discussed. Because of this deviation, results of the test data analysis can contain critical errors. As S-CO2 Brayton cycle researchers know, the thermodynamic properties of CO2 change dramatically near the critical point. Thus, a potential source of error in property prediction exists in CO2 properties near the critical point. During experimental data analysis with an S-CO2 Brayton cycle experiment facility, thermodynamic properties are always involved in predicting the component performance and characteristics. Thus, a precise CO2 property database should be constructed or defined to develop Korean S-CO2 Brayton cycle technology.
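
    The comparison itself reduces to tabulating a property from each database at the same states and reporting the relative deviation, as in the generic sketch below; the temperatures and densities are placeholders, not values from the databases actually compared.

```python
# Generic sketch of the comparison described above: tabulate one property
# (here density, kg/m^3) from two property databases at the same states and
# report the relative deviation. All numbers are placeholders.
density_db_a = {305.0: 620.0, 310.0: 450.0, 320.0: 280.0}   # T [K] -> rho
density_db_b = {305.0: 600.0, 310.0: 470.0, 320.0: 283.0}

for temperature in sorted(density_db_a):
    a, b = density_db_a[temperature], density_db_b[temperature]
    deviation = (a - b) / b * 100.0
    print(f"T = {temperature:.0f} K: {a:.0f} vs {b:.0f} kg/m^3 "
          f"({deviation:+.1f} % relative deviation)")
```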

  9. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Science.gov (United States)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Based on a key-value database, the ND technique makes complete use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that are absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
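
    A toy version of the complement-set idea is sketched below: when the full expected key space is known, only the keys of missing records are stored and the present records are derived as the complement. The key layout and counts are invented and do not reflect MUSER's actual implementation.

```python
# Toy version of the complement-set ("negative database") idea: when the full
# expected key space is known (say, one record per second of observation),
# only the keys of *missing* records are stored, and the present records are
# derived as the complement. Key layout and sizes are invented.
expected_keys = {f"frame-{t:05d}" for t in range(86_400)}   # one day, 1 Hz

# Suppose almost everything was observed; only these frames are absent.
negative_db = {"frame-00042", "frame-10000", "frame-77777"}

def exists(key: str) -> bool:
    """A record exists iff its key is in the expected space but not in the ND."""
    return key in expected_keys and key not in negative_db

present = expected_keys - negative_db        # derive the positive set on demand
print(len(negative_db), "keys stored instead of", len(present))
print(exists("frame-00042"), exists("frame-00043"))
```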

  10. Reliability database development and plant performance improvement effort at Korea Hydro and Nuclear Power Co

    International Nuclear Information System (INIS)

    Oh, S. J.; Hwang, S. W.; Na, J. H.; Lim, H. S.

    2008-01-01

    Nuclear utilities in recent years have focused on improved plant performance and equipment reliability. In the U.S., there is a movement toward process integration. Examples are the INPO AP-913 equipment reliability program and the standard nuclear performance model developed by NEI. The synergistic effect from an integrated approach can be far greater than the individual effects of each program. In Korea, PSA for all Korean NPPs (Nuclear Power Plants) has been completed. Plant performance monitoring and improvement is an important goal for KHNP (Korea Hydro and Nuclear Power Company), and a risk monitoring system called RIMS has been developed for all nuclear plants. KHNP is in the process of voluntarily implementing a maintenance rule program similar to that in the U.S. In the future, KHNP would like to expand the effort to an equipment reliability program and to achieve the highest equipment reliability and improved plant performance. For improving equipment reliability, the current trend is moving from corrective maintenance toward preventive/predictive maintenance. With the emphasis on preventive maintenance, the failure cause and the operation history and environment are important. Hence, the development of an accurate reliability database is necessary. Furthermore, the database should be updated regularly and maintained as a living program to reflect the current status of equipment reliability. This paper examines the development of a reliability database system and its application to maintenance optimization and Risk Informed Applications (RIA). (authors)

  11. Vibrational Spectroscopy of the CCl[subscript 4] v[subscript 1] Mode: Theoretical Prediction of Isotopic Effects

    Science.gov (United States)

    Gaynor, James D.; Wetterer, Anna M.; Cochran, Rea M.; Valente, Edward J.; Mayer, Steven G.

    2015-01-01

    Raman spectroscopy is a powerful experimental technique, yet it is often missing from the undergraduate physical chemistry laboratory curriculum. Tetrachloromethane (CCl[subscript 4]) is the ideal molecule for an introductory vibrational spectroscopy experiment and the symmetric stretch vibration contains fine structure due to isotopic variations…

  12. Performance of popular open source databases for HEP related computing problems

    International Nuclear Information System (INIS)

    Kovalskyi, D; Sfiligoi, I; Wuerthwein, F; Yagil, A

    2014-01-01

    Databases are used in many software components of HEP computing, from monitoring and job scheduling to data storage and processing. It is not always clear at the beginning of a project if a problem can be handled by a single server, or if one needs to plan for a multi-server solution. Before a scalable solution is adopted, it helps to know how well it performs in a single server case to avoid situations when a multi-server solution is adopted mostly due to sub-optimal performance per node. This paper presents comparison benchmarks of popular open source database management systems. As a test application we use a user job monitoring system based on the Glidein workflow management system used in the CMS Collaboration.
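
    A minimal single-server benchmark harness of the kind such comparisons build on is sketched below, with sqlite3 standing in for the engines actually tested and a simplified job-monitoring table invented for the example.

```python
# Minimal single-server benchmark harness: time bulk inserts and a status
# query against one engine. sqlite3 stands in for the servers actually
# benchmarked; the job-monitoring schema is a simplified invention.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, user TEXT, status TEXT)")

rows = [(i, f"user{i % 50}", "running" if i % 3 else "done")
        for i in range(100_000)]

start = time.perf_counter()
with conn:                                            # one transaction
    conn.executemany("INSERT INTO jobs VALUES (?, ?, ?)", rows)
insert_rate = len(rows) / (time.perf_counter() - start)

start = time.perf_counter()
for _ in range(100):
    conn.execute("SELECT user, COUNT(*) FROM jobs WHERE status = ? GROUP BY user",
                 ("running",)).fetchall()
query_time = (time.perf_counter() - start) / 100

print(f"{insert_rate:,.0f} inserts/s, {query_time * 1000:.1f} ms per status query")
```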

  13. Applicability of thermodynamic database of radioactive elements developed for the Japanese performance assessment of HLW repository

    International Nuclear Information System (INIS)

    Yui, Mikazu; Shibata, Masahiro; Rai, Dhanpat; Ochs, Michael

    2003-01-01

    In 1999 the Japan Nuclear Cycle Development Institute (JNC) published a second progress report (also known as the H12 report) on high-level radioactive waste (HLW) disposal in Japan (JNC 1999). This report helped to develop confidence in the selected HLW disposal system and to establish the implementation body in 2000 for the disposal of HLW. JNC developed an in-house thermodynamic database for radioactive elements for performance analysis of the engineered barrier system (EBS) and the geosphere for the H12 report. This paper briefly presents the status of JNC's thermodynamic database and its applicability to performing realistic analyses of the solubilities of radioactive elements, the evolution of solubility-limiting solid phases, predictions of the redox state of Pu in the neutral pH range under reducing conditions, and estimates of the solubilities of radioactive elements in cementitious conditions. (author)

  14. Database on Performance of Neutron Irradiated FeCrAl Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Field, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Briggs, Samuel A. [Univ. of Wisconsin, Madison, WI (United States); Littrell, Ken [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Parish, Chad M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Yamamoto, Yukinori [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-08-01

    The present report summarizes and discusses the database on radiation tolerance for Generation I, Generation II, and commercial FeCrAl alloys. This database has been built upon mechanical testing and microstructural characterization on selected alloys irradiated within the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory (ORNL) up to doses of 13.8 dpa at temperatures ranging from 200°C to 550°C. The structure and performance of these irradiated alloys were characterized using advanced microstructural characterization techniques and mechanical testing. The primary objective of developing this database is to enhance the rapid development of a mechanistic understanding on the radiation tolerance of FeCrAl alloys, thereby enabling informed decisions on the optimization of composition and microstructure of FeCrAl alloys for application as an accident tolerant fuel (ATF) cladding. This report is structured to provide a brief summary of critical results related to the database on radiation tolerance of FeCrAl alloys.

  15. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    Directory of Open Access Journals (Sweden)

    Fusheng Wang

    2013-01-01

    Full Text Available Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The

  16. A high-performance spatial database based approach for pathology imaging algorithm evaluation.

    Science.gov (United States)

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A D; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J; Saltz, Joel H

    2013-01-01

    Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and
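
    As a toy stand-in for the validation scenario (algorithm results versus human annotations), the sketch below scores the overlap of two regions with an intersection-over-union measure, using axis-aligned rectangles and invented coordinates in place of the platform's polygon-level spatial queries.

```python
# Toy stand-in for the "algorithm validation" scenario: compare a segmented
# region against a human annotation with an intersection-over-union score.
# Axis-aligned rectangles replace the real region polygons, and the
# coordinates are invented.
def rect_area(r):
    (x1, y1, x2, y2) = r
    return max(0.0, x2 - x1) * max(0.0, y2 - y1)

def iou(a, b):
    inter = (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))
    inter_area = rect_area(inter)
    union_area = rect_area(a) + rect_area(b) - inter_area
    return inter_area / union_area if union_area else 0.0

algorithm_result = (10.0, 10.0, 60.0, 50.0)   # x_min, y_min, x_max, y_max
human_annotation = (15.0, 12.0, 65.0, 52.0)

print(f"IoU = {iou(algorithm_result, human_annotation):.2f}")
```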

  17. Improved Syntheses and Expanded Analyses of the Enantiomerically Enriched Chiral Cobalt Complexes Co(en)3I3 and Co(diNOsar)Br3

    Science.gov (United States)

    McClellan, Michael J.; Cass, Marion E.

    2015-01-01

    This communication is a collection of additions and modifications to two previously published classic inorganic synthesis laboratory experiments. The experimental protocol for the synthesis and isolation of enantiomerically enriched Δ- (or Λ-)Co(en)3I3 has been modified to increase reproducibility, yield, and enantiomeric…

  18. Assessing U.S. ESCO industry performance and market trends: Results from the NAESCO database project

    International Nuclear Information System (INIS)

    Osborn, Julie; Goldman, Chuck; Hopper, Nicole; Singer, Terry

    2002-01-01

    The U.S. Energy Services Company (ESCO) industry is often cited as the most successful model for the private sector delivery of energy-efficiency services. This study documents the actual performance of the ESCO industry in order to provide policymakers and investors with objective information, and customers with a resource for benchmarking proposed projects relative to industry performance. We have assembled a database of nearly 1500 case studies of energy-efficiency projects - the most comprehensive data set of the U.S. ESCO industry available. These projects include $2.55B of work completed by 51 ESCOs and span much of the history of this industry

  19. Operation and Performance of the ATLAS Muon Spectrometer Databases during 2011-12 Data Taking

    CERN Document Server

    Verducci, Monica

    2014-01-01

    The size and complexity of the ATLAS experiment at the Large Hadron Collider, including its Muon Spectrometer, raise unprecedented challenges in terms of operation, software model and data management. One of the challenging tasks is the storage of non-event data produced by the calibration and alignment stream processes and by online and offline monitoring frameworks, which can unveil problems in the detector hardware and in the data processing chain. During 2011 and 2012 data taking, the software model and data processing enabled high quality track resolution as a better understanding of the detector performance was developed using the most reliable detector simulation and reconstruction. This work summarises the various aspects of the Muon Spectrometer Databases, with particular emphasis given to the Conditions Databases and their usage in the data analysis.

  20. Liver biopsy performance and histological findings among patients with chronic viral hepatitis: a Danish database study

    DEFF Research Database (Denmark)

    Christensen, Peer Brehm; Krarup, Henrik Bygum; Møller, Axel

    2007-01-01

    We investigated the variance of liver biopsy frequency and histological findings among patients with chronic viral hepatitis attending 10 medical centres in Denmark. Patients who tested positive for HBsAg or HCV-RNA were retrieved from a national clinical database (DANHEP), and demographic data, laboratory analyses and liver biopsy results were collected. A total of 1586 patients were identified, of whom 69.7% had hepatitis C, 28.9% hepatitis B, and 1.5% were coinfected. In total, 771 (48.6%) had a biopsy performed (range 33.3-78.7%). According to the Metavir classification, 29.3% had septal fibrosis, and cirrhosis had developed in 23% after 20 y of infection. Age above 40 y was a better predictor of cirrhosis than elevated ALT. National database comparison may identify factors of importance for improved management of patients with chronic viral hepatitis.

  1. The performance of disk arrays in shared-memory database machines

    Science.gov (United States)

    Katz, Randy H.; Hong, Wei

    1993-01-01

    In this paper, we examine how disk arrays and shared memory multiprocessors lead to an effective method for constructing database machines for general-purpose complex query processing. We show that disk arrays can lead to cost-effective storage systems if they are configured from suitably small form-factor disk drives. We introduce the storage system metric data temperature as a way to evaluate how well a disk configuration can sustain its workload, and we show that disk arrays can sustain the same data temperature as a more expensive mirrored-disk configuration. We use the metric to evaluate the performance of disk arrays in XPRS, an operational shared-memory multiprocessor database system being developed at the University of California, Berkeley.
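
    As a rough illustration of the data temperature metric mentioned above (read here as I/O operations per second sustained per gigabyte of stored data), the sketch below compares two hypothetical configurations. The simple-ratio definition and all numbers are assumptions for illustration, not figures from the paper.

        def data_temperature(io_per_second: float, capacity_gb: float) -> float:
            """Accesses per second per gigabyte of stored data (assumed reading of the metric)."""
            return io_per_second / capacity_gb

        # Hypothetical configurations of equal capacity.
        array_temp = data_temperature(io_per_second=2400.0, capacity_gb=96.0)   # eight small-form-factor drives
        mirror_temp = data_temperature(io_per_second=1200.0, capacity_gb=96.0)  # one mirrored pair of large drives
        print(f"disk array: {array_temp:.1f} IO/s per GB, mirrored pair: {mirror_temp:.1f} IO/s per GB")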

  2. Does SDDS Subscription Reduce Borrowing Costs for Emerging Market Economies?

    OpenAIRE

    John Cady

    2005-01-01

    Does macroeconomic data transparency-as signaled by subscription to the IMF's Special Data Dissemination Standard (SDDS)-help reduce borrowing costs in international capital markets? This question is examined using data on new issues of sovereign foreign-currency-denominated (U.S. dollar, yen, and euro) bonds for several emerging market economies. Panel econometric estimates indicate that spreads on new bond issues declined on average by close to 20 percent, or by an average of about 55 basis...

  3. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.

    2016-05-01

    Hybrid journals contain articles behind a paywall, available by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution can end up paying twice, and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as “offset pricing” [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.], which would reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure, in which the scholarly community can feel the fairness of publishers' pricing mechanisms. Though most of the offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST, which subscribe to large e-journal collections, is important in making the right decisions on saving institutional costs and supporting openness in scholarly communications.

  4. Open Access, Library Subscriptions, and Article Processing Charges

    KAUST Repository

    Vijayakumar, J.K.; Tamarkin, Molly

    2016-01-01

    Hybrid journals contain articles behind a paywall, available by subscription, as well as papers made open access when the author pays an article processing charge (APC). In such cases, an institution can end up paying twice, and publishers tend to double-dip. Discussions and pilot models are emerging on pricing options, such as “offset pricing” [where APCs are adjusted or discounted against subscription costs as vouchers or reductions in next year's subscriptions, APCs beyond the subscription costs are modestly capped, etc.], which would reduce institutions' costs. This presentation will explain the different models available and how we can attain a transparent costing structure, in which the scholarly community can feel the fairness of publishers' pricing mechanisms. Though most of the offset systems are developed through national-level or consortium-level negotiations, the experience of individual institutions like KAUST, which subscribe to large e-journal collections, is important in making the right decisions on saving institutional costs and supporting openness in scholarly communications.

  5. Relational database hybrid model, of high performance and storage capacity for nuclear engineering applications

    International Nuclear Information System (INIS)

    Gomes Neto, Jose

    2008-01-01

    The objective of this work is to present the relational database named FALCAO. It was created and implemented to support the storage of the monitored variables in the IEA-R1 research reactor, located at the Instituto de Pesquisas Energeticas e Nucleares, IPEN/CNEN-SP. The data logical model and its direct influence on the integrity of the provided information are carefully considered. The concepts and steps of normalization and denormalization, including the entities and relations involved in the logical model, are presented, together with the effects of the model rules on the acquisition, loading and availability of the final information, from a performance standpoint, since the acquisition process loads and provides large amounts of information at short intervals of time. The SACD application, through its functionalities, presents the information stored in the FALCAO database in a practical and optimized form. The implementation of the FALCAO database was successful, and its existence leads to a considerably favorable situation. It is now essential to the routine of the researchers involved, not only due to the substantial improvement of the process but also to the reliability associated with it. (author)

  6. Accelerating the energy retrofit of commercial buildings using a database of energy efficiency performance

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Hong, Tianzhen; Piette, Mary Ann; Sawaya, Geof; Chen, Yixing; Taylor-Lange, Sarah C.

    2015-01-01

    Small and medium-sized commercial buildings can be retrofitted to significantly reduce their energy use; however, this is a huge challenge, as owners usually lack the expertise and resources to conduct the detailed on-site energy audits needed to identify and evaluate cost-effective energy technologies. This study presents DEEP (a database of energy efficiency performance), which provides a direct resource for quick retrofit analysis of commercial buildings. DEEP, compiled from the results of about ten million EnergyPlus simulations, enables easy screening of ECMs (energy conservation measures) and retrofit analysis. The simulations utilize prototype models representative of small and mid-size offices and retail buildings in California climates. In the formulation of DEEP, large-scale EnergyPlus simulations were conducted on high performance computing clusters to evaluate hundreds of individual and packaged ECMs covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and service hot water. The architecture and simulation environment used to create DEEP are flexible and can expand to cover additional building types, additional climates, and new ECMs. In this study DEEP is integrated into a web-based retrofit toolkit, the Commercial Building Energy Saver, which provides a platform for energy retrofit decision making by querying DEEP and unearthing recommended ECMs, their estimated energy savings and financial payback. - Highlights: • DEEP (database of energy efficiency performance) supports building retrofit. • DEEP is an SQL database with pre-simulated results from 10 million EnergyPlus runs. • DEEP covers 7 building types, 6 vintages, 16 climates, and 100 energy measures. • DEEP accelerates retrofit of small commercial buildings to save energy use and cost. • DEEP can be expanded and integrated with third-party energy software tools.
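
    The kind of query-driven screening described above can be sketched with a toy table; the schema, column names and numbers below are hypothetical and do not reflect the published DEEP structure.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE ecm_results (
                           building_type TEXT, climate_zone TEXT, ecm_name TEXT,
                           site_energy_savings_pct REAL, simple_payback_years REAL)""")
        con.executemany("INSERT INTO ecm_results VALUES (?, ?, ?, ?, ?)", [
            ("small_office", "CZ03", "LED lighting", 12.4, 3.1),
            ("small_office", "CZ03", "Roof insulation", 4.2, 9.8),
            ("small_office", "CZ03", "Economizer", 7.9, 2.4),
        ])

        # Screen ECMs for one building type and climate zone, best payback first.
        rows = con.execute("""SELECT ecm_name, site_energy_savings_pct, simple_payback_years
                              FROM ecm_results
                              WHERE building_type = ? AND climate_zone = ?
                              ORDER BY simple_payback_years""",
                           ("small_office", "CZ03")).fetchall()
        for name, savings, payback in rows:
            print(f"{name}: {savings:.1f}% savings, {payback:.1f} y payback")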

  7. Regionally Selective Requirement for D1/D5 Dopaminergic Neurotransmission in the Medial Prefrontal Cortex in Object-in-Place Associative Recognition Memory

    Science.gov (United States)

    Savalli, Giorgia; Bashir, Zafar I.; Warburton, E. Clea

    2015-01-01

    Object-in-place (OiP) memory is critical for remembering the location in which an object was last encountered and depends conjointly on the medial prefrontal cortex, perirhinal cortex, and hippocampus. Here we examined the role of dopamine D1/D5 receptor neurotransmission within these brain regions for OiP memory. Bilateral…

  8. Ka and Kb from pH and Conductivity Measurements: A General Chemistry Laboratory Exercise

    Science.gov (United States)

    Nyasulu, Frazier; Moehring, Michael; Arthasery, Phyllis; Barlag, Rebecca

    2011-01-01

    The acid ionization constant, Ka, of acetic acid and the base ionization constant, Kb, of ammonia are determined easily and rapidly using a datalogger, a pH sensor, and a conductivity sensor. To decrease sample preparation time and to minimize waste, sequential aliquots of a concentrated standard are added to a known volume…

  9. Synthesis of "trans"-4,5-Bis-dibenzylaminocyclopent-2-Enone from Furfural Catalyzed by ErCl[subscript 3]·6H[subscript 2]O

    Science.gov (United States)

    Estevão, Mónica S.; Martins, Ricardo J. V.; Alfonso, Carlos A. M.

    2017-01-01

    An experiment exploring the chemistry of the carbonyl group for the one-step synthesis of "trans"-4,5-dibenzylaminocyclopent-2-enone is described. The reaction of furfural and dibenzylamine in the environmentally friendly solvent ethanol and catalyzed by the Lewis acid ErCl3·6H2O afforded the product in high…

  10. Performance of an open-source heart sound segmentation algorithm on eight independent databases.

    Science.gov (United States)

    Liu, Chengyu; Springer, David; Clifford, Gari D

    2017-08-01

    Heart sound segmentation is a prerequisite step for the automatic analysis of heart sound signals, facilitating the subsequent identification and classification of pathological events. Recently, hidden Markov model-based algorithms have received increased interest due to their robustness in processing noisy recordings. In this study we aim to evaluate the performance of the recently published logistic regression based hidden semi-Markov model (HSMM) heart sound segmentation method, by using a wider variety of independently acquired data of varying quality. Firstly, we constructed a systematic evaluation scheme based on a new collection of heart sound databases, which we assembled for the PhysioNet/CinC Challenge 2016. This collection includes a total of more than 120 000 s of heart sounds recorded from 1297 subjects (including both healthy subjects and cardiovascular patients) and comprises eight independent heart sound databases sourced from multiple independent research groups around the world. Then, the HSMM-based segmentation method was evaluated using the assembled eight databases. The common evaluation metrics of sensitivity, specificity, accuracy, as well as the F1 measure were used. In addition, the effect of varying the tolerance window for determining a correct segmentation was evaluated. The results confirm the high accuracy of the HSMM-based algorithm on a separate test dataset comprised of 102 306 heart sounds. Average F1 scores of 98.5% for segmenting S1 and systole intervals and 97.2% for segmenting S2 and diastole intervals were observed. The F1 score was shown to increase with an increase in the tolerance window size, as expected. The high segmentation accuracy of the HSMM-based algorithm on a large database confirmed the algorithm's effectiveness. The described evaluation framework, combined with the largest collection of open access heart sound data, provides essential resources for
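
    To make the role of the tolerance window concrete, the sketch below scores hypothetical detected S1 onsets against reference onsets: a detection counts as a true positive if it falls within the window of an unmatched reference onset, and F1 is computed from the resulting sensitivity and positive predictive value. This is a simplified stand-in for the paper's evaluation scheme, with illustrative numbers.

        def f1_with_tolerance(reference_ms, detected_ms, tolerance_ms=100):
            """Greedy one-to-one matching of detections to reference onsets within a tolerance window."""
            unmatched = list(detected_ms)
            true_pos = 0
            for ref in reference_ms:
                hit = next((d for d in unmatched if abs(d - ref) <= tolerance_ms), None)
                if hit is not None:
                    unmatched.remove(hit)
                    true_pos += 1
            sensitivity = true_pos / len(reference_ms) if reference_ms else 0.0
            ppv = true_pos / len(detected_ms) if detected_ms else 0.0
            return 2 * sensitivity * ppv / (sensitivity + ppv) if (sensitivity + ppv) else 0.0

        reference = [310, 1120, 1930, 2745]        # hypothetical S1 onsets (ms)
        detected = [295, 1150, 2020, 2750, 3600]   # hypothetical detections (ms)
        print(f"F1 = {f1_with_tolerance(reference, detected):.3f}")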

  11. Treatment performances of French constructed wetlands: results from a database collected over the last 30 years.

    Science.gov (United States)

    Morvannou, A; Forquet, N; Michel, S; Troesch, S; Molle, P

    2015-01-01

    Approximately 3,500 constructed wetlands (CWs) provide raw wastewater treatment for small communities in France. Built during the past 30 years, most consist of two vertical flow constructed wetlands (VFCWs) in series (stages). Many configurations exist, with systems associated with horizontal flow filters or waste stabilization ponds, vertical flow with recirculation, partially saturated systems, etc. A database analysis carried out 10 years earlier on the classical French system summarized the global performance data. This paper provides a similar analysis of performance data from 415 full-scale two-stage VFCWs, drawn from an improved database expanded with monitoring data available from Irstea and the French technical department. Trends presented in the first study are confirmed, exhibiting high chemical oxygen demand (COD), total suspended solids (TSS) and total Kjeldahl nitrogen (TKN) removal rates (87%, 93% and 84%, respectively). Typical concentrations at the second-stage outlet are 74 mgCOD L(-1), 17 mgTSS L(-1) and 11 mgTKN L(-1). Pollutant removal performance is summarized in relation to the loads applied at the first treatment stage. While COD and TSS removal rates remain stable over the range of applied loads, the spread of TKN removal rates increases as applied loads increase.

  12. Performance of Point and Range Queries for In-memory Databases using Radix Trees on GPUs

    Energy Technology Data Exchange (ETDEWEB)

    Alam, Maksudul [ORNL; Yoginath, Srikanth B [ORNL; Perumalla, Kalyan S [ORNL

    2016-01-01

    In in-memory database systems augmented by hardware accelerators, accelerating the index searching operations can greatly increase the runtime performance of database queries. Recently, adaptive radix trees (ART) have been shown to provide very fast index search implementation on the CPU. Here, we focus on an accelerator-based implementation of ART. We present a detailed performance study of our GPU-based adaptive radix tree (GRT) implementation over a variety of key distributions, synthetic benchmarks, and actual keys from music and book data sets. The performance is also compared with other index-searching schemes on the GPU. GRT on modern GPUs achieves some of the highest rates of index searches reported in the literature. For point queries, a throughput of up to 106 million and 130 million lookups per second is achieved for sparse and dense keys, respectively. For range queries, GRT yields 600 million and 1000 million lookups per second for sparse and dense keys, respectively, on a large dataset of 64 million 32-bit keys.
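
    For readers unfamiliar with the two query types being benchmarked, the sketch below shows their semantics against a sorted-array baseline on the CPU; it says nothing about the GPU radix-tree implementation itself, and the keys and payloads are made up.

        from bisect import bisect_left, bisect_right

        keys = sorted([3, 17, 42, 99, 256, 1024])          # hypothetical 32-bit keys
        values = {k: f"row-{k}" for k in keys}              # hypothetical payloads

        def point_query(key):
            i = bisect_left(keys, key)
            return values[key] if i < len(keys) and keys[i] == key else None

        def range_query(lo, hi):
            """All payloads with lo <= key <= hi."""
            return [values[k] for k in keys[bisect_left(keys, lo):bisect_right(keys, hi)]]

        print(point_query(42))       # row-42
        print(range_query(10, 300))  # rows for 17, 42, 99 and 256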

  13. Neutron metrology file NMF-90. An integrated database for performing neutron spectrum adjustment calculations

    International Nuclear Information System (INIS)

    Kocherov, N.P.

    1996-01-01

    The Neutron Metrology File NMF-90 is an integrated database for performing neutron spectrum adjustment (unfolding) calculations. It contains 4 different adjustment codes, the dosimetry reaction cross-section library IRDF-90/NMF-G with covariance files, 6 input data sets for reactor benchmark neutron fields and a number of utility codes for processing and plotting the input and output data. The package consists of 9 PC HD diskettes and manuals for the codes. It is distributed by the Nuclear Data Section of the IAEA on request free of charge. About 10 MB of disk space is needed to install and run a typical reactor neutron dosimetry unfolding problem. (author). 8 refs

  14. A database for CO2 Separation Performances of MOFs based on Computational Materials Screening.

    Science.gov (United States)

    Altintas, Cigdem; Avci, Gokay; Daglar, Hilal; Nemati Vesali Azar, Ayda; Velioglu, Sadiye; Erucar, Ilknur; Keskin, Seda

    2018-05-03

    Metal organic frameworks (MOFs) have been considered great candidates for CO2 capture. Considering the very large number of available MOFs, high-throughput computational screening plays a critical role in identifying the top performing materials for target applications in a time-effective manner. In this work, we used molecular simulations to screen the most recent and complete MOF database for identifying the most promising materials for CO2 separation from flue gas (CO2/N2) and landfill gas (CO2/CH4) under realistic operating conditions. We first validated our approach by comparing the results of our molecular simulations for the CO2 uptakes, CO2/N2 and CO2/CH4 selectivities of various types of MOFs with the available experimental data. We then computed binary CO2/N2 and CO2/CH4 mixture adsorption data for the entire MOF database and used these results to calculate several adsorbent selection metrics such as selectivity, working capacity, adsorbent performance score, regenerability, and separation potential. MOFs were ranked based on the combination of these metrics and the top performing MOF adsorbents that can achieve CO2/N2 and CO2/CH4 separations with high performance were identified. Molecular simulations for the adsorption of a ternary CO2/N2/CH4 mixture were performed for these top materials in order to provide a more realistic performance assessment of MOF adsorbents. Structure-performance analysis showed that MOFs with ΔQ > 30 kJ/mol, 3.8 Å ≤ PLD ≤ 5 Å, 5 Å ≤ LCD ≤ 7.5 Å, 0.5 ≤ ϕ ≤ 0.75, SA ≤ 1,000 m2/g, and ρ > 1 g/cm3 are the best candidates for selective separation of CO2 from flue gas and landfill gas. This information will be very useful to design novel MOFs with the desired structural features that can lead to high CO2 separation potentials. Finally, an online, freely accessible database https://cosmoserc.ku.edu.tr was established, for the first time in the literature, which reports all computed adsorbent metrics of 3,816 MOFs for CO2/N2, CO2/CH4
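
    The adsorbent selection metrics named above are commonly defined from binary mixture uptakes; the sketch below uses those common textbook definitions with made-up uptakes for a 15:85 CO2/N2 flue-gas case. The definitions and numbers are assumptions, not values taken from the database.

        def adsorbent_metrics(n_co2_ads, n_gas_ads, n_co2_des, y_co2=0.15, y_gas=0.85):
            """Common adsorbent selection metrics from mixture uptakes in mol/kg (assumed definitions)."""
            selectivity = (n_co2_ads / n_gas_ads) / (y_co2 / y_gas)   # adsorption selectivity
            working_capacity = n_co2_ads - n_co2_des                  # CO2 uptake difference, adsorption vs desorption
            aps = selectivity * working_capacity                      # adsorbent performance score
            regenerability = 100.0 * working_capacity / n_co2_ads     # R (%)
            return selectivity, working_capacity, aps, regenerability

        s, wc, aps, r = adsorbent_metrics(n_co2_ads=2.8, n_gas_ads=0.4, n_co2_des=0.9)
        print(f"S = {s:.1f}, working capacity = {wc:.2f} mol/kg, APS = {aps:.1f}, R = {r:.0f}%")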

  15. The Impact of Data-Based Science Instruction on Standardized Test Performance

    Science.gov (United States)

    Herrington, Tia W.

    Increased teacher accountability efforts have resulted in the use of data to improve student achievement. This study addressed teachers' inconsistent use of data-driven instruction in middle school science. Evidence of the impact of data-based instruction on student achievement and school and district practices has been well documented by researchers. In science, less information has been available on teachers' use of data for classroom instruction. Drawing on data-driven decision making theory, the purpose of this study was to examine whether data-based instruction impacted performance on the science Criterion Referenced Competency Test (CRCT) and to explore the factors that impeded its use by a purposeful sample of 12 science teachers at a data-driven school. The research questions addressed in this study included understanding: (a) the association between student performance on the science portion of the CRCT and data-driven instruction professional development, (b) middle school science teachers' perception of the usefulness of data, and (c) the factors that hindered the use of data for science instruction. This study employed a mixed methods sequential explanatory design. Data collected included 8th grade CRCT data, survey responses, and individual teacher interviews. A chi-square test revealed no improvement in the CRCT scores following the implementation of professional development on data-driven instruction (χ2(1) = .183, p = .67). Results from surveys and interviews revealed that teachers used data to inform their instruction, indicating time as the major hindrance to their use. Implications for social change include the development of lesson plans that will empower science teachers to deliver data-based instruction and students to achieve identified academic goals.
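
    The reported comparison is a standard chi-square test; the sketch below reproduces that type of test with scipy on a hypothetical 2x2 pass/fail table. The counts are invented solely to show the mechanics, not the study's data.

        from scipy.stats import chi2_contingency

        #                pass  fail
        observed = [[112, 48],   # before the data-driven instruction professional development
                    [118, 45]]   # after the professional development

        chi2, p, dof, expected = chi2_contingency(observed)
        print(f"chi2({dof}) = {chi2:.3f}, p = {p:.2f}")   # a non-significant result, as in the study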

  16. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  17. Performance of a TV white space database with different terrain resolutions and propagation models

    Directory of Open Access Journals (Sweden)

    A. M. Fanan

    2017-11-01

    Full Text Available Cognitive Radio has now become a realistic option for the solution of the spectrum scarcity problem in wireless communication. TV channels (the primary user) can be protected from secondary-user interference by accurate prediction of TV White Spaces (TVWS) by using appropriate propagation modelling. In this paper we address two related aspects of channel occupancy prediction for cognitive radio. Firstly, we investigate the best combination of empirical propagation model and spatial resolution of terrain data for predicting TVWS by examining the performance of three propagation models (Extended-Hata, Davidson-Hata and Egli) in the TV band 470 to 790 MHz along with terrain data resolutions of 1000, 100 and 30 m, when compared with a comprehensive set of propagation measurements taken in randomly-selected locations around Hull, UK. Secondly, we describe how such models can be integrated into a database-driven tool for cognitive radio channel selection within the TVWS environment.
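
    A database-driven channel decision of the kind described reduces to predicting the received TV signal at a location and comparing it with a protection threshold. The sketch below does this with free-space path loss as a deliberately simplified stand-in for the Extended-Hata, Davidson-Hata and Egli models; the transmitter parameters and threshold are assumptions.

        import math

        def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
            """Free-space path loss; a simplified stand-in for the empirical models in the paper."""
            return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

        def channel_is_white_space(tx_power_dbm, distance_km, freq_mhz, protection_dbm=-84.0):
            """The channel is usable if the predicted TV signal here is below the protection threshold."""
            received_dbm = tx_power_dbm - free_space_path_loss_db(distance_km, freq_mhz)
            return received_dbm < protection_dbm

        # Hypothetical DTT transmitter: 80 dBm EIRP on 650 MHz, 60 km from the secondary user.
        print(channel_is_white_space(tx_power_dbm=80.0, distance_km=60.0, freq_mhz=650.0))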

  18. JNC thermodynamic database for performance assessment of high-level radioactive waste disposal system

    Energy Technology Data Exchange (ETDEWEB)

    Yui, Mikazu; Azuma, Jiro; Shibata, Masahiro [Japan Nuclear Cycle Development Inst., Tokai Works, Waste Isolation Research Division, Tokai, Ibaraki (Japan)

    1999-11-01

    This report is a summary of status, frozen datasets, and future tasks of the JNC (Japan Nuclear Cycle Development Institute) thermodynamic database (JNC-TDB) for assessing performance of high-level radioactive waste in geological environments. The JNC-TDB development was carried out after the first progress report on geological disposal research in Japan (H-3). In the development, thermodynamic data (equilibrium constants at 25 °C, I=0) for important radioactive elements were selected/determined based on original experimental data using different models (e.g., SIT, Pitzer). As a result, the reliability and traceability of the data for most of the important elements were improved over those of the PNC-TDB used in the H-3 report. For detailed information of data analysis and selections for each element, see the JNC technical reports listed in this document. (author)

  19. Comparison of performance indicators of different types of reactors based on ISOE database

    International Nuclear Information System (INIS)

    Janzekovic, H.; Krizman, M.

    2005-01-01

    The optimisation of the operation of a nuclear power plant (NPP) is a challenging issue, since besides general management issues the risk associated with nuclear facilities has to be taken into account. In order to optimise the radiation protection programmes in the roughly 440 reactors in operation, with more than 500 000 monitored workers each year, the international exchange of performance indicators (PI) related to radiation protection issues seems to be essential. Those indicators are a function of the type of reactor as well as the age and the quality of the management of the reactor. In general, three main types of radiation protection PI can be recognised: occupational exposure of workers, public exposure, and management of PI related to radioactive waste. Occupational exposure can be studied efficiently using the ISOE database. The dependence of occupational exposure on different types of reactors, e.g. PWR and BWR, is given, analysed and compared. (authors)

  20. Performance Improvement with Web Based Database on Library Information System of Smk Yadika 5

    Directory of Open Access Journals (Sweden)

    Pualam Dipa Nusantara

    2015-12-01

    Full Text Available The difficulty of managing book collection data in the library is a problem often faced by librarians, and it affects the quality of service. Collections were arranged and recorded in separate Word and Excel files, and the handling of borrowing and returning transactions had no integrated records. The library system described here can manage the book collection and can reduce the problems often experienced by library staff when serving students borrowing books, in particular the frequent difficulty of managing books that are still on loan. The system will also record the late fees and lost-book charges incurred by students (borrowers). The conclusion of this study is that library performance can be improved with a library system using a web database.

  1. Psychophysical studies of the performance of an image database retrieval system

    Science.gov (United States)

    Papathomas, Thomas V.; Conway, Tiffany E.; Cox, Ingemar J.; Ghosn, Joumana; Miller, Matt L.; Minka, Thomas P.; Yianilos, Peter N.

    1998-07-01

    We describe psychophysical experiments conducted to study PicHunter, a content-based image retrieval (CBIR) system. Experiment 1 studies the importance of using (1) semantic information, (2) memory of earlier input and (3) relative, rather than absolute, judgements of image similarity. The target testing paradigm is used in which a user must search for an image identical to a target. We find that the best performance comes from a version of PicHunter that uses only semantic cues, with memory and relative similarity judgements. Second best is use of both pictorial and semantic cues, with memory and relative similarity judgements. Most reports of CBIR systems provide only qualitative measures of performance based on how similar retrieved images are to a target. Experiment 2 puts PicHunter into this context with a more rigorous test. We first establish a baseline for our database by measuring the time required to find an image that is similar to a target when the images are presented in random order. Although PicHunter's performance is measurably better than this, the test is weak because even random presentation of images yields reasonably short search times. This casts doubt on the strength of results given in other reports where no baseline is established.

  2. CERN Library - Reduction of subscriptions to scientific journals

    CERN Multimedia

    2005-01-01

    The Library Working Group for Acquisitions has identified some scientific journal subscriptions as candidates for cancellation. Although the 2005 budget is unchanged with respect to 2004 thanks to the efforts of the Management, it does not take account of inflation, which for many years has been much higher for scientific literature than the normal cost-of-living index. For 2006, the inflation rate is estimated to be 7-8%. Moreover, the Library does not only intend to compensate for the loss of purchasing power but also to make available some funds to promote new Open Access publishing models. (See Bulletin No.15/2005) The list of candidates can be found on the Library homepage (http://library.cern.ch/). In addition, some subscriptions will be converted to online-only, i.e. CERN will no longer order the print version of certain journals. We invite users to carefully check the list (http://library.cern.ch/). Comments on this proposal should be sent to the WGA Chairman, Rudiger Voss, with a copy to the Hea...

  3. Differentiation of several interstitial lung disease patterns in HRCT images using support vector machine: role of databases on performance

    Science.gov (United States)

    Kale, Mandar; Mukhopadhyay, Sudipta; Dash, Jatindra K.; Garg, Mandeep; Khandelwal, Niranjan

    2016-03-01

    Interstitial lung disease (ILD) is a complicated group of pulmonary disorders. High Resolution Computed Tomography (HRCT) is considered to be the best imaging technique for analysis of different pulmonary disorders. HRCT findings can be categorised into several patterns, viz. Consolidation, Emphysema, Ground Glass Opacity, Nodular, Normal, etc., based on their texture-like appearance. Clinicians often find it difficult to diagnose these patterns because of their complex nature. In such scenarios a computer-aided diagnosis system could help clinicians to identify the patterns. Several approaches have been proposed for classification of ILD patterns. These include computation of textural features and training/testing of classifiers such as artificial neural networks (ANN), support vector machines (SVM), etc. In this paper, wavelet features are calculated from two different ILD databases, the publicly available MedGIFT ILD database and a private ILD database, followed by performance evaluation of ANN and SVM classifiers in terms of average accuracy. It is found that the average classification accuracy of SVM is greater than that of ANN when trained and tested on the same database. The investigation continued further to test the variation in accuracy when training and testing are performed with alternate databases, and when the classifiers are trained and tested with a database formed by merging samples from the same class from the two individual databases. The average classification accuracy drops when two independent databases are used for training and testing respectively. There is a significant improvement in average accuracy when the classifiers are trained and tested with the merged database. This indicates the dependency of classification accuracy on the training data. It is observed that SVM outperforms ANN when the same database is used for training and testing.
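
    The same-database versus cross-database protocol described above can be sketched with scikit-learn; synthetic vectors stand in for the wavelet features, so the numbers mean nothing beyond showing the mechanics of the comparison.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)

        def synthetic_db(n=200, shift=0.0):
            """Stand-in for wavelet feature vectors from one ILD database (two classes)."""
            X = rng.normal(shift, 1.0, size=(n, 16))
            y = rng.integers(0, 2, size=n)
            X[y == 1] += 1.5                    # make the classes separable
            return X, y

        X_a, y_a = synthetic_db()               # e.g. a MedGIFT-like database
        X_b, y_b = synthetic_db(shift=0.5)      # e.g. a private database with a distribution shift

        X_tr, X_te, y_tr, y_te = train_test_split(X_a, y_a, test_size=0.3, random_state=0)
        for name, clf in [("SVM", SVC(kernel="rbf")), ("ANN", MLPClassifier(max_iter=1000))]:
            clf.fit(X_tr, y_tr)
            same = accuracy_score(y_te, clf.predict(X_te))    # same-database test split
            cross = accuracy_score(y_b, clf.predict(X_b))     # alternate-database test
            print(f"{name}: same-database accuracy = {same:.2f}, cross-database accuracy = {cross:.2f}")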

  4. Performance monitoring in hip fracture surgery--how big a database do we really need?

    Science.gov (United States)

    Edwards, G A D; Metcalfe, A J; Johansen, A; O'Doherty, D

    2010-04-01

    Systems for collecting information about patient care are increasingly common in orthopaedic practice. Databases can allow various comparisons to be made over time, and significant decisions regarding service delivery and clinical practice may be made on the basis of their results. We set out to determine the number of cases needed for comparison of 30-day mortality, inpatient wound infection rates and mean hospital length of stay, with a power of 80% for the demonstration of an effect at a given significance level, using data on 1050 hip fracture patients admitted to a city teaching hospital. Detection of a 10% difference in 30-day mortality would require 14,065 patients in each arm of any comparison, and demonstration of a 50% difference would require 643 patients in each arm; for wound infections, demonstration of a 10% difference in incidence would require 23,921 patients in each arm and 1127 patients for demonstration of a 50% difference; for length of stay, a difference of 10% would require 1479 patients and 6660 patients for a 50% difference. This study demonstrates the importance of considering population sizes before comparisons are made on the basis of basic hip fracture outcome data. Our data also help illustrate the impact of sample size considerations when interpreting the results of performance monitoring. Many researchers will be familiar with the fact that rare outcomes such as inpatient mortality or wound infection require large sample sizes before differences can be reliably demonstrated between populations; this study gives actual figures that researchers could use when planning studies. Statistically meaningful analyses will only be possible with major multi-centre collaborations, as will be possible if hospital Trusts participate in the National Hip Fracture Database. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
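
    Arm sizes of this kind come from a standard two-proportion sample-size formula. The sketch below shows that type of calculation for a hypothetical 10% baseline 30-day mortality; the baseline rate, alpha, power and normal-approximation method are assumptions, so the outputs will not exactly match the figures quoted in the abstract.

        from math import sqrt
        from scipy.stats import norm

        def n_per_arm(p1, p2, alpha=0.05, power=0.80):
            """Per-arm sample size for comparing two proportions (normal approximation)."""
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(power)
            p_bar = (p1 + p2) / 2
            numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                         + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
            return numerator / (p1 - p2) ** 2

        # Hypothetical 10% baseline mortality; relative reductions of 10% and 50%.
        print(round(n_per_arm(0.10, 0.09)))   # a small difference needs very large arms
        print(round(n_per_arm(0.10, 0.05)))   # a large difference needs far smaller arms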

  5. An Experiment Illustrating the Change in Ligand pKa upon Protein Binding

    Science.gov (United States)

    Chenprakhon, Pirom; Panijpan, Bhinyo; Chaiyen, Pimchai

    2012-01-01

    The modulation of ligand pKa due to its surrounding environment is a crucial feature that controls many biological phenomena. For example, the shift in the pKa of substrates or catalytic residues at enzyme active sites upon substrate binding often triggers and controls enzymatic reactions. In this work, we developed an…

  6. 11 CFR 100.111 - Gift, subscription, loan, advance or deposit of money.

    Science.gov (United States)

    2010-01-01

    11 CFR, Federal Elections (2010-01-01 edition): DEFINITIONS (2 U.S.C. 431), Definition of Expenditure (2 U.S.C. 431(9)). § 100.111 Gift, subscription, loan, advance or deposit of money. (a) A purchase, payment, distribution, loan (except for a loan made in...

  7. OLIO+: an osteopathic medicine database.

    Science.gov (United States)

    Woods, S E

    1991-01-01

    OLIO+ is a bibliographic database designed to meet the information needs of the osteopathic medical community. Produced by the American Osteopathic Association (AOA), OLIO+ is devoted exclusively to the osteopathic literature. The database is available only by subscription through AOA and may be accessed from any data terminal with modem or IBM-compatible personal computer with telecommunications software that can emulate VT100 or VT220. Apple access is also available, but some assistance from OLIO+ support staff may be necessary to modify the Apple keyboard.

  8. Managing without a subscription agent: the experience of doing it yourself

    Directory of Open Access Journals (Sweden)

    Lisa Lovén

    2017-11-01

    Full Text Available In October 2015 Stockholm University Library (SUB) decided to no longer use the services of a subscription agent for managing individual journal subscriptions. Instead, SUB has taken a do-it-yourself (DIY) approach to subscriptions management and now renews and orders new journals directly from each publisher. In the light of two years of experience, this article discusses the key findings of this new way of working with subscriptions, the differences between the first and second year of renewing directly with publishers and the pros and cons of not using an agent. The article ends with a few recommendations and things for other libraries to consider before making the decision to do without a subscription agent and explains why SUB has decided to continue with the DIY approach. Based on a breakout session presented at the 40th UKSG Annual Conference, Harrogate, April 2017.

  9. MTF Database: A Repository of Students' Academic Performance Measurements for the Development of Techniques for Evaluating Team Functioning

    Science.gov (United States)

    Hsiung, Chin-Min; Zheng, Xiang-Xiang

    2015-01-01

    The Measurements for Team Functioning (MTF) database contains a series of student academic performance measurements obtained at a national university in Taiwan. The measurements are acquired from unit tests and homework tests performed during a core mechanical engineering course, and provide an objective means of assessing the functioning of…

  10. Performance analysis of a real-time database with optimistic concurrency control

    NARCIS (Netherlands)

    Sassen, S.A.E.; Wal, van der J.

    1997-01-01

    For a real-time shared-memory database with Optimistic Concurrency Control (OCC), an approximation for the transaction response-time distribution and thus for the deadline miss probability is obtained. Transactions arrive at the database according to a Poisson process. There is a limited number of
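
    The deadline miss probability being approximated can also be estimated by simulation. The sketch below is a deliberately crude Monte Carlo model of optimistic concurrency control: each attempt runs for an exponential time, is validated against the commits that occurred during that attempt, and restarts on conflict. All rates, access-set sizes and modelling shortcuts are assumptions, not the paper's analytical model.

        import math
        import random

        random.seed(1)
        LAM, MEAN_EXEC, DEADLINE = 40.0, 0.02, 0.10   # arrival rate (tx/s), mean execution time (s), deadline (s)
        D_ITEMS, K_ITEMS, N_TX = 1000, 8, 20000       # database size, items touched per tx, simulated transactions

        # Probability that two random K-item access sets overlap (validation fails against one committer).
        p_conflict = 1.0 - math.comb(D_ITEMS - K_ITEMS, K_ITEMS) / math.comb(D_ITEMS, K_ITEMS)

        def response_time():
            """Time until a transaction finally commits, restarting whenever validation fails."""
            total = 0.0
            while True:
                exec_time = random.expovariate(1.0 / MEAN_EXEC)
                total += exec_time
                # Conflicting commits during this attempt ~ Poisson(LAM * exec_time * p_conflict);
                # the attempt passes validation only if there were none.
                if random.random() < math.exp(-LAM * exec_time * p_conflict):
                    return total

        misses = sum(response_time() > DEADLINE for _ in range(N_TX))
        print(f"estimated deadline miss probability: {misses / N_TX:.3f}")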

  11. Update of a thermodynamic database for radionuclides to assist solubility limits calculation for performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Duro, L.; Grive, M.; Cera, E.; Domenech, C.; Bruno, J. (Enviros Spain S.L., Barcelona (ES))

    2006-12-15

    This report presents and documents the thermodynamic database used in the assessment of the radionuclide solubility limits within the SR-Can Exercise. It is a supporting report to the solubility assessment. Thermodynamic data are reviewed for 20 radioelements from Groups A and B, lanthanides and actinides. The development of this database is partially based on the one prepared by PSI and NAGRA. Several changes, updates and checks for internal consistency and completeness to the reference NAGRA-PSI 01/01 database have been conducted when needed. These modifications are mainly related to the information from the various experimental programmes and scientific literature available until the end of 2003. Some of the discussions also refer to a previous database selection conducted by Enviros Spain on behalf of ANDRA, where the reader can find additional information. When possible, in order to optimize the robustness of the database, the description of the solubility of the different radionuclides calculated by using the reported thermodynamic database is tested against experimental data available in the open scientific literature. When necessary, different procedures to estimate gaps in the database have been followed, especially accounting for temperature corrections. All the methodologies followed are discussed in the main text

  12. Update of a thermodynamic database for radionuclides to assist solubility limits calculation for performance assessment

    International Nuclear Information System (INIS)

    Duro, L.; Grive, M.; Cera, E.; Domenech, C.; Bruno, J.

    2006-12-01

    This report presents and documents the thermodynamic database used in the assessment of the radionuclide solubility limits within the SR-Can Exercise. It is a supporting report to the solubility assessment. Thermodynamic data are reviewed for 20 radioelements from Groups A and B, lanthanides and actinides. The development of this database is partially based on the one prepared by PSI and NAGRA. Several changes, updates and checks for internal consistency and completeness to the reference NAGRA-PSI 01/01 database have been conducted when needed. These modifications are mainly related to the information from the various experimental programmes and scientific literature available until the end of 2003. Some of the discussions also refer to a previous database selection conducted by Enviros Spain on behalf of ANDRA, where the reader can find additional information. When possible, in order to optimize the robustness of the database, the description of the solubility of the different radionuclides calculated by using the reported thermodynamic database is tested against experimental data available in the open scientific literature. When necessary, different procedures to estimate gaps in the database have been followed, especially accounting for temperature corrections. All the methodologies followed are discussed in the main text

  13. Collecting Taxes Database

    Data.gov (United States)

    US Agency for International Development — The Collecting Taxes Database contains performance and structural indicators about national tax systems. The database contains quantitative revenue performance...

  14. Comparative performance measures of relational and object-oriented databases using High Energy Physics data

    International Nuclear Information System (INIS)

    Marstaller, J.

    1993-12-01

    The major experiments at the SSC are expected to produce up to 1 Petabyte of data per year. The use of database techniques can significantly reduce the time it takes to access data. The goal of this project was to test which underlying data model, the relational or the object-oriented, would be better suited for archiving and accessing high energy physics data. We describe the relational and the object-oriented data model and their implementation in commercial database management systems. To determine scalability we tested both implementations for 10-MB and 100-MB databases using storage and timing criteria

  15. Preparing College Students To Search Full-Text Databases: Is Instruction Necessary?

    Science.gov (United States)

    Riley, Cheryl; Wales, Barbara

    Full-text databases allow Central Missouri State University's clients to access some of the serials that libraries have had to cancel due to escalating subscription costs; EbscoHost, the subject of this study, is one such database. The database is available free to all Missouri residents. A survey was designed consisting of 21 questions intended…

  16. Overall models and experimental database for UO2 and MOX fuel increasing performance

    International Nuclear Information System (INIS)

    Bernard, L.C.; Blanpain, P.

    2001-01-01

    Framatome steady-state fission gas release database includes more than 290 fuel rods irradiated in commercial and experimental reactors with rod average burnups up to 67 GWd/tM. The transient database includes close to 60 fuel rods with burnups up to 62 GWd/tM. The hold time for these rods ranged from several minutes to many hours and the linear heat generation rates ranged from 30 kW/m to 50 kW/m. The quality of the fission gas release model is state-of-the-art as the uncertainty of the model is comparable to other code models. Framatome is also greatly concerned with the MOX fuel performance and modeling given that, since 1997, more than 1500 MOX fuel assemblies have been delivered to French and foreign PWRs. The paper focuses on the significant data acquired through surveillance and analytical programs used for the validation and the improvement of the MOX fuel modeling. (author)

  17. JCZS: An Intermolecular Potential Database for Performing Accurate Detonation and Expansion Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Baer, M.R.; Hobbs, M.L.; McGee, B.C.

    1998-11-03

    Exponential-13,6 (EXP-13,6) potential parameters for 750 gases composed of 48 elements were determined and assembled in a database, referred to as the JCZS database, for use with the Jacobs Cowperthwaite Zwisler equation of state (JCZ3-EOS). The EXP-13,6 force constants were obtained by using literature values of Lennard-Jones (LJ) potential functions, by using corresponding states (CS) theory, by matching pure liquid shock Hugoniot data, and by using molecular volume to determine the approach radii with the well depth estimated from high-pressure isentropes. The JCZS database was used to accurately predict detonation velocity, pressure, and temperature for 50 different explosives with initial densities ranging from 0.25 g/cm3 to 1.97 g/cm3. Accurate predictions were also obtained for pure liquid shock Hugoniots, static properties of nitrogen, and gas detonations at high initial pressures.

  18. Database of Low-e Storm Window Energy Performance across U.S. Climate Zones

    Energy Technology Data Exchange (ETDEWEB)

    Culp, Thomas D.; Cort, Katherine A.

    2014-09-04

    This is an update of a report that describes the process, assumptions, and modeling results produced to create a Database of U.S. Climate-Based Analysis for Low-E Storm Windows. The scope of the overall effort is to develop a database of energy savings and cost effectiveness of low-E storm windows in residential homes across a broad range of U.S. climates using the National Energy Audit Tool (NEAT) and RESFEN model calculations. This report includes a summary of the results, NEAT and RESFEN background, methodology, and input assumptions, and an appendix with detailed results and assumptions by climate zone.

  19. High performance technique for database applicationsusing a hybrid GPU/CPU platform

    KAUST Repository

    Zidan, Mohammed A.; Bonny, Talal; Salama, Khaled N.

    2012-01-01

    Hybrid GPU/CPU platform. In particular, our technique solves the problem of the low efficiency resulting from running short-length sequences in a database on a GPU. To verify our technique, we applied it to the widely used Smith-Waterman algorithm

  20. Plan for Developing a Materials Performance Database for the Texas Department of Transportation

    Science.gov (United States)

    1999-09-01

    The materials used within the Texas Department of Transportation (TxDOT) are undergoing a period of change. The purpose of this report is to develop the information necessary to develop (for TxDOT) a method or a database for monitoring the performanc...

  1. Biogas composition and engine performance, including database and biogas property model

    NARCIS (Netherlands)

    Bruijstens, A.J.; Beuman, W.P.H.; Molen, M. van der; Rijke, J. de; Cloudt, R.P.M.; Kadijk, G.; Camp, O.M.G.C. op den; Bleuanus, W.A.J.

    2008-01-01

    In order to enable an evaluation of the current biogas quality situation in the EU, results are presented in a biogas database. Furthermore, the key gas parameter Sonic Bievo Index (influence on open-loop A/F ratio) is defined and other key gas parameters like the Methane Number (knock resistance)

  2. Global and regional emissions estimates of 1,1-difluoroethane (HFC-152a, CH3CHF2) from in situ and air archive observations

    OpenAIRE

    Prinn, Ronald G.

    2015-01-01

    High frequency, in situ observations from 11 globally distributed sites for the period 1994–2014 and archived air measurements dating from 1978 onward have been used to determine the global growth rate of 1,1-difluoroethane (HFC-152a, CH3CHF2). These observations have been combined with a range of atmospheric transport models to derive global emission estimates in a top-down approach. HFC-152a is a greenhouse gas with a short atmospheric lifetime of about 1.5 years. Si...

  3. A selected thermodynamic database for REE to be used in HLNW performance assessment exercises

    Energy Technology Data Exchange (ETDEWEB)

    Spahiu, K; Bruno, J [MBT Tecnologia Ambiental, Cerdanyola (Spain)

    1995-01-01

    A selected thermodynamic database for the Rare Earth Elements (REE) to be used in the safety assessment of high-level nuclear waste deposition has been compiled. Thermodynamic data for the aqueous species of the REE with the most important ligands relevant for granitic groundwater conditions have been selected and validated. The dominant soluble species under repository conditions are the carbonate complexes of REE. The solubilities of the oxides, hydroxides, carbonates, hydroxycarbonates, phosphates and other important solids have been selected and validated. Solubilities and solubility limiting solids in repository conditions have been estimated with the selected database. At the initial stages of fuel dissolution, the UO2 matrix dissolution will determine the concentrations of REE. Later on, solid phosphates, hydroxycarbonates and carbonates may limit their solubility. Recommendations for further studies on important systems in repository conditions have been presented. 136 refs, 13 figs, 16 tabs.

  4. High Performance Protein Sequence Database Scanning on the Cell Broadband Engine

    Directory of Open Access Journals (Sweden)

    Adrianto Wirawan

    2009-01-01

    Full Text Available The enormous growth of biological sequence databases has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing rapidly as well. The recent emergence of low cost parallel multicore accelerator technologies has made it possible to reduce execution times of many bioinformatics applications. In this paper, we demonstrate how the Cell Broadband Engine can be used as a computational platform to accelerate two approaches for protein sequence database scanning: exhaustive and heuristic. We present efficient parallelization techniques for two representative algorithms: the dynamic programming based Smith–Waterman algorithm and the popular BLASTP heuristic. Their implementation on a Playstation®3 leads to significant runtime savings compared to corresponding sequential implementations.
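
    For reference, the recurrence that both the exhaustive Smith–Waterman scan and its accelerated implementations build on is small enough to state in a few lines of scalar Python; the scoring parameters and toy sequences below are illustrative only.

        def smith_waterman(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
            """Best local alignment score between sequences a and b (scalar dynamic programming)."""
            rows, cols = len(a) + 1, len(b) + 1
            h = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = h[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    h[i][j] = max(0, diag, h[i - 1][j] + gap, h[i][j - 1] + gap)
                    best = max(best, h[i][j])
            return best

        print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))   # toy protein sequences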

  5. A selected thermodynamic database for REE to be used in HLNW performance assessment exercises

    International Nuclear Information System (INIS)

    Spahiu, K.; Bruno, J.

    1995-01-01

    A selected thermodynamic database for the Rare Earth Elements (REE) to be used in the safety assessment of high-level nuclear waste deposition has been compiled. Thermodynamic data for the aqueous species of the REE with the most important ligands relevant for granitic groundwater conditions have been selected and validated. The dominant soluble species under repository conditions are the carbonate complexes of REE. The solubilities of the oxides, hydroxides, carbonates, hydroxycarbonates, phosphates and other important solids have been selected and validated. Solubilities and solubility limiting solids in repository conditions have been estimated with the selected database. At the initial stages of fuel dissolution, the UO2 matrix dissolution will determine the concentrations of REE. Later on, solid phosphates, hydroxycarbonates and carbonates may limit their solubility. Recommendations for further studies on important systems in repository conditions have been presented. 136 refs, 13 figs, 16 tabs

  6. Trials by Juries: Suggested Practices for Database Trials

    Science.gov (United States)

    Ritterbush, Jon

    2012-01-01

    Librarians frequently utilize product trials to assess the content and usability of a database prior to committing funds to a new subscription or purchase. At the 2012 Electronic Resources and Libraries Conference in Austin, Texas, three librarians presented a panel discussion on their institutions' policies and practices regarding database…

  7. Synthesis and Characterization of a Perovskite Barium Zirconate (BaZrO3): An Experiment for an Advanced Inorganic Chemistry Laboratory

    Science.gov (United States)

    Thananatthanachon, Todsapon

    2016-01-01

    In this experiment, the students explore the synthesis of a crystalline solid-state material, barium zirconate (BaZrO3), by two different synthetic methods: (a) the wet chemical method using BaCl2·2H2O and ZrOCl2·8H2O as the precursors, and (b) the solid-state reaction from BaCO3 and…

  8. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed, provided data copies are located close to clients. Despite its advantages, replication is not a straightforward technique to apply, and

  9. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects–helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design–without changing semantics. You’ll learn how to evolve database schemas in step with source code–and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...

  10. Development of database and QA systems for post closure performance assessment on a potential HLW repository

    International Nuclear Information System (INIS)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H.

    2002-01-01

    In the TSPA of long-term post-closure radiological safety for permanent disposal of HLW in Korea, appropriate management of input and output data through QA is necessary. A robust QA system has been developed using the T2R3 principles applicable to the five major steps in R&D. The proposed system is implemented as a web-based system so that all participants in the TSPA are able to access it. In addition, an internet-based input database for TSPA has been developed. Currently, data from literature surveys, domestic laboratory and field experiments, as well as expert elicitation, are applied for TSPA

  11. Patterns of Undergraduates' Use of Scholarly Databases in a Large Research University

    Science.gov (United States)

    Mbabu, Loyd Gitari; Bertram, Albert; Varnum, Ken

    2013-01-01

    Authentication data was utilized to explore undergraduate usage of subscription electronic databases. These usage patterns were linked to the information literacy curriculum of the library. The data showed that out of the 26,208 enrolled undergraduate students, 42% of them accessed a scholarly database at least once in the course of the entire…

  12. Leading product-related environmental performance indicators: a selection guide and database

    DEFF Research Database (Denmark)

    Issa, Isabela I.; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    2015-01-01

    Ecodesign is a proactive environmental management and improvement approach employed in the product development process, which aims to minimize the environmental impacts caused during a product's life cycle and thus improve its environmental performance. The establishment of measurable environmental...... in the selection and application of environmental performance indicators - a more structured approach is still lacking. This paper presents the efforts made to identify and systematize existing leading product-related environmental performance indicators, based on a systematic literature review, and to develop...

  13. Development and Exploration of a Regional Stormwater BMP Performance Database to Parameterize an Integrated Decision Support Tool (i-DST)

    Science.gov (United States)

    Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.

    2017-12-01

    Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
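
    As a hedged illustration of the regionalization test described above (the column names and numbers are hypothetical, not the i-DST data), the share of variance in BMP pollutant-removal performance explained by a candidate regional grouping can be computed as the between-region sum of squares divided by the total sum of squares:

```python
import pandas as pd

# Hypothetical BMP performance records: percent pollutant removal per study site.
bmp = pd.DataFrame({
    "region":      ["SE", "SE", "SW", "SW", "NW", "NW"],
    "removal_pct": [62.0, 58.0, 41.0, 45.0, 70.0, 66.0],
})

grand_mean = bmp["removal_pct"].mean()
total_ss = ((bmp["removal_pct"] - grand_mean) ** 2).sum()

# Between-region sum of squares: how much of the spread the grouping captures.
group = bmp.groupby("region")["removal_pct"]
between_ss = (group.count() * (group.mean() - grand_mean) ** 2).sum()

print(f"variance explained by this regionalization: {between_ss / total_ss:.2f}")
```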

  14. NoSQL database scaling

    OpenAIRE

    Žardin, Norbert

    2017-01-01

    NoSQL database scaling is a decision where system resources or financial expenses are traded for database performance or other benefits. By scaling a database, database performance and resource usage might increase or decrease, and such changes might have a negative impact on an application that uses the database. In this work it is analyzed how database scaling affects database resource usage and performance. As a result, calculations are obtained with which database scaling types and differe...

  15. Improving the Computational Performance of Ontology-Based Classification Using Graph Databases

    Directory of Open Access Journals (Sweden)

    Thomas J. Lampoltshammer

    2015-07-01

    The increasing availability of very high-resolution remote sensing imagery (i.e., from satellites, airborne laser scanning, or aerial photography) represents both a blessing and a curse for researchers. The manual classification of these images, or other similar geo-sensor data, is time-consuming and leads to subjective and non-deterministic results. Due to this fact, (semi-)automated classification approaches are in high demand in affected research areas. Ontologies provide a proper way of automated classification for various kinds of sensor data, including remotely sensed data. However, the processing of data entities—so-called individuals—is one of the most cost-intensive computational operations within ontology reasoning. Therefore, an approach based on graph databases is proposed to overcome the issue of high time consumption in the classification task. The introduced approach shifts the classification task from the classical Protégé environment and its common reasoners to the proposed graph-based approach. For validation, the authors tested the approach on a simulation scenario based on a real-world example. The results demonstrate a quite promising improvement of classification speed—up to 80,000 times faster than the Protégé-based approach.

  16. Motion database of disguised and non-disguised team handball penalty throws by novice and expert performers

    Directory of Open Access Journals (Sweden)

    Fabian Helm

    2017-12-01

    This article describes the motion database for a large sample (n = 2400) of 7-m penalty throws in team handball, including 1600 disguised throws. Throws were performed by both novice (n = 5) and expert (n = 5) penalty takers. The article reports the methods and materials used to capture the motion data. The database itself is accessible for download via the JLU Web Server and provides all raw files in a three-dimensional motion data format (.c3d). Additional information is given on the marker placement of the penalty taker, goalkeeper, and ball, together with details on the skill level and/or playing history of the expert group. The database was first used by Helm et al. (2017) [1] to investigate the kinematic patterns of disguised movements. Results of this analysis are reported and discussed in their article "Kinematic patterns underlying disguised movements: Spatial and temporal dissimilarity compared to genuine movement patterns" (doi:10.1016/j.humov.2017.05.010) [1]. Keywords: Motion capture data, Disguise, Expertise

  17. The Relationship between Searches Performed in Online Databases and the Number of Full-Text Articles Accessed: Measuring the Interaction between Database and E-Journal Collections

    Science.gov (United States)

    Lamothe, Alain R.

    2011-01-01

    The purpose of this paper is to report the results of a quantitative analysis exploring the interaction and relationship between the online database and electronic journal collections at the J. N. Desmarais Library of Laurentian University. A very strong relationship exists between the number of searches and the size of the online database…

  18. First Toronto Conference on Database Users. Systems that Enhance User Performance.

    Science.gov (United States)

    Doszkocs, Tamas E.; Toliver, David

    1987-01-01

    The first of two papers discusses natural language searching as a user performance enhancement tool, focusing on artificial intelligence applications for information retrieval and problems with natural language processing. The second presents a conceptual framework for further development and future design of front ends to online bibliographic…

  19. Immunoassay for Visualization of Protein-Protein Interactions on Ni-Nitrilotriacetate Support: Example of a Laboratory Exercise with Recombinant Heterotrimeric Gαi2β1γ2 Tagged by Hexahistidine from sf9 Cells

    Science.gov (United States)

    Bavec, Aljosa

    2004-01-01

    We have developed an "in vitro assay" for following the interaction between the [alpha][subscript i2] subunit and [beta][subscript 1[gamma]2] dimer from sf9 cells. This method is suitable for education purposes because it is easy, reliable, nonexpensive, can be applied for a big class of 20 students, and avoid the commonly used kinetic approach,…

  20. Independent Review of Mitigating System Performance Indicator Reporting in the EPIX Database

    Energy Technology Data Exchange (ETDEWEB)

    Wierman, Thomas Edward [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2009-05-01

    This report summarizes work done to verify the component, failure mode, and method of detection information provided in the Equipment Performance Information Exchange (EPIX) to support implementation of the Mitigating Systems Performance Indices (MSPI). The task was to select reports from EPIX and determine whether their categorization as MSPI or non-MSPI failures is consistent with the development of unreliability baseline failure rates, and whether this significantly affects estimates of plant risk. This review covers all MSPI devices in EPIX that were reported as failed. The components include emergency generators; motor-driven, turbine-driven, and engine-driven pumps; and air- and motor-operated valves. The date range for this report includes all reported MSPI device failures from 2003 through the most current EPIX data at the INL (up to the third quarter of 2008).

  1. Development of an Expanded, High Reliability Cost and Performance Database for In Situ Remediation Technologies

    Science.gov (United States)

    2016-03-01

    [Extraction residue from a results table omitted; it tabulated field measurements of total biomass (cells/mL, roughly 1.0E+4 to 2.7E+7), DNA (µg/L, roughly 0.04 to 108), and Dehalococcoides (cells/L).] Field measurements indicate that bioremediation processes could remain ongoing for some time at this site. Total biomass concentrations in the most recent sample... of medium to coarse-grained sand and crushed shells. The Pilot Test was performed with three injection wells and three extraction wells over a 20

  2. Sustainability Initiatives and Organizational Performance: An Analysis of Publications in the Web of Science Database

    Directory of Open Access Journals (Sweden)

    Eduardo Luís Hepper

    2016-07-01

    Brazil is going through a time of reflection about the preservation of natural resources, an issue that is increasingly present on its agenda. The search for balance between environmental, social and economic aspects has been a challenge for business survival over the years and has led companies to adopt initiatives focused on sustainability. The objective of this article is to analyse how the international scientific production addresses sustainable practices and initiatives and their relationship with organizational performance. Considering this scope, a bibliometric study of the publications indexed in the Web of Science - Social Sciences Citation Index (WoS-SSCI) was developed. Thirty-three articles on the subject were identified and selected. The journals that stand out in number of articles and number of citations are the Journal of Cleaner Production and the Strategic Management Journal, respectively. Analysing the results, a growing concern about this issue and an increase in publications were noticed after 2000. The results, in general, associate sustainable practices with positive organizational performance, such as increased profit on the product sold, quality improvement, improved reputation, and waste reduction, among other gains identified.

  3. Blockade of IP₃-Mediated SK Channel Signaling in the Rat Medial Prefrontal Cortex Improves Spatial Working Memory

    Science.gov (United States)

    Brennan, Avis R.; Dolinsky, Beth; Vu, Mai-Anh T.; Stanley, Marion; Yeckel, Mark F.; Arnsten, Amy F. T.

    2008-01-01

    Planning and directing thought and behavior require the working memory (WM) functions of prefrontal cortex. WM is compromised by stress, which activates phosphatidylinositol (PI)-mediated IP₃-PKC intracellular signaling. PKC overactivation impairs WM operations and in vitro studies indicate that IP₃ receptor (IP₃…

  4. Comparative research performance of top universities from the northeastern Brazil on three pharmacological disciplines as seen in scopus database

    Directory of Open Access Journals (Sweden)

    Jean P. Kamdem, PhD

    2017-12-01

    Objectives: Postgraduate programmes around the world are periodically subjected to research performance evaluation through bibliometric indicators. In this research, we characterized and compared the research performance of 15 universities from Northeastern Brazil, of which 13 were among the top universities of Latin America. Methods: Specifically, the total documents, citations and h-index of each university were retrieved from the Elsevier Scopus database and were analysed both for historical scientific achievement and across the past 6 years (2010–2015). Using these bibliometric indicators, we also investigated the performance of programmes at these universities that have their papers indexed in the Scopus database under the category of "Pharmacology, Toxicology and Pharmaceutics" for the same period. Results: We found that the Federal University of Pernambuco (UFPE) and the Federal University of Ceará (UFC) were the most productive institutions, producing 17847 and 15048 documents, respectively. The number of papers published by each of these universities in the past six years represented more than 50% of their entire productivity. With regard to their scientific output in "Pharmacology, Toxicology and Pharmaceutics", UFC showed the highest number of published documents, followed by UFPE and the Federal University of Paraíba (UFPB). UFC had the highest h-index (with and without self-citations) and number of citations, and shared its most cited papers with foreign institutions from the USA and Germany. However, papers from UFC were published in journals with lower impact factors (2.322). Conclusions: The present study shows where each of these universities stands and can be helpful in identifying potential collaborators in these areas of knowledge. Keywords: Citations, CNPq, h-index, Northeastern Brazil, UFC
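
    Since the comparison above leans on the h-index, a short reference implementation may help: an institution (or author) has index h when h of its papers have at least h citations each. The citation counts below are made up for illustration.

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# -> 3: at least 3 papers have >= 3 citations, but not 4 papers with >= 4.
print(h_index([25, 8, 5, 3, 3, 1]))
```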

  5. How Often Is p_rep Close to the True Replication Probability?

    Science.gov (United States)

    Trafimow, David; MacDonald, Justin A.; Rice, Stephen; Clason, Dennis L.

    2010-01-01

    Largely due to dissatisfaction with the standard null hypothesis significance testing procedure, researchers have begun to consider alternatives. For example, Killeen (2005a) has argued that researchers should calculate p_rep, which is purported to indicate the probability that, if the experiment in question were replicated, the obtained…
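
    For context, the normal-theory form of Killeen's replication probability and a commonly cited closed-form approximation (stated here as background, not taken from this article) relate p_rep to the obtained p value:

```latex
% p is the obtained one-tailed p value, \Phi the standard normal CDF,
% and z_{1-p} = \Phi^{-1}(1-p) the corresponding z score.
p_{\mathrm{rep}} \;=\; \Phi\!\left(\frac{z_{1-p}}{\sqrt{2}}\right)
\;\approx\; \left[\,1 + \left(\tfrac{p}{1-p}\right)^{2/3}\right]^{-1}
```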

  6. Killeen's (2005) "p[subscript rep]" Coefficient: Logical and Mathematical Problems

    Science.gov (United States)

    Maraun, Michael; Gabriel, Stephanie

    2010-01-01

    In his article, "An Alternative to Null-Hypothesis Significance Tests," Killeen (2005) urged the discipline to abandon the practice of "p[subscript obs]"-based null hypothesis testing and to quantify the signal-to-noise characteristics of experimental outcomes with replication probabilities. He described the coefficient that he…

  7. 31 CFR 344.8 - What other provisions apply to subscriptions for Demand Deposit securities?

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false What other provisions apply to subscriptions for Demand Deposit securities? 344.8 Section 344.8 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC...

  8. Non-Exercise Estimation of VO₂max Using the International Physical Activity Questionnaire

    Science.gov (United States)

    Schembre, Susan M.; Riebe, Deborah A.

    2011-01-01

    Non-exercise equations developed from self-reported physical activity can estimate maximal oxygen uptake (VO₂max) as well as sub-maximal exercise testing. The International Physical Activity Questionnaire is the most widely used and validated self-report measure of physical activity. This study aimed to develop and test a VO₂…

  9. Rethinking the Subscription Paradigm for Journals: Using Interlibrary Loan in Collection Development for Serials

    Science.gov (United States)

    Barton, Gail Perkins; Relyea, George E.; Knowlton, Steven A.

    2018-01-01

    Many librarians evaluate local Interlibrary Loan (ILL) statistics as part of collection development decisions concerning new subscriptions. In this study, the authors examine whether the number of ILL article requests received in one academic year can predict the use of those same journal titles once they are added as library resources. There is…

  10. The Essence Of Government Shares Subscription A Review The Implementation Of State-Owned Enterprises

    Directory of Open Access Journals (Sweden)

    Urbanisasi

    2015-08-01

    The purpose of this study was to determine and explain the mechanism and implementation of government share subscription in state-owned enterprises (SOEs), the legal standing of government share subscription carried out with separated state budget funds, whether such subscription gives rise to a state loss and what its legal implications are, and the legal accountability for losses arising out of SOE share subscription. The authors use normative legal research, and the data obtained are analysed using a qualitative normative method with inductive logic. The results indicate that state share subscription in the establishment of an SOE or limited company, with funds derived from the state budget, constitutes separated state assets. Thus the government no longer exercises public authority there; it acts in the field of civil law as a business entity. A clear separation between the state's role as a business actor and its role as the organizer of government carries consequences: with this separation there is clarity about the concept of state financial losses. An SOE, as a form of business entity that aims to make a profit, is a separate legal entity with separate responsibilities, even though it is formed with capital originating from state finances, and a loss on a single transaction, or within the legal entity, cannot be categorized as a state financial loss because the state functions here as a private legal entity.

  11. 11 CFR 100.52 - Gift, subscription, loan, advance or deposit of money.

    Science.gov (United States)

    2010-01-01

    ... money. 100.52 Section 100.52 Federal Elections FEDERAL ELECTION COMMISSION GENERAL SCOPE AND DEFINITIONS..., advance or deposit of money. (a) A gift, subscription, loan (except for a loan made in accordance with 11 CFR 100.72 and 100.73), advance, or deposit of money or anything of value made by any person for the...

  12. Data-Driven Transition: Joint Reporting of Subscription Expenditure and Publication Costs

    Directory of Open Access Journals (Sweden)

    Irene Barbers

    2018-04-01

    The transition process from the subscription model to the open access model in the world of scholarly publishing brings a variety of challenges to libraries. Within this evolving landscape, the present article focuses on budget control for both subscription and publication expenditure, with the opportunity to enable the shift from one to the other. To reach informed decisions with a solid base of data to be used in negotiations with publishers, the diverse existing systems for managing publication costs and for managing journal subscriptions have to be adapted to allow comprehensive reporting on publication and subscription expenditure. In the case presented here, two separate systems are described and the establishment of joint reporting covering both is introduced. Some results of the joint reporting are presented as an example of how such comprehensive monitoring can support management decisions and negotiations. On a larger scale, the establishment of the National Open Access Monitor in Germany is introduced, bringing together a diverse range of data from several existing systems, including, among others, holdings information, usage data, and data on publication fees. This system will enable libraries to access all relevant data through a single user interface.

  13. Statistical analyses of variability/reproducibility of environmentally assisted cyclic crack growth rate data utilizing JAERI Material Performance Database (JMPD)

    International Nuclear Information System (INIS)

    Tsuji, Hirokazu; Yokoyama, Norio; Nakajima, Hajime; Kondo, Tatsuo

    1993-05-01

    Statistical analyses were conducted using the cyclic crack growth rate data for pressure vessel steels stored in the JAERI Material Performance Database (JMPD), and comparisons were made of the variability and/or reproducibility of data obtained by ΔK-increasing and by ΔK-constant type tests. Based on the results of the statistical analyses, it was concluded that ΔK-constant type tests are generally superior to the commonly used ΔK-increasing type from the viewpoint of variability and/or reproducibility of the data. This tendency was more pronounced in tests conducted in simulated LWR primary coolants than in those in air. (author)
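
    For readers unfamiliar with the quantities being compared: cyclic crack growth rate data of this kind are conventionally described by the Paris-type power law below (a standard model, not necessarily the exact correlation used in the JAERI report), where ΔK is the stress intensity factor range that is either ramped (ΔK-increasing) or held fixed (ΔK-constant) during the test.

```latex
% Paris-type crack growth law: da/dN is the crack growth per load cycle,
% \Delta K the stress intensity factor range, C and m fitted material constants.
\frac{da}{dN} = C\,(\Delta K)^{m}
```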

  14. Olfactory Bulb α₂-Adrenoceptor Activation Promotes Rat Pup Odor-Preference Learning via a cAMP-Independent Mechanism

    Science.gov (United States)

    Shakhawat, Amin MD.; Harley, Carolyn W.; Yuan, Qi

    2012-01-01

    In this study, three lines of evidence suggest a role for α₂-adrenoreceptors in rat pup odor-preference learning: olfactory bulb infusions of the α₂-antagonist, yohimbine, prevent learning; the α₂-agonist, clonidine, paired with odor, induces learning; and subthreshold clonidine paired with…

  15. Downregulation of GABA-A Receptor Protein Subunits α6, β2, δ, ε, γ2, θ, and ρ2 in Superior Frontal Cortex of Subjects with Autism

    Science.gov (United States)

    Fatemi, S. Hossein; Reutiman, Teri J.; Folsom, Timothy D.; Rustan, Oyvind G.; Rooney, Robert J.; Thuras, Paul D.

    2014-01-01

    We measured protein and mRNA levels for nine gamma-aminobutyric acid A (GABA-A) receptor subunits in three brain regions (cerebellum, superior frontal cortex, and parietal cortex) in subjects with autism versus matched controls. We observed changes in mRNA for a number of GABA-A and GABA-B subunits and overall…

  16. mRNA and Protein Levels for GABA-A α4, α5, β1 and GABA-B R1 Receptors are Altered in Brains from Subjects with Autism

    Science.gov (United States)

    Fatemi, S. Hossein; Reutiman, Teri J.; Folsom, Timothy D.; Rooney, Robert J.; Patel, Diven H.; Thuras, Paul D.

    2010-01-01

    We have shown altered expression of gamma-aminobutyric acid A (GABA-A) and gamma-aminobutyric acid B (GABA-B) receptors in the brains of subjects with autism. In the current study, we sought to verify our western blotting data for GABBR1 via qRT-PCR and to expand our previous work to measure mRNA and protein levels of 3…

  17. The cost and performance of utility commercial lighting programs. A report from the Database on Energy Efficiency Programs (DEEP) project

    Energy Technology Data Exchange (ETDEWEB)

    Eto, J.; Vine, E.; Shown, L.; Sonnenblick, R.; Payne, C. [Lawrence Berkeley Lab., CA (United States). Energy and Environment Div.

    1994-05-01

    The objective of the Database on Energy Efficiency Programs (DEEP) is to document the measured cost and performance of utility-sponsored, energy-efficiency, demand-side management (DSM) programs. Consistent documentation of DSM programs is a challenging goal because of problems with data consistency, evaluation methodologies, and data reporting formats that continue to limit the usefulness and comparability of individual program results. This first DEEP report investigates the results of 20 recent commercial lighting DSM programs. The report, unlike previous reports of its kind, compares the DSM definitions and methodologies that each utility uses to compute costs and energy savings and then makes adjustments to standardize reported program results. All 20 programs were judged cost-effective when compared to avoided costs in their local areas. At an average cost of 3.9¢/kWh, however, utility-sponsored energy efficiency programs are not "too cheap to meter." While it is generally agreed that utilities must take active measures to minimize the costs and rate impacts of DSM programs, the authors believe that these activities will be facilitated by industry adoption of standard definitions and reporting formats, so that the best program designs can be readily identified and adopted.

  18. The influence of the negative-positive ratio and screening database size on the performance of machine learning-based virtual screening.

    Science.gov (United States)

    Kurczab, Rafał; Bojarski, Andrzej J

    2017-01-01

    The machine learning-based virtual screening of molecular databases is a commonly used approach to identify hits. However, many aspects associated with training predictive models can influence the final performance and, consequently, the number of hits found. Thus, we performed a systematic study of the simultaneous influence of the proportion of negatives to positives in the testing set, the size of screening databases and the type of molecular representations on the effectiveness of classification. The results obtained for eight protein targets, five machine learning algorithms (SMO, Naïve Bayes, Ibk, J48 and Random Forest), two types of molecular fingerprints (MACCS and CDK FP) and eight screening databases with different numbers of molecules confirmed our previous findings that increases in the ratio of negative to positive training instances greatly influenced most of the investigated parameters of the ML methods in simulated virtual screening experiments. However, the performance of screening was shown to also be highly dependent on the molecular library dimension. Generally, with the increasing size of the screened database, the optimal training ratio also increased, and this ratio can be rationalized using the proposed cost-effectiveness threshold approach. To increase the performance of machine learning-based virtual screening, the training set should be constructed in a way that considers the size of the screening database.
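
    A hedged sketch of the kind of experiment described above is given below (toy data, not the authors' benchmark): binary "fingerprints" are generated, the negative:positive ratio of the training set is varied, and the effect on screening precision and recall is observed with a Random Forest classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

def make_set(n_pos, n_neg, n_bits=64):
    """Toy binary fingerprints: actives switch bits on slightly more often."""
    pos = (rng.random((n_pos, n_bits)) < 0.30).astype(int)
    neg = (rng.random((n_neg, n_bits)) < 0.20).astype(int)
    X = np.vstack([pos, neg])
    y = np.array([1] * n_pos + [0] * n_neg)
    return X, y

X_test, y_test = make_set(50, 5000)   # "screening database": mostly inactives

for ratio in (1, 5, 20):              # negatives per positive in training
    X_tr, y_tr = make_set(100, 100 * ratio)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    pred = clf.predict(X_test)
    print(f"neg:pos = {ratio:>2}:1  "
          f"precision = {precision_score(y_test, pred, zero_division=0):.2f}  "
          f"recall = {recall_score(y_test, pred):.2f}")
```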

  19. Dissonance and Neutralization of Subscription Streaming Era Digital Music Piracy : An Initial Exploration

    OpenAIRE

    Riekkinen, Janne

    2016-01-01

    Both legal and illegal forms of digital music consumption continue to evolve with wider adoption of subscription streaming services. With this paper, we aim to extend theory on digital music piracy by showing that the rising controversy and diminishing acceptance of illegal forms of consumption call for new theoretical components and interactions. We introduce a model that integrates insights from neutralization and cognitive dissonance theories. As an initial empirical test of th...

  20. Piracy versus Netflix : Subscription Video on Demand Dissatisfaction as an Antecedent of Piracy

    OpenAIRE

    Riekkinen, Janne

    2018-01-01

    Drawing from cognitive dissonance and neutralization theories, this study seeks to improve the understanding on consumer decision-making between the current legal and illegal video consumption alternatives. We develop and test a research model featuring Subscription Video on Demand (SVOD) satisfaction and various dimensions of SVOD quality as antecedents of video piracy neutralizations and attitudes. Based on results from an online survey among Finnish SVOD users, SVOD satisfaction is primari...

  1. Automated Oracle database testing

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Ensuring database stability and steady performance in the modern world of agile computing is a major challenge. Changes at any level of the computing infrastructure, such as OS parameters and packages, kernel versions, database parameters and patches, or even schema changes, can potentially harm production services. This presentation shows how automatic and regular testing of Oracle databases can be achieved in such an agile environment.
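
    The talk's implementation is not spelled out in the abstract, so the snippet below is only a generic sketch of automated database regression testing (SQLite stands in for Oracle; the query suite and thresholds are hypothetical): a fixed set of checks is run before and after a change, and results and timings are compared.

```python
import sqlite3, time

CHECKS = {                       # hypothetical regression suite
    "row_count": "SELECT COUNT(*) FROM events",
    "latest":    "SELECT MAX(ts) FROM events",
}

def run_suite(db_path):
    """Run every check once and record its result and wall-clock time."""
    results = {}
    with sqlite3.connect(db_path) as con:
        for name, sql in CHECKS.items():
            t0 = time.perf_counter()
            value = con.execute(sql).fetchone()[0]
            results[name] = (value, time.perf_counter() - t0)
    return results

def compare(before, after, slowdown_limit=2.0):
    for name in CHECKS:
        (v0, t0), (v1, t1) = before[name], after[name]
        if v0 != v1:
            print(f"[FAIL] {name}: result changed {v0!r} -> {v1!r}")
        elif t0 > 0 and t1 / t0 > slowdown_limit:
            print(f"[WARN] {name}: {t1 / t0:.1f}x slower after the change")
        else:
            print(f"[ OK ] {name}")

# Usage: snapshot before applying a parameter/schema change, then re-run:
# before = run_suite("prod_copy.db"); ...apply change...; compare(before, run_suite("prod_copy.db"))
```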

  2. Using a Spreadsheet to Solve the Schrödinger Equations for the Energies of the Ground Electronic State and the Two Lowest Excited States of H₂

    Science.gov (United States)

    Ge, Yingbin; Rittenhouse, Robert C.; Buchanan, Jacob C.; Livingston, Benjamin

    2014-01-01

    We have designed an exercise suitable for a lab or project in an undergraduate physical chemistry course that creates a Microsoft Excel spreadsheet to calculate the energy of the S₀ ground electronic state and the S₁ and T₁ excited states of H₂. The spreadsheet calculations circumvent the…

  3. 47 CFR 73.644 - Subscription TV transmission systems.

    Science.gov (United States)

    2010-10-01

    ... station must perform such tests and measurements to determine that the transmitted encoded signal conforms... the system being used. A copy of the measurement data is to be maintained in the station files and... being used of both the aural and visual baseband signals and the transmitted radiofrequency signals, and...

  4. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Selection of thermodynamic data of selenium

    International Nuclear Information System (INIS)

    Doi, Reisuke; Kitamura, Akira; Yui, Mikazu

    2010-02-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level and TRU radioactive wastes, the selection of thermodynamic data on the inorganic compounds and complexes of selenium was carried out. The selection was based on the thermodynamic database of selenium published by the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). Remarks on the OECD/NEA database noted by the authors are recorded in this report, and the thermodynamic data were then reviewed after surveying the latest literature. Some thermodynamic values for iron selenides were not selected by the OECD/NEA due to low reliability, but because they are important for the performance assessment of geological disposal of radioactive wastes, they were selected here as tentative values, with their reliability and the need for the values to be determined specified. (author)

  5. Methylphenidate and Atomoxetine Enhance Prefrontal Function through α₂-Adrenergic and Dopamine D₁ Receptors

    Science.gov (United States)

    Gamo, Nao J.; Wang, Min; Arnsten, Amy F. T.

    2010-01-01

    Objective: This study examined the effects of the attention-deficit/hyperactivity disorder treatments, methylphenidate (MPH) and atomoxetine (ATM), on prefrontal cortex (PFC) function in monkeys and explored the receptor mechanisms underlying enhancement of PFC function at the behavioral and cellular levels. Method: Monkeys performed a working…

  6. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in the field and provides a balanced analysis of the state of the art. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer-aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  7. Sorption, Diffusion and Solubility Databases for Performance Assessment; Base de Datos de Sorción, Difusión y Solubilidad para la Evaluación del Comportamiento

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Gutierrez, M [Ciemat, Madrid (Spain)

    2000-07-01

    This report presents deterministic and probabilistic databases for application in the performance assessment of a high-level radioactive waste disposal. The work includes a theoretical description of the sorption, diffusion and solubility phenomena of radionuclides in geological media. The report presents and compares the databases of different nuclear waste management agencies, describes the materials in the Spanish reference system, and gives the results for sorption, diffusion and solubility in this system, with both the deterministic and the probabilistic approach. The probabilistic approach is presented in the form of probability density functions (pdf). (Author) 52 refs.
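
    As a hedged illustration of how such probability density functions are typically consumed in a probabilistic performance assessment (the distribution and parameter values below are invented, not values from the Spanish reference system), one draws parameter samples, e.g. a sorption distribution coefficient Kd, and propagates them through the transport calculation:

```python
import numpy as np

rng = np.random.default_rng(42)
n_realizations = 10_000

# Hypothetical pdf for a sorption distribution coefficient Kd (m3/kg):
# lognormal with a geometric mean of 0.1 and a geometric std. dev. of ~3.
kd = rng.lognormal(mean=np.log(0.1), sigma=np.log(3.0), size=n_realizations)

# Propagate each sample through a (placeholder) retardation factor used in
# 1-D transport: R = 1 + rho_b * Kd / porosity.
rho_b, porosity = 2000.0, 0.3    # bulk density kg/m3 and porosity (assumed)
retardation = 1.0 + rho_b * kd / porosity

print(f"Kd 5th-95th percentile (m3/kg): {np.percentile(kd, [5, 95])}")
print(f"median retardation factor: {np.median(retardation):.0f}")
```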

  8. Biofuel Database

    Science.gov (United States)

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  9. Community Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This excel spreadsheet is the result of merging at the port level of several of the in-house fisheries databases in combination with other demographic databases such...

  10. Stackfile Database

    Science.gov (United States)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored, and retrieval is efficient across either the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., the Gravity Recovery And Climate Experiment, GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  11. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Selection of thermodynamic data of cobalt and nickel

    International Nuclear Information System (INIS)

    Kitamura, Akira; Yui, Mikazu; Kirishima, Akira; Saito, Takumi; Shibutani, Sanae; Tochiyama, Osamu

    2009-11-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level and TRU wastes, the selection of thermodynamic data on the inorganic compounds and complexes of cobalt and nickel has been carried out. For cobalt, an extensive literature survey has been performed and all the obtained literature has been carefully reviewed to select the thermodynamic data. Selection of thermodynamic data for nickel has been based on a thermodynamic database published by the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA), which has been carefully reviewed by the authors; thermodynamic data have then been selected after surveying the latest literature. Based on the similarity of chemical properties between cobalt and nickel, complementary thermodynamic data of nickel and cobalt species expected under the geological disposal conditions have been selected to complete the thermodynamic data set for the performance assessment of geological disposal of radioactive wastes. (author)

  12. Promoting public transport as a subscription service: Effects of a free month travel card

    DEFF Research Database (Denmark)

    Thøgersen, John

    2009-01-01

    Newspapers, book clubs, telephone services and many other subscription services are often marketed to new customers by means of a free or substantially discounted trial period. This article evaluates this method as a means to promote commuting by public transport in a field experiment and based... that had an effect was the free month travel card, which led to a significant increase in commuting by public transport. As expected, the effect was mediated through a change in behavioural intentions rather than a change in perceived constraints. As expected, the effect became weaker when the promotion...

  13. Magazine "Companion Websites" and the Demand for Newsstand Sales and Subscriptions

    DEFF Research Database (Denmark)

    Kaiser, Ulrich; Kongsted, H.C.

    2012-01-01

    analysis finds some support for the widespread belief that the Internet cannibalizes print media. On average, a 1% increase in companion website traffic is associated with a weakly significant decrease in total print circulation by 0.15%. This association is mainly driven by a statistically significant...... and negative mapping between website visits and kiosk sales, although they do not find any statistically significant relationship between website visits and subscriptions. The latter finding is reassuring for publishers because advertisers value a large subscriber base. Moreover, the authors show...

  14. The Role of Subscription-Based Patrol and Restitution in the Future of Liberty

    Directory of Open Access Journals (Sweden)

    Gil Guillory

    2009-02-01

    Market anarchists are often keen to know how we might rid ourselves of the twin evils institutionalized in the state: taxation and monopoly. A possible future history for North America is suggested, focusing upon the implications of the establishment of a subscription-based patrol and restitution business sector. We favor Rothbard over Higgs regarding crises and liberty. We favor Barnett over Rothbard regarding vertical integration of security. We examine derived demand for adjudication, mediation and related goods, and we advance the thesis that private adjudication will tend toward libertarianly just decisions. We show how firms will actively build civil society, strengthening and coordinating Nisbettian intermediating institutions.

  15. Drug interaction databases in medical literature

    DEFF Research Database (Denmark)

    Kongsholm, Gertrud Gansmo; Nielsen, Anna Katrine Toft; Damkier, Per

    2015-01-01

    PURPOSE: It is well documented that drug-drug interaction databases (DIDs) differ substantially with respect to their classification of drug-drug interactions (DDIs). The aim of this study was to examine the online available transparency of ownership, funding, information, classifications, staff training, and underlying documentation of open access DIDs and the three most commonly used subscription DIDs in the medical literature. The following parameters were assessed for each of the databases: ownership, classification of interactions, primary information sources, and staff qualification. We compared the overall proportion of yes/no answers from open access... The online available transparency of ownership, funding, information, classifications, staff training, and underlying documentation varies substantially among the various DIDs. Open access DIDs had a statistically lower score on the parameters assessed.
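
    The comparison of yes/no transparency answers between open access and subscription DIDs can be illustrated with a small, hypothetical contingency table (the counts below are invented, not the study's data):

```python
from scipy.stats import fisher_exact

# Hypothetical counts of "yes" vs "no" transparency answers.
#                yes   no
open_access  = [  9,  15]
subscription = [ 14,   4]

odds_ratio, p_value = fisher_exact([open_access, subscription])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```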

  16. Supply Chain Initiatives Database

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-11-01

    The Supply Chain Initiatives Database (SCID) presents innovative approaches to engaging industrial suppliers in efforts to save energy, increase productivity and improve environmental performance. This comprehensive and freely-accessible database was developed by the Institute for Industrial Productivity (IIP). IIP acknowledges Ecofys for their valuable contributions. The database contains case studies searchable according to the types of activities buyers are undertaking to motivate suppliers, target sector, organization leading the initiative, and program or partnership linkages.

  17. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Refinement of thermodynamic data for tetravalent thorium, uranium, neptunium and plutonium

    International Nuclear Information System (INIS)

    Fujiwara, Kenso; Kitamura, Akira; Yui, Mikazu

    2010-03-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level and TRU radioactive wastes, the refinement of the thermodynamic data for the inorganic compounds and complexes of thorium(IV), uranium(IV), neptunium(IV) and plutonium(IV) was carried out. Refinement of thermodynamic data for these elements was performed on the basis of the thermodynamic database for actinides published by the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). Additionally, the latest data published after the OECD/NEA thermodynamic data were re-evaluated to determine whether they should be included in the JAEA-TDB. (author)

  18. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Refinement of thermodynamic data for trivalent actinoids and samarium

    International Nuclear Information System (INIS)

    Kitamura, Akira; Fujiwara, Kenso; Yui, Mikazu

    2010-01-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level radioactive and TRU wastes, the refinement of the thermodynamic data for the inorganic compounds and complexes of trivalent actinoids (actinium(III), plutonium(III), americium(III) and curium(III)) and samarium(III) was carried out. Refinement of thermodynamic data for these elements was based on the thermodynamic database for americium published by the Nuclear Energy Agency in the Organisation for Economic Co-operation and Development (OECD/NEA). Based on the similarity of chemical properties among trivalent actinoids and samarium, complementary thermodynamic data for their species expected under the geological disposal conditions were selected to complete the thermodynamic data set for the performance assessment of geological disposal of radioactive wastes. (author)

  19. Role of library's subscription licenses in promoting open access to scientific research

    KAUST Repository

    Buck, Stephen

    2018-04-30

    This presentation, based on KAUST's experience to date, will attempt to explain the different ways of bringing open access models to scientific publishers' licenses. Our dual approach with offset pricing is to redirect subscription money to publishing money and to embed green open access deposition terms, in understandable language, in our license agreements. Resolving the inherent complexities in open access publishing, repository deposition and offsetting models will save libraries money and also time wasted on tedious and unnecessary administration work. Researchers will also save time through overall clarity and transparency. This will enable trust and, where mistakes are made, and there inevitably will be with untried models, we can learn from them and build better, more robust services with automatic deposition of our articles into our repository fed by the publishers themselves. The plan is to cover all publishers with OA license terms for KAUST authors' rights while continuing our subscriptions to them. Marketing campaigns and awareness sessions are planned, in addition to establishing LibGuides to help researchers and to manage offset pricing models.

  1. Operational Experience of an Open-Access, Subscription-Based Mass Spectrometry and Proteomics Facility

    Science.gov (United States)

    Williamson, Nicholas A.

    2018-03-01

    This paper discusses the successful adoption of a subscription-based, open-access model of service delivery for a mass spectrometry and proteomics facility. In 2009, the Mass Spectrometry and Proteomics Facility at the University of Melbourne (Australia) moved away from the standard fee-for-service model of service provision. Instead, the facility adopted a subscription- or membership-based, open-access model of service delivery. For a low fixed yearly cost, users could directly operate the instrumentation but, more importantly, there were no limits on usage other than the necessity to share available instrument time with all other users. All necessary training from platform staff and many of the base reagents were also provided as part of the membership cost. These changes proved to be very successful in terms of financial outcomes for the facility, instrument access and usage, and overall research output. This article describes the systems put in place as well as the overall successes and challenges associated with operating a mass spectrometry/proteomics core in this manner.

  2. Evaluated and estimated solubility of some elements for performance assessment of geological disposal of high-level radioactive waste using updated version of thermodynamic database

    International Nuclear Information System (INIS)

    Kitamura, Akira; Doi, Reisuke; Yoshida, Yasushi

    2011-01-01

    Japan Atomic Energy Agency (JAEA) established a thermodynamic database (JAEA-TDB) for the performance assessment of geological disposal of high-level radioactive waste (HLW) and TRU waste. Twenty-five elements important for the performance assessment of geological disposal were selected for the database. The JAEA-TDB enhances the reliability of solubility evaluation and estimation through selection of the latest and most reliable thermodynamic data available at present. We evaluated and estimated the solubility of the 25 elements in the simulated porewaters established in the 'Second Progress Report for Safety Assessment of Geological Disposal of HLW in Japan' using the JAEA-TDB and compared the results with those obtained using the previous thermodynamic database (JNC-TDB). Most of the evaluated and estimated solubility values did not change drastically, but the solubility and the speciation of the dominant aqueous species for some elements differed between the JAEA-TDB and the JNC-TDB. We discuss how to provide reliable solubility values for the performance assessment. (author)

  3. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on an examination of those accident databases conducted through personal contact with the federal staff responsible for administering the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions posed to each of the database program managers. Additionally, how each agency uses the accident data was of major interest.

  4. The L-Type Voltage-Gated Calcium Channel CaV1.2 Mediates Fear Extinction and Modulates Synaptic Tone in the Lateral Amygdala

    Science.gov (United States)

    Temme, Stephanie J.; Murphy, Geoffrey G.

    2017-01-01

    L-type voltage-gated calcium channels (LVGCCs) have been implicated in both the formation and the reduction of fear through Pavlovian fear conditioning and extinction. Despite the implication of LVGCCs in fear learning and extinction, studies of the individual LVGCC subtypes, CaV1.2 and CaV1.3, using transgenic mice have…

  5. Online Database Allows for Quick and Easy Monitoring and Reporting of Supplementary Feeding Program Performance: An Analysis of World Vision CMAM Programs (2006-2013)

    International Nuclear Information System (INIS)

    Emary, Colleen; Aidam, Bridget; Roberton, Tim

    2014-01-01

    Background: Despite the widespread implementation of interventions to address moderate acute malnutrition (MAM), the lack of robust monitoring systems has hindered evaluation of the effectiveness of approaches to prevent and treat MAM. Since 2006, World Vision (WV) has provided supplementary feeding to 280,518 children 6-59 months of age (U5) and 105,949 pregnant and lactating women (PLW) as part of Community-based Management of Acute Malnutrition (CMAM) programming. The Excel-based system initially used for monitoring individual site programs faced numerous challenges: it was time consuming, prone to human error, and lost data as a result of staff turnover, and hence its use to inform program performance was limited. In 2010, World Vision International (WVI)'s Nutrition Centre of Expertise (NCOE) established an online database to overcome these limitations. The aim of the database was to improve monitoring and reporting of WV's CMAM programs. As of December 2013, the database has been rolled out in 14 countries: Burundi, Chad, DRC, Ethiopia, Kenya, Mali, Mauritania, Niger, Sudan, Pakistan, South Sudan, Somalia, Zimbabwe and Zambia. Methods: The database includes data on admissions (mid-upper arm circumference, weight for height, oedema, referral) and discharge outcomes (recovered, died, defaulted, non-recovered, referral) for Supplementary Feeding Programs (SFPs) for children U5 as well as PLWs. A quantitative analysis of the available data sets was conducted to identify issues with data quality and to draw findings from the data itself. Variations in program performance compared with Sphere standards were determined by country and aggregated over the 14 countries. In addition, time-trend analyses were conducted to determine significant differences and seasonality effects. Results: Most of the data related to program admissions from 2010 to July 2013, though some retrospective program data were available from 2006 to 2009. The countries with the largest number
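
    A minimal sketch of the kind of performance monitoring described above (the outcome counts and Sphere-style thresholds below are illustrative assumptions, not WV data) computes discharge-outcome rates for a site and flags values falling outside the targets:

```python
# Hypothetical discharge outcomes for one SFP site.
outcomes = {"recovered": 180, "died": 2, "defaulted": 25, "non_recovered": 13}
total_discharged = sum(outcomes.values())

rates = {k: v / total_discharged for k, v in outcomes.items()}

# Illustrative Sphere-style minimum standards for supplementary feeding,
# assumed here for the sketch: >75% recovered, <15% defaulted, <3% died.
flags = []
if rates["recovered"] < 0.75:
    flags.append("recovery below target")
if rates["defaulted"] > 0.15:
    flags.append("defaulting above target")
if rates["died"] > 0.03:
    flags.append("death rate above target")

print({k: f"{v:.1%}" for k, v in rates.items()})
print("flags:", flags or "none")
```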

  6. Peer review quality and transparency of the peer-review process in open access and subscription journals

    NARCIS (Netherlands)

    Wicherts, J.M.

    2016-01-01

    Background Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure quality of peer-review in academic journals. I propose that transparency of the

  7. Nanosized TiO₂ for Photocatalytic Water Splitting Studied by Oxygen Sensor and Data Logger

    Science.gov (United States)

    Zhang, Ruinan; Liu, Song; Yuan, Hongyan; Xiao, Dan; Choi, Martin M. F.

    2012-01-01

    Photocatalytic water splitting by semiconductor photocatalysts has attracted considerable attention in the past few decades. In this experiment, nanosized titanium dioxide (nano-TiO₂) particles are used to photocatalytically split water, which is then monitored by an oxygen sensor. Sacrificial reagents such as organics (EDTA) and metal…

  8. Effects of GABA-A Modulators on the Repeated Acquisition of Response Sequences in Squirrel Monkeys

    Science.gov (United States)

    Campbell, Una C.; Winsauer, Peter J.; Stevenson, Michael W.; Moerschbaecher, Joseph M.

    2004-01-01

    The present study investigated the effects of positive and negative GABA-A modulators under three different baselines of repeated acquisition in squirrel monkeys, in which the monkeys acquired a three-response sequence on three keys under a second-order fixed-ratio (FR) schedule of food reinforcement. In two of these baselines, the…

  9. The A1c Blood Test: An Illustration of Principles from General and Organic Chemistry

    Science.gov (United States)

    Kerber, Robert C.

    2007-01-01

    The glycated hemoglobin blood test, usually designated as the A1c test, is a key measure of the effectiveness of glucose control in diabetics. The chemistry of glucose in the bloodstream, which underlies the test and its impact, provides an illustration of the importance of chemical equilibrium and kinetics to a major health problem.…
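
    For a concrete sense of how the test is read in practice, the widely used ADAG regression (quoted here as an external reference point, not derived in the article) maps an A1c percentage to an estimated average plasma glucose:

```latex
% ADAG study regression: estimated average glucose (mg/dL) from A1c (%).
\mathrm{eAG}\;[\text{mg/dL}] \;\approx\; 28.7 \times \mathrm{A1c}\;[\%] \;-\; 46.7
```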

  10. Economics of Scholarly Publishing: Exploring the Causes of Subscription Price Variations of Scholarly Journals in Business Subject-Specific Areas

    Science.gov (United States)

    Liu, Lewis G.

    2011-01-01

    This empirical research investigates subscription price variations of scholarly journals in five business subject-specific areas using the semilogarithmic regression model. It has two main purposes. The first is to address the unsettled debate over whether or not and to what extent commercial publishers reap monopoly profits by overcharging…

  11. RDD Databases

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database was established to oversee documents issued in support of fishery research activities including experimental fishing permits (EFP), letters of...

  12. Snowstorm Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Snowstorm Database is a collection of over 500 snowstorms dating back to 1900 and updated operationally. Only storms having large areas of heavy snowfall (10-20...

  13. Dealer Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dealer reporting databases contain the primary data reported by federally permitted seafood dealers in the northeast. Electronic reporting was implemented May 1,...

  14. National database

    DEFF Research Database (Denmark)

    Kristensen, Helen Grundtvig; Stjernø, Henrik

    1995-01-01

    Article about the national database for nursing research established at the Danish Institute for Health and Nursing Research (Dansk Institut for Sundheds- og Sygeplejeforskning). The aim of the database is to gather knowledge about research and development activities within nursing.

  15. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. The Database on Demand (DBoD) empowers users to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, e.g. presently the open community version of MySQL and a single-instance Oracle database server. This article describes a technology approach to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.

  16. Experiment Databases

    Science.gov (United States)

    Vanschoren, Joaquin; Blockeel, Hendrik

    Next to running machine learning algorithms based on inductive queries, much can be learned by immediately querying the combined results of many prior studies. Indeed, all around the globe, thousands of machine learning experiments are being executed on a daily basis, generating a constant stream of empirical information on machine learning techniques. While the information contained in these experiments might have many uses beyond their original intent, results are typically described very concisely in papers and discarded afterwards. If we properly store and organize these results in central databases, they can be immediately reused for further analysis, thus boosting future research. In this chapter, we propose the use of experiment databases: databases designed to collect all the necessary details of these experiments, and to intelligently organize them in online repositories to enable fast and thorough analysis of a myriad of collected results. They constitute an additional, queriable source of empirical meta-data based on principled descriptions of algorithm executions, without reimplementing the algorithms in an inductive database. As such, they engender a very dynamic, collaborative approach to experimentation, in which experiments can be freely shared, linked together, and immediately reused by researchers all over the world. They can be set up for personal use, to share results within a lab or to create open, community-wide repositories. Here, we provide a high-level overview of their design, and use an existing experiment database to answer various interesting research questions about machine learning algorithms and to verify a number of recent studies.
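
    A minimal sketch of the experiment-database idea, assuming a simple SQLite schema with one row per algorithm run; the table and column names are illustrative, not the schema proposed by the authors.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE runs (
            algorithm TEXT, dataset TEXT, param_setting TEXT, accuracy REAL)""")
        conn.executemany(
            "INSERT INTO runs VALUES (?, ?, ?, ?)",
            [("decision_tree", "iris", "max_depth=3", 0.94),
             ("decision_tree", "iris", "max_depth=10", 0.92),
             ("svm", "iris", "C=1.0", 0.96)],
        )

        # Query the shared results: mean accuracy per algorithm on one dataset.
        query = """SELECT algorithm, AVG(accuracy) AS mean_acc
                   FROM runs WHERE dataset = 'iris'
                   GROUP BY algorithm ORDER BY mean_acc DESC"""
        for algorithm, mean_acc in conn.execute(query):
            print(algorithm, round(mean_acc, 3))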

  17. Global Ocean Surface Water Partial Pressure of CO2 Database: Measurements Performed During 1968-2007 (Version 2007)

    Energy Technology Data Exchange (ETDEWEB)

    Kozyr, Alex [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Carbon Dioxide Information Analysis Center

    2008-09-30

    More than 4.1 million measurements of surface water partial pressure of CO2 obtained over the global oceans during 1968-2007 are listed in the Lamont-Doherty Earth Observatory (LDEO) database, which includes open ocean and coastal water measurements. The data assembled include only those measured by equilibrator-CO2 analyzer systems and have been quality-controlled based on the stability of the system performance, the reliability of calibrations for CO2 analysis, and the internal consistency of data. To allow re-examination of the data in the future, a number of measured parameters relevant to pCO2 measurements are listed. The overall uncertainty for the pCO2 values listed is estimated to be ± 2.5 µatm on the average. For simplicity and for ease of reference, this version is referred to as 2007, meaning that data collected through 31 December 2007 has been included. It is our intention to update this database annually. There are 37 new cruise/ship files in this update. In addition, some editing has been performed on existing files so this should be considered a V2007 file. Also we have added a column reporting the partial pressure of CO2 in seawater in units of Pascals. The data presented in this database include the analyses of partial pressure of CO2 (pCO2), sea surface temperature (SST), sea surface salinity (SSS), pressure of the equilibration, and barometric pressure in the outside air from the ship’s observation system. The global pCO2 data set is available free of charge as a numeric data package (NDP) from the Carbon Dioxide Information Analysis Center (CDIAC). The NDP consists of the oceanographic data files and this printed documentation, which describes the procedures and methods used to obtain the data.
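
    The added pressure column involves only a unit conversion: 1 atm = 101325 Pa, so 1 µatm = 0.101325 Pa. A minimal sketch, with an illustrative example value:

        PA_PER_UATM = 101325.0 / 1.0e6   # 1 atm = 101325 Pa, so 1 uatm = 0.101325 Pa

        def pco2_uatm_to_pa(pco2_uatm: float) -> float:
            """Convert surface-water pCO2 from microatmospheres to Pascals."""
            return pco2_uatm * PA_PER_UATM

        # A typical open-ocean value of ~370 uatm corresponds to ~37.5 Pa.
        print(pco2_uatm_to_pa(370.0))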

  18. Important announcement to INSPEC database users

    CERN Multimedia

    DSU Department

    2008-01-01

    The Library is in the process of transferring CERN’s subscription to the online INSPEC database to a new provider, which involves a new access platform. Users who have saved searches or alerts on the present system should record the details of their search strings as soon as possible whilst the old platform is still available, and manually move them to the new platform which will become available very soon. Access to the older platform will shortly be switched off, after which it will not be possible to access any saved information stored there. Access to the new platform will be available soon from the Library’s INSPEC page: http://library.cern.ch/information_resources/inspec.html

  19. Fire test database

    International Nuclear Information System (INIS)

    Lee, J.A.

    1989-01-01

    This paper describes a project recently completed for EPRI by Impell. The purpose of the project was to develop a reference database of fire tests performed on non-typical fire rated assemblies. The database is designed for use by utility fire protection engineers to locate test reports for power plant fire rated assemblies. As utilities prepare to respond to Information Notice 88-04, the database will identify utilities, vendors or manufacturers who have specific fire test data. The database contains fire test report summaries for 729 tested configurations. For each summary, a contact is identified from whom a copy of the complete fire test report can be obtained. Five types of configurations are included: doors, dampers, seals, wraps and walls. The database is computerized, with one version for IBM PCs and one for the Mac, each accessed through user-friendly software that allows records to be added, deleted, and browsed. There are five major database files, one for each of the five types of tested configurations. The contents of each provide significant information regarding the test method and the physical attributes of the tested configuration. 3 figs

  20. The Danish fetal medicine database

    DEFF Research Database (Denmark)

    Ekelund, Charlotte Kvist; Kopp, Tine Iskov; Tabor, Ann

    2016-01-01

    trimester ultrasound scan performed at all public hospitals in Denmark are registered in the database. Main variables/descriptive data: Data on maternal characteristics, ultrasonic, and biochemical variables are continuously sent from the fetal medicine units’Astraia databases to the central database via...... analyses are sent to the database. Conclusion: It has been possible to establish a fetal medicine database, which monitors first-trimester screening for chromosomal abnormalities and second-trimester screening for major fetal malformations with the input from already collected data. The database...

  1. A NOVEL APPROACH FOR PERFORMANCE ENHANCEMENT OF E-COMMERCE SOLUTIONS BY FRIENDS RECOMMENDATION SYSTEM AND NEO4J DATABASE

    OpenAIRE

    Shahina C P

    2016-01-01

    In the past, selling required a brick-and-mortar store and was limited to a local customer base. Today, e-commerce websites make it easy for customers around the world to shop in a virtual store. The performance of e-commerce websites depends not only on the quality and price of the product but also on customer-centric suggestions about products and services and on the retrieval speed of the websites. Customers are more likely to buy a product or brand suggested by friends or relatives than by viewing commend...

  2. A Case for Database Filesystems

    Energy Technology Data Exchange (ETDEWEB)

    Adams, P A; Hax, J C

    2009-05-13

    Data intensive science is offering new challenges and opportunities for Information Technology and traditional relational databases in particular. Database filesystems offer the potential to store Level Zero data and analyze Level 1 and Level 3 data within the same database system [2]. Scientific data is typically composed of both unstructured files and scalar data. Oracle SecureFiles is a new database filesystem feature in Oracle Database 11g that is specifically engineered to deliver high performance and scalability for storing unstructured or file data inside the Oracle database. SecureFiles presents the best of both the filesystem and the database worlds for unstructured content. Data stored inside SecureFiles can be queried or written at performance levels comparable to that of traditional filesystems while retaining the advantages of the Oracle database.
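
    As a rough sketch of the pattern described, assuming the python-oracledb driver and a reachable Oracle 11g-or-later instance, the snippet below creates a table whose BLOB column uses SecureFiles storage and stores a file's bytes next to scalar metadata. The table, column, file and connection names are placeholders.

        import oracledb

        # Placeholder credentials and DSN; a real, reachable instance is assumed.
        conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb1")
        cur = conn.cursor()

        # The LOB storage clause requests SecureFiles rather than BasicFiles storage.
        cur.execute("""
            CREATE TABLE documents (
                id  NUMBER PRIMARY KEY,
                doc BLOB
            ) LOB (doc) STORE AS SECUREFILE
        """)

        # Store a file's bytes inside the database alongside scalar metadata.
        with open("raw_event_file.dat", "rb") as fh:
            cur.execute("INSERT INTO documents (id, doc) VALUES (:1, :2)", [1, fh.read()])
        conn.commit()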

  3. Database Replication Prototype

    OpenAIRE

    Vandewall, R.

    2000-01-01

    This report describes the design of a Replication Framework that facilitates the implementation and comparison of database replication techniques. Furthermore, it discusses the implementation of a Database Replication Prototype and compares the performance measurements of two replication techniques based on the Atomic Broadcast communication primitive: pessimistic active replication and optimistic active replication. The main contributions of this report can be split into four parts....

  4. Database computing in HEP

    International Nuclear Information System (INIS)

    Day, C.T.; Loken, S.; MacFarlane, J.F.; May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Baden, A.

    1992-01-01

    The major SSC experiments are expected to produce up to 1 Petabyte of data per year each. Once the primary reconstruction is completed by farms of inexpensive processors, I/O becomes a major factor in further analysis of the data. We believe that the application of database techniques can significantly reduce the I/O performed in these analyses. We present examples of such I/O reductions in prototypes based on relational and object-oriented databases of CDF data samples

  5. Sorption databases for the cementitious near-field of a L/ILW repository for performance assessment

    International Nuclear Information System (INIS)

    Bradbury, M.H.; Sarott, F.A.

    1995-03-01

    Approximately 95% of the material in the L/ILW repository for short-lived low- and intermediate-level wastes consists of concrete; the remaining approx. 5% consists of steel (4%) and high molecular weight organic waste components (1%). Radionuclide sorption onto concrete represents one of the most important retardation mechanisms in the disposal caverns. This report compiles the sorption properties of hydrated cement, the most important sorbing material present in concrete, in the form of data sets for safety relevant nuclides under repository conditions; these data can then be used directly in performance assessment. Processes which affect sorption onto cement in the disposal caverns are documented in different data sets in this report. In this report, the distribution coefficients for radionuclides on cement are based to a large extent on values measured under repository-relevant conditions; this is true for cement without complexants in particular. (author) figs., tabs., refs

  6. Conceptual considerations for CBM databases

    Energy Technology Data Exchange (ETDEWEB)

    Akishina, E. P.; Aleksandrov, E. I.; Aleksandrov, I. N.; Filozova, I. A.; Ivanov, V. V.; Zrelov, P. V. [Lab. of Information Technologies, JINR, Dubna (Russian Federation); Friese, V.; Mueller, W. [GSI, Darmstadt (Germany)

    2014-07-01

    We consider a concept of databases for the CBM experiment. For this purpose, an analysis of the databases for large experiments at the LHC at CERN has been performed. Special features of various DBMS utilized in physical experiments, including relational and object-oriented DBMS as the most applicable ones for the tasks of these experiments, were analyzed. A set of databases for the CBM experiment, DBMS for their development, as well as use cases for the considered databases are suggested.

  7. Conceptual considerations for CBM databases

    International Nuclear Information System (INIS)

    Akishina, E.P.; Aleksandrov, E.I.; Aleksandrov, I.N.; Filozova, I.A.; Ivanov, V.V.; Zrelov, P.V.; Friese, V.; Mueller, W.

    2014-01-01

    We consider a concept of databases for the CBM experiment. For this purpose, an analysis of the databases for large experiments at the LHC at CERN has been performed. Special features of various DBMS utilized in physical experiments, including relational and object-oriented DBMS as the most applicable ones for the tasks of these experiments, were analyzed. A set of databases for the CBM experiment, DBMS for their development, as well as use cases for the considered databases are suggested.

  8. A framework for organizing cancer-related variations from existing databases, publications and NGS data using a High-performance Integrated Virtual Environment (HIVE).

    Science.gov (United States)

    Wu, Tsung-Jung; Shamsaddini, Amirhossein; Pan, Yang; Smith, Krista; Crichton, Daniel J; Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    Years of sequence feature curation by UniProtKB/Swiss-Prot, PIR-PSD, NCBI-CDD, RefSeq and other database biocurators has led to a rich repository of information on functional sites of genes and proteins. This information along with variation-related annotation can be used to scan human short sequence reads from next-generation sequencing (NGS) pipelines for presence of non-synonymous single-nucleotide variations (nsSNVs) that affect functional sites. This and similar workflows are becoming more important because thousands of NGS data sets are being made available through projects such as The Cancer Genome Atlas (TCGA), and researchers want to evaluate their biomarkers in genomic data. BioMuta, an integrated sequence feature database, provides a framework for automated and manual curation and integration of cancer-related sequence features so that they can be used in NGS analysis pipelines. Sequence feature information in BioMuta is collected from the Catalogue of Somatic Mutations in Cancer (COSMIC), ClinVar, UniProtKB and through biocuration of information available from publications. Additionally, nsSNVs identified through automated analysis of NGS data from TCGA are also included in the database. Because of the petabytes of data and information present in NGS primary repositories, a platform HIVE (High-performance Integrated Virtual Environment) for storing, analyzing, computing and curating NGS data and associated metadata has been developed. Using HIVE, 31 979 nsSNVs were identified in TCGA-derived NGS data from breast cancer patients. All variations identified through this process are stored in a Curated Short Read archive, and the nsSNVs from the tumor samples are included in BioMuta. Currently, BioMuta has 26 cancer types with 13 896 small-scale and 308 986 large-scale study-derived variations. Integration of variation data allows identifications of novel or common nsSNVs that can be prioritized in validation studies. Database URL: BioMuta: http

  9. Bringing together the work of subscription and open access specialists: challenges and changes at the University of Sussex

    Directory of Open Access Journals (Sweden)

    Eleanor Craig

    2017-03-01

    Full Text Available The rise in open access (OA publishing has required library staff across many UK academic institutions to take on new roles and responsibilities to support academics. At the same time, the long-established work of negotiating with publishers around journal subscriptions is changing as such deals now usually include OA payment or discount plans in many different forms that vary from publisher to publisher. This article outlines some of the issues we encountered at the University of Sussex Library whilst trying to pull together the newer strand of OA advocacy and funder compliance work with existing responsibilities for managing subscription deals. It considers the challenges faced in effectively bringing together Library staff with knowledge in these areas, and outlines the steps we have taken so far to ensure OA publishing is taken into account wherever appropriate.

  10. Effects of spatial location and household wealth on health insurance subscription among women in Ghana.

    Science.gov (United States)

    Kumi-Kyereme, Akwasi; Amo-Adjei, Joshua

    2013-06-17

    This study compares ownership of health insurance among Ghanaian women with respect to wealth status and spatial location. We explore the overarching research question by employing geographic and proxy means targeting through interactive analysis of wealth status and spatial issues. The paper draws on the 2008 Ghana Demographic and Health Survey. Bivariate descriptive analysis coupled with a binary logistic regression estimation technique was used to analyse the data. By wealth status, the likelihood of purchasing insurance was significantly higher among respondents from the middle, richer and richest households compared to the poorest (reference category), and these differences widened more profoundly in the Northern areas after interacting wealth with zone of residence. Among women at the bottom of household wealth (poorest and poorer), there were no statistically significant differences in insurance subscription across the areas. The results underscore the relevance of geographic and proxy means targeting in identifying populations who may be in need of special interventions as part of the efforts to increase enrolment, as well as a means of social protection for the vulnerable.
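
    A minimal sketch of the kind of model described, assuming synthetic data: a binary logistic regression of insurance ownership on wealth and zone of residence with an interaction term, the poorest group as the reference category, and coefficients reported as odds ratios. All names and the data-generating process below are illustrative, not the survey variables.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 500
        wealth = rng.choice(["poorest", "middle", "richest"], size=n)
        zone = rng.choice(["north", "south"], size=n)

        # Hypothetical data-generating process: wealthier households and one of
        # the zones are more likely to hold insurance.
        linpred = (-1.0 + 0.8 * (wealth == "middle") + 1.5 * (wealth == "richest")
                   + 0.5 * (zone == "south"))
        insured = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

        women = pd.DataFrame({"insured": insured, "wealth": wealth, "zone": zone})

        # Interaction of wealth and zone, with the poorest group as reference.
        model = smf.logit(
            "insured ~ C(wealth, Treatment(reference='poorest')) * C(zone)",
            data=women,
        ).fit()
        print(np.exp(model.params))   # exponentiated coefficients = odds ratios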

  11. Effect of Eight Weekly Aerobic Training Program on Auditory Reaction Time and MaxVO[subscript 2] in Visual Impairments

    Science.gov (United States)

    Taskin, Cengiz

    2016-01-01

    The aim of the study was to examine the effect of an eight-week aerobic training program on auditory reaction time and MaxVO[subscript 2] in children with visual impairments. Forty children with visual impairment holding a blind 3 classification from Turkey participated; experimental group (age = 15.60 ± 1.10 years; height = 164.15 ± 4.88 cm; weight = 66.60 ± 4.77 kg) for twenty…

  12. Prediction of VO[subscript 2]max in Children and Adolescents Using Exercise Testing and Physical Activity Questionnaire Data

    Science.gov (United States)

    Black, Nate E.; Vehrs, Pat R.; Fellingham, Gilbert W.; George, James D.; Hager, Ron

    2016-01-01

    Purpose: The purpose of this study was to evaluate the use of a treadmill walk-jog-run exercise test previously validated in adults and physical activity questionnaire data to estimate maximum oxygen consumption (VO[subscript 2]max) in boys (n = 62) and girls (n = 66) aged 12 to 17 years old. Methods: Data were collected from Physical Activity…

  13. Danish Urogynaecological Database

    DEFF Research Database (Denmark)

    Hansen, Ulla Darling; Gradel, Kim Oren; Larsen, Michael Due

    2016-01-01

    , complications if relevant, implants used if relevant, 3-6-month postoperative recording of symptoms, if any. A set of clinical quality indicators is being maintained by the steering committee for the database and is published in an annual report which also contains extensive descriptive statistics. The database......The Danish Urogynaecological Database is established in order to ensure high quality of treatment for patients undergoing urogynecological surgery. The database contains details of all women in Denmark undergoing incontinence surgery or pelvic organ prolapse surgery amounting to ~5,200 procedures...... has a completeness of over 90% of all urogynecological surgeries performed in Denmark. Some of the main variables have been validated using medical records as gold standard. The positive predictive value was above 90%. The data are used as a quality monitoring tool by the hospitals and in a number...

  14. [Reading behavior and preferences regarding subscriptions to scientific journals : Results of a survey of members of the German Society for General and Visceral Surgery].

    Science.gov (United States)

    Ronellenfitsch, U; Klinger, C; Buhr, H J; Post, S

    2015-11-01

    The purpose of surgical literature is to publish the latest study results and to provide continuing medical education to readers. For optimal allocation of resources, institutional subscribers, professional societies and scientific publishers require structured data on reading and subscription preferences of potential readers of surgical literature. To obtain representative data on the preferences of German general and visceral surgeons regarding reading of and subscription to scientific journals. All members of the German Society for General and Visceral Surgery (DGAV) were invited to participate in a web-based survey. Questions were asked on the affiliation and position of the member, individual journal subscriptions, institutional access to scientific journals, preferences regarding electronic or print articles and special subscriptions for society members. Answers were descriptively analyzed. A total of 630 out of 4091 (15 %) members participated in the survey and 73 % of the respondents had at least 1 individual subscription to a scientific journal. The most frequently subscribed journal was Der Chirurg (47 % of respondents). The institutional access to journals was deemed insufficient by 48 % of respondents, predominantly in primary care hospitals and outpatient clinics. Almost half of the respondents gave sufficient importance to reading printed versions of articles for which they would pay extra fees. A group subscription for society members was perceived as advantageous as long as no relevant extra costs were incurred. This structured survey among members of the DGAV provides data on preferences regarding reading of and subscription to scientific journals. Individual subscriptions to journals are still common, possibly due to suboptimal institutional access particularly at smaller non-academic institutions. In an age of online publications it seems surprising that many respondents place a high value on printed versions. The results are relevant for

  15. Faculty Decisions on Serials Subscriptions Differ Significantly from Decisions Predicted by a Bibliometric Tool.

    Directory of Open Access Journals (Sweden)

    Sue F. Phelps

    2016-03-01

    Full Text Available Objective – To compare faculty choices of serials subscription cancellations to the scores of a bibliometric tool. Design – Natural experiment. Data was collected about faculty valuations of serials. The California Digital Library Weighted Value Algorithm (CDL-WVA was used to measure the value of journals to a particular library. These two sets of scores were then compared. Setting – A public research university in the United States of America. Subjects – Teaching and research faculty, as well as serials data. Methods – Experimental methodology was used to compare faculty valuations of serials (based on their journal cancellation choices to bibliometric valuations of the same journal titles (determined by CDL-WVA scores to identify the match rate between the faculty choices and the bibliographic data. Faculty were asked to select titles to cancel that totaled approximately 30% of the budget for their disciplinary fund code. This “keep” or “cancel” choice was the binary variable for the study. Usage data was gathered for articles downloaded through the link resolver for titles in each disciplinary dataset, and the CDL-WVA scores were determined for each journal title based on utility, quality, and cost effectiveness. Titles within each dataset were ranked highest to lowest using the CDL-WVA scores within each fund code, and then by subscription cost for titles with the same CDL-WVA score. The journal titles selected for comparison were those that ranked above the approximate 30% of titles chosen for cancellation by faculty and CDL-WVA scores. Researchers estimated an odds ratio of faculty choosing to keep a title and a CDL-WVA score that indicated the title should be kept. The p-value for that result was less than 0.0001, indicating that there was a negligible probability that the results were by chance. They also applied logistic regression to quantify the association between the numeric score of CDL-WVA and the binary variable
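
    A minimal sketch of the two calculations described, assuming made-up numbers: an odds ratio from a 2 × 2 table of faculty keep/cancel choices against the CDL-WVA recommendation, and a logistic regression of the binary choice on the CDL-WVA score. The counts, scores and effect sizes are invented for illustration, not the study's data.

        import numpy as np
        import statsmodels.api as sm

        # 2x2 agreement table (invented counts):
        # rows = faculty kept / cancelled, columns = CDL-WVA keep / cancel.
        table = np.array([[120, 40],
                          [35, 105]])
        odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
        print(f"odds ratio of agreement: {odds_ratio:.2f}")

        # Logistic regression of the binary keep/cancel choice on the CDL-WVA score.
        rng = np.random.default_rng(1)
        score = rng.uniform(0, 1, size=300)              # hypothetical CDL-WVA scores
        keep = rng.binomial(1, 1 / (1 + np.exp(-(4 * score - 2))))
        model = sm.Logit(keep, sm.add_constant(score)).fit()
        print(model.params)   # the slope quantifies the association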

  16. Perception of quality of health delivery and health insurance subscription in Ghana.

    Science.gov (United States)

    Amo-Adjei, Joshua; Anku, Prince Justin; Amo, Hannah Fosuah; Effah, Mavis Osei

    2016-07-29

    National health insurance schemes (NHIS) in developing countries, and perhaps in developed countries as well, are considered a pro-poor intervention because they help to ease the financial burden of access to quality health care. Perceptions of the quality of health services could have immense impacts on enrolment. This paper shows how perception of service quality under Ghana's insurance programme contributes to health insurance subscription. The study used the 2014 Ghana Demographic and Health Survey (GDHS) dataset. Both descriptive proportions and binary logistic regression techniques were applied to generate results that informed the discussion. Our results show that a high proportion of females (33 %) and males (35 %) felt that the quality of health care provided to holders of the NHIS card was worse. As a result, approximately 30 % of females and 22 % of males who perceived health care as worse for insurance card holders did not own an insurance policy. While among females perceived differences in quality were significantly associated with subscription (AOR = 0.453 [95 % CI = 0.375, 0.555]), among males the differences in perceptions of quality of health services under the NHIS were independent in the multivariable analysis. Beyond perceptions of quality, being resident in the Upper West region was an important predictor of health insurance ownership for both males and females. For such a social and pro-poor intervention, investing in the quality of services provided to subscribers, especially women, who face substantial health risks in the reproductive period, can offer important gains in sustaining the scheme as well as in offering affordable health services.

  17. JAEA thermodynamic database for performance assessment of geological disposal of high-level and TRU wastes. Selection of thermodynamic data of molybdenum

    International Nuclear Information System (INIS)

    Kitamura, Akira; Kirishima, Akira; Saito, Takumi; Shibutani, Sanae; Tochiyama, Osamu

    2010-06-01

    Within the scope of the JAEA thermodynamic database project for performance assessment of geological disposal of high-level radioactive and TRU wastes, thermodynamic data on the inorganic compounds and complexes of molybdenum were selected. We focused on selecting thermodynamic data for aqueous species and compounds that could form under repository conditions for the disposal of radioactive wastes, i.e. relatively low concentrations of molybdenum and near-neutral through alkaline conditions. Selection of thermodynamic data was based on the guidelines of the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). An extensive literature survey was performed and all the obtained articles were carefully reviewed to select the thermodynamic data for molybdenum. Thermodynamic data at 25 deg C and zero ionic strength were determined from accepted thermodynamic data considered to be reliable. We paid particular attention to the selection of the formation constant of the molybdate ion (MoO4^2-) with the hydrogen ion (H+). This is the first report showing the selection of thermodynamic data for molybdenum with a detailed reviewing process. (author)

  18. The Danish Testicular Cancer database

    DEFF Research Database (Denmark)

    Daugaard, Gedske; Kier, Maria Gry Gundgaard; Bandak, Mikkel

    2016-01-01

    AIM: The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC......) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. STUDY POPULATION: All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data...... collection has been performed from 1984 to 2007 and from 2013 onward, respectively. MAIN VARIABLES AND DESCRIPTIVE DATA: The retrospective DaTeCa database contains detailed information with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function...

  19. Statistical Measures Alone Cannot Determine Which Database (BNI, CINAHL, MEDLINE, or EMBASE Is the Most Useful for Searching Undergraduate Nursing Topics. A Review of: Stokes, P., Foster, A., & Urquhart, C. (2009. Beyond relevance and recall: Testing new user-centred measures of database performance. Health Information and Libraries Journal, 26(3, 220-231.

    Directory of Open Access Journals (Sweden)

    Giovanna Badia

    2011-03-01

    Full Text Available Objective – The research project sought to determine which of four databases was the most useful for searching undergraduate nursing topics. Design – Comparative database evaluation. Setting – Nursing and midwifery students at Homerton School of Health Studies (now part of Anglia Ruskin University), Cambridge, United Kingdom, in 2005-2006. Subjects – The subjects were four databases: British Nursing Index (BNI), CINAHL, MEDLINE, and EMBASE. Methods – This was a comparative study using title searches to compare BNI (British Nursing Index), CINAHL, MEDLINE and EMBASE. According to the authors, this is the first study to compare BNI with other databases. BNI is a database produced by British libraries that indexes the nursing and midwifery literature. It covers over 240 British journals, and includes references to articles from health sciences journals that are relevant to nurses and midwives (British Nursing Index, n.d.). The researchers performed keyword searches in the title field of the four databases for the dissertation topics of nine nursing and midwifery students enrolled in undergraduate dissertation modules. The list of titles of journal articles on their topics was given to the students, and they were asked to judge the relevancy of the citations. The title searches were evaluated in each of the databases using the following criteria: • precision (the number of relevant results obtained in the database for a search topic, divided by the total number of results obtained in the database search); • recall (the number of relevant results obtained in the database for a search topic, divided by the total number of relevant results obtained on that topic from all four database searches); • novelty (the number of relevant results that were unique in the database search, which was calculated as a percentage of the total number of relevant results found in the database); • originality (the number of unique relevant results obtained in the
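
    A minimal sketch of the per-database measures defined above, computed over toy result sets; the relevance judgements are invented, and originality is omitted because its definition is truncated in this record.

        # Toy result sets: all titles retrieved by each database for one topic,
        # plus the set of titles judged relevant by the student.
        results = {
            "BNI":     {"a", "b", "c", "d"},
            "CINAHL":  {"a", "b", "e", "f", "g"},
            "MEDLINE": {"a", "e", "h"},
            "EMBASE":  {"b", "c", "i"},
        }
        relevant = {"a", "b", "c", "e", "h"}

        all_relevant_found = set().union(*[hits & relevant for hits in results.values()])

        for db, hits in results.items():
            rel_hits = hits & relevant
            precision = len(rel_hits) / len(hits)
            recall = len(rel_hits) / len(all_relevant_found)
            unique = {t for t in rel_hits
                      if all(t not in other
                             for name, other in results.items() if name != db)}
            novelty = len(unique) / len(rel_hits) if rel_hits else 0.0
            print(f"{db}: precision={precision:.2f} recall={recall:.2f} novelty={novelty:.2f}")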

  20. The magnet database system

    International Nuclear Information System (INIS)

    Baggett, P.; Delagi, N.; Leedy, R.; Marshall, W.; Robinson, S.L.; Tompkins, J.C.

    1991-01-01

    This paper describes the current status of MagCom, a central database of SSC magnet information that is available to all magnet scientists via network connections. The database has been designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will help magnet scientists to track and control the production process and to correlate the performance of magnets with the properties of their constituents

  1. The Danish Sarcoma Database

    DEFF Research Database (Denmark)

    Jørgensen, Peter Holmberg; Lausten, Gunnar Schwarz; Pedersen, Alma B

    2016-01-01

    AIM: The aim of the database is to gather information about sarcomas treated in Denmark in order to continuously monitor and improve the quality of sarcoma treatment in a local, a national, and an international perspective. STUDY POPULATION: Patients in Denmark diagnosed with a sarcoma, both...... skeletal and ekstraskeletal, are to be registered since 2009. MAIN VARIABLES: The database contains information about appearance of symptoms; date of receiving referral to a sarcoma center; date of first visit; whether surgery has been performed elsewhere before referral, diagnosis, and treatment; tumor...... of Diseases - tenth edition codes and TNM Classification of Malignant Tumours, and date of death (after yearly coupling to the Danish Civil Registration System). Data quality and completeness are currently secured. CONCLUSION: The Danish Sarcoma Database is population based and includes sarcomas occurring...

  2. Extending Database Integration Technology

    National Research Council Canada - National Science Library

    Buneman, Peter

    1999-01-01

    Formal approaches to the semantics of databases and database languages can have immediate and practical consequences in extending database integration technologies to include a vastly greater range...

  3. Quality Assessment of Studies Published in Open Access and Subscription Journals: Results of a Systematic Evaluation.

    Science.gov (United States)

    Pastorino, Roberta; Milovanovic, Sonja; Stojanovic, Jovana; Efremov, Ljupcho; Amore, Rosarita; Boccia, Stefania

    2016-01-01

    Along with the proliferation of Open Access (OA) publishing, the interest for comparing the scientific quality of studies published in OA journals versus subscription journals has also increased. With our study we aimed to compare the methodological quality and the quality of reporting of primary epidemiological studies and systematic reviews and meta-analyses published in OA and non-OA journals. In order to identify the studies to appraise, we listed all OA and non-OA journals which published in 2013 at least one primary epidemiologic study (case-control or cohort study design), and at least one systematic review or meta-analysis in the field of oncology. For the appraisal, we picked up the first studies published in 2013 with case-control or cohort study design from OA journals (Group A; n = 12), and in the same time period from non-OA journals (Group B; n = 26); the first systematic reviews and meta-analyses published in 2013 from OA journals (Group C; n = 15), and in the same time period from non-OA journals (Group D; n = 32). We evaluated the methodological quality of studies by assessing the compliance of case-control and cohort studies to Newcastle and Ottawa Scale (NOS) scale, and the compliance of systematic reviews and meta-analyses to Assessment of Multiple Systematic Reviews (AMSTAR) scale. The quality of reporting was assessed considering the adherence of case-control and cohort studies to STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist, and the adherence of systematic reviews and meta-analyses to Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) checklist. Among case-control and cohort studies published in OA and non-OA journals, we did not observe significant differences in the median value of NOS score (Group A: 7 (IQR 7-8) versus Group B: 8 (7-9); p = 0.5) and in the adherence to STROBE checklist (Group A, 75% versus Group B, 80%; p = 0.1). The results did not change after adjustment

  4. Economics of access versus ownership the costs and benefits of access to scholarly articles via interlibrary loan and journal subscriptions

    CERN Document Server

    Kingma, Bruce

    2013-01-01

    The Economics of Access Versus Ownership offers library professionals a model economic analysis of providing access to journal articles through interlibrary loan as compared to library subscriptions to the journals. This model enables library directors to do an economic analysis of interlibrary loan and collection development in their own libraries and to then make cost-efficient decisions about the use of these services.This practical book's analysis and conclusions are based on 1994/95 academic year research conducted by the State University of New York libraries at Albany, Binghamton, Buffa

  5. "Mind the gap!" Evaluation of the performance gap attributable to exception reporting and target thresholds in the new GMS contract: National database analysis

    Directory of Open Access Journals (Sweden)

    Cookson Richard

    2008-06-01

    Full Text Available Abstract Background The 2003 revision of the UK GMS contract rewards general practices for performance against clinical quality indicators. Practices can exempt patients from treatment, and can receive maximum payment for less than full coverage of eligible patients. This paper aims to estimate the gap between the percentage of maximum incentive gained and the percentage of patients receiving indicated care (the pay-performance gap, and to estimate how much of the gap is attributable respectively to thresholds and to exception reporting. Methods Analysis of Quality Outcomes Framework data in the National Primary Care Database and exception reporting data from the Information Centre from 8407 practices in England in 2005 – 6. The main outcome measures were the gap between the percentage of maximum incentive gained and the percentage of patients receiving indicated care at the practice level, both for individual indicators and a combined composite score. An additional outcome was the percentage of that gap attributable respectively to exception reporting and maximum threshold targets set at less than 100%. Results The mean pay-performance gap for the 65 aggregated clinical indicators was 13.3% (range 2.9% to 48%. 52% of this gap (6.9% of eligible patients is attributable to thresholds being set at less than 100%, and 48% to patients being exception reported. The gap was greater than 25% in 9 indicators: beta blockers and cholesterol control in heart disease; cholesterol control in stroke; influenza immunization in asthma; blood pressure, sugar and cholesterol control in diabetes; seizures in epilepsy and treatment of hypertension. Conclusion Threshold targets and exception reporting introduce an incentive ceiling, which substantially reduces the percentage of eligible patients that UK practices need to treat in order to receive maximum incentive payments for delivering that care. There are good clinical reasons for exception reporting, but after
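
    The decomposition reported above is simple arithmetic; the sketch below reproduces the quoted split of the mean gap into its threshold and exception-reporting components, using the aggregate figures from the abstract purely for illustration.

        # Aggregate figures quoted in the abstract, used purely for illustration.
        total_gap = 13.3        # percentage points: incentive gained minus care delivered
        threshold_part = 6.9    # part of the gap due to maximum thresholds below 100%
        exception_part = total_gap - threshold_part

        print(f"thresholds:          {threshold_part / total_gap:.0%} of the gap")
        print(f"exception reporting: {exception_part / total_gap:.0%} of the gap")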

  6. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: Service-level requirements and risk management Building and evolving an architecture for operational visibility ...

  7. The Danish Testicular Cancer database.

    Science.gov (United States)

    Daugaard, Gedske; Kier, Maria Gry Gundgaard; Bandak, Mikkel; Mortensen, Mette Saksø; Larsson, Heidi; Søgaard, Mette; Toft, Birgitte Groenkaer; Engvad, Birte; Agerbæk, Mads; Holm, Niels Vilstrup; Lauritsen, Jakob

    2016-01-01

    The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data collection has been performed from 1984 to 2007 and from 2013 onward, respectively. The retrospective DaTeCa database contains detailed information with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function, lung function, etc. A questionnaire related to late effects has been conducted, which includes questions regarding social relationships, life situation, general health status, family background, diseases, symptoms, use of medication, marital status, psychosocial issues, fertility, and sexuality. TC survivors alive on October 2014 were invited to fill in this questionnaire including 160 validated questions. Collection of questionnaires is still ongoing. A biobank including blood/sputum samples for future genetic analyses has been established. Both samples related to DaTeCa and DMCG DaTeCa database are included. The prospective DMCG DaTeCa database includes variables regarding histology, stage, prognostic group, and treatment. The DMCG DaTeCa database has existed since 2013 and is a young clinical database. It is necessary to extend the data collection in the prospective database in order to answer quality-related questions. Data from the retrospective database will be added to the prospective data. This will result in a large and very comprehensive database for future studies on TC patients.

  8. LHCb distributed conditions database

    International Nuclear Information System (INIS)

    Clemencic, M

    2008-01-01

    The LHCb Conditions Database project provides the necessary tools to handle non-event time-varying data. The main users of conditions are reconstruction and analysis processes, which are running on the Grid. To allow efficient access to the data, we need to use a synchronized replica of the content of the database located at the same site as the event data file, i.e. the LHCb Tier1. The replica to be accessed is selected from information stored on LFC (LCG File Catalog) and managed with the interface provided by the LCG developed library CORAL. The plan to limit the submission of jobs to those sites where the required conditions are available will also be presented. LHCb applications are using the Conditions Database framework on a production basis since March 2007. We have been able to collect statistics on the performance and effectiveness of both the LCG library COOL (the library providing conditions handling functionalities) and the distribution framework itself. Stress tests on the CNAF hosted replica of the Conditions Database have been performed and the results will be summarized here

  9. "Mr. Database" : Jim Gray and the History of Database Technologies.

    Science.gov (United States)

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e. g. leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  10. Performance Comparison of Relational and Native-XML Databases using the Semantics of the Land Command and Control Information Exchange Data Model (LC2IEDM)

    National Research Council Canada - National Science Library

    Denny, Ian M; Jahn, Dieter

    2005-01-01

    .... The majority of messaging systems store information in a document-centric free-text format that makes it difficult for command and control systems, relational databases, software agents and web...

  11. Database characterisation of HEP applications

    International Nuclear Information System (INIS)

    Piorkowski, Mariusz; Grancher, Eric; Topurov, Anton

    2012-01-01

    Oracle-based database applications underpin many key aspects of operations for both the LHC accelerator and the LHC experiments. In addition to the overall performance, the predictability of the response is a key requirement to ensure smooth operations, and delivering predictability requires understanding the applications from the ground up. Fortunately, database management systems provide several tools to check, measure, analyse and gather useful information. We present our experiences characterising the performance of several typical HEP database applications; these performance characterisations were used to deliver improved predictability and scalability, as well as to optimise the hardware platform choice as we migrated to new hardware and Oracle 11g.

  12. Ground Reaction Force Differences in the Countermovement Jump in Girls with Different Levels of Performance

    Science.gov (United States)

    Floría, Pablo; Harrison, Andrew J.

    2013-01-01

    Purpose: The aim of this study was to ascertain the biomechanical differences between better and poorer performers of the vertical jump in a homogeneous group of children. Method: Twenty-four girls were divided into low-scoring (LOW; M [subscript age] = 6.3 ± 0.8 years) and high-scoring (HIGH; M [subscript age] = 6.6 ± 0.8 years) groups based on…

  13. The Danish Sarcoma Database

    Directory of Open Access Journals (Sweden)

    Jorgensen PH

    2016-10-01

    Full Text Available Peter Holmberg Jørgensen,1 Gunnar Schwarz Lausten,2 Alma B Pedersen3 1Tumor Section, Department of Orthopedic Surgery, Aarhus University Hospital, Aarhus, 2Tumor Section, Department of Orthopedic Surgery, Rigshospitalet, Copenhagen, 3Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, Denmark Aim: The aim of the database is to gather information about sarcomas treated in Denmark in order to continuously monitor and improve the quality of sarcoma treatment in a local, a national, and an international perspective. Study population: Patients in Denmark diagnosed with a sarcoma, both skeletal and ekstraskeletal, are to be registered since 2009. Main variables: The database contains information about appearance of symptoms; date of receiving referral to a sarcoma center; date of first visit; whether surgery has been performed elsewhere before referral, diagnosis, and treatment; tumor characteristics such as location, size, malignancy grade, and growth pattern; details on treatment (kind of surgery, amount of radiation therapy, type and duration of chemotherapy; complications of treatment; local recurrence and metastases; and comorbidity. In addition, several quality indicators are registered in order to measure the quality of care provided by the hospitals and make comparisons between hospitals and with international standards. Descriptive data: Demographic patient-specific data such as age, sex, region of living, comorbidity, World Health Organization's International Classification of Diseases – tenth edition codes and TNM Classification of Malignant Tumours, and date of death (after yearly coupling to the Danish Civil Registration System. Data quality and completeness are currently secured. Conclusion: The Danish Sarcoma Database is population based and includes sarcomas occurring in Denmark since 2009. It is a valuable tool for monitoring sarcoma incidence and quality of treatment and its improvement, postoperative

  14. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

    Contents include: Introduction to Database Systems; Functions of a Database; Database Management System; Database Components; Database Development Process; Conceptual Design and Data Modeling; Introduction to Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with Entity-Relationship Model; Table Structure and Normalization; Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Enforcing Constraints; Creating Database for Business Process; Physical Design and Database

  15. Diet History Questionnaire: Database Revision History

    Science.gov (United States)

    The following details all additions and revisions made to the DHQ nutrient and food database. This revision history is provided as a reference for investigators who may have performed analyses with a previous release of the database.

  16. Combining evidence from multiple electronic health care databases: performances of one-stage and two-stage meta-analysis in matched case-control studies.

    Science.gov (United States)

    La Gamba, Fabiola; Corrao, Giovanni; Romio, Silvana; Sturkenboom, Miriam; Trifirò, Gianluca; Schink, Tania; de Ridder, Maria

    2017-10-01

    Clustering of patients in databases is usually ignored in one-stage meta-analysis of multi-database studies using matched case-control data. The aim of this study was to compare bias and efficiency of such a one-stage meta-analysis with a two-stage meta-analysis. First, we compared the approaches by generating matched case-control data under 5 simulated scenarios, built by varying: (1) the exposure-outcome association; (2) its variability among databases; (3) the confounding strength of one covariate on this association; (4) its variability; and (5) the (heterogeneous) confounding strength of two covariates. Second, we made the same comparison using empirical data from the ARITMO project, a multiple database study investigating the risk of ventricular arrhythmia following the use of medications with arrhythmogenic potential. In our study, we specifically investigated the effect of current use of promethazine. Bias increased for one-stage meta-analysis with increasing (1) between-database variance of exposure effect and (2) heterogeneous confounding generated by two covariates. The efficiency of one-stage meta-analysis was slightly lower than that of two-stage meta-analysis for the majority of investigated scenarios. Based on ARITMO data, there were no evident differences between one-stage (OR = 1.50, CI = [1.08; 2.08]) and two-stage (OR = 1.55, CI = [1.12; 2.16]) approaches. When the effect of interest is heterogeneous, a one-stage meta-analysis ignoring clustering gives biased estimates. Two-stage meta-analysis generates estimates at least as accurate and precise as one-stage meta-analysis. However, in a study using small databases and rare exposures and/or outcomes, a correct one-stage meta-analysis becomes essential. Copyright © 2017 John Wiley & Sons, Ltd.
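
    A minimal sketch of the two-stage approach, assuming invented per-database estimates: each database contributes an odds ratio with a 95% confidence interval, and the estimates are pooled on the log scale with inverse-variance (fixed-effect) weights. The numbers below are not the ARITMO results.

        import math

        # Invented per-database estimates: (odds ratio, lower 95% CI, upper 95% CI).
        estimates = [
            (1.4, 0.9, 2.2),
            (1.7, 1.1, 2.6),
            (1.3, 0.7, 2.4),
        ]

        log_ors, weights = [], []
        for or_, lo, hi in estimates:
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI
            log_ors.append(math.log(or_))
            weights.append(1.0 / se ** 2)

        pooled = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
        pooled_se = (1.0 / sum(weights)) ** 0.5
        print(f"pooled OR = {math.exp(pooled):.2f} "
              f"(95% CI {math.exp(pooled - 1.96 * pooled_se):.2f}"
              f" to {math.exp(pooled + 1.96 * pooled_se):.2f})")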

  17. Quality Assurance in Individual Monitoring: 10 Years of Performance Monitoring of the TLD Based TNO Individual Monitoring Service (invited paper)

    International Nuclear Information System (INIS)

    Dijk, J.W.E. van

    1998-01-01

    The QA subscription forms the nucleus of the Quality Assurance (QA) programme of the TLD-based Individual Monitoring Service of TNO-CSD. This QA subscription is the subscription of a dummy customer to the service. As this customer is treated exactly like a normal customer, all aspects of the service are monitored by the QA subscription. An overview is given of 10 years of monitoring the performance of the service. Various improvements over the past decade have resulted in a standard deviation in a low dose measurement of 0.01 mSv and a relative standard deviation at higher doses of 5%. These figures represent the performance under routine circumstances and thus include variations due to variations in the natural background from place to place and, for example, due to transport. (author)

  18. Quality Assurance in Individual Monitoring: 10 Years of Performance Monitoring of the TLD Based TNO Individual Monitoring Service (invited paper)

    Energy Technology Data Exchange (ETDEWEB)

    Dijk, J.W.E. van

    1998-07-01

    The QA subscription forms the nucleus of the Quality Assurance (QA) programme of the TLD-based Individual Monitoring Service of TNO-CSD. This QA subscription is the subscription of a dummy customer to the service. As this customer is treated exactly like a normal customer, all aspects of the service are monitored by the QA subscription. An overview is given of 10 years of monitoring the performance of the service. Various improvements over the past decade have resulted in a standard deviation in a low dose measurement of 0.01 mSv and a relative standard deviation at higher doses of 5%. These figures represent the performance under routine circumstances and thus include variations due to variations in the natural background from place to place and, for example, due to transport. (author)

  19. NREL: U.S. Life Cycle Inventory Database - About the LCI Database Project

    Science.gov (United States)

    About the LCI Database Project The U.S. Life Cycle Inventory (LCI) Database is a publicly available database that allows users to objectively review and compare analysis results that are based on similar source of critically reviewed LCI data through its LCI Database Project. NREL's High-Performance

  20. Declining subscriptions to the Maliando Mutual Health Organisation in Guinea-Conakry (West Africa): what is going wrong?

    Science.gov (United States)

    Criel, Bart; Waelkens, Maria Pia

    2003-10-01

    Mutual Health Organisations (MHOs) are a type of community health insurance scheme that are being developed and promoted in sub-Saharan Africa. In 1998, an MHO was organised in a rural district of Guinea to improve access to quality health care. Households paid an annual insurance fee of about US$2 per individual. Contributions were voluntary. The benefit package included free access to all first line health care services (except for a small co-payment), free paediatric care, free emergency surgical care and free obstetric care at the district hospital. Also included were part of the cost of emergency transport to the hospital. In 1998, the MHO covered 8% of the target population, but, by 1999, the subscription rate had dropped to about 6%. In March 2000, focus groups were held with members and non-members of the scheme to find out why subscription rates were so low. The research indicated that a failure to understand the scheme does not explain these low rates. On the contrary, the great majority of research subjects, members and non-members alike, acquired a very accurate understanding of the concepts and principles underlying health insurance. They value the system's re-distributive effects, which goes beyond household, next of kin or village. The participants accurately point out the sharp differences that exist between traditional financial mechanisms and the principle of health insurance, as well as the advantages and disadvantages of both. The ease with which risk-pooling is accepted as a financial mechanism which addresses specific needs demonstrates that it is not, per se, necessary to build health insurance schemes on existing or traditional systems of mutual aid. The majority of the participants consider the individual premium of 2 US dollars to be fair. There is, however, a problem of affordability for many poor and/or large families who cannot raise enough money to pay the subscription for all household members in one go. However, the main reason for

  1. Mathematics for Databases

    NARCIS (Netherlands)

    ir. Sander van Laar

    2007-01-01

    A formal description of a database consists of the description of the relations (tables) of the database together with the constraints that must hold on the database. Furthermore the contents of a database can be retrieved using queries. These constraints and queries for databases can very well be

  2. Databases and their application

    NARCIS (Netherlands)

    Grimm, E.C.; Bradshaw, R.H.W; Brewer, S.; Flantua, S.; Giesecke, T.; Lézine, A.M.; Takahara, H.; Williams, J.W.,Jr; Elias, S.A.; Mock, C.J.

    2013-01-01

    During the past 20 years, several pollen database cooperatives have been established. These databases are now constituent databases of the Neotoma Paleoecology Database, a public domain, multiproxy, relational database designed for Quaternary-Pliocene fossil data and modern surface samples. The

  3. DOT Online Database

    Science.gov (United States)

    Online database of Advisory Circulars and related DOT document records, searchable by full text, with data collection and distribution policies described on the site. Document database website provided by MicroSearch.

  4. The magnet database system

    International Nuclear Information System (INIS)

    Ball, M.J.; Delagi, N.; Horton, B.; Ivey, J.C.; Leedy, R.; Li, X.; Marshall, B.; Robinson, S.L.; Tompkins, J.C.

    1992-01-01

    The Test Department of the Magnet Systems Division of the Superconducting Super Collider Laboratory (SSCL) is developing a central database of SSC magnet information that will be available to all magnet scientists at the SSCL or elsewhere, via network connections. The database contains information on the magnets' major components, configuration information (specifying which individual items were used in each cable, coil, and magnet), measurements made at major fabrication stages, and the test results on completed magnets. These data will facilitate the correlation of magnet performance with the properties of its constituents. Recent efforts have focused on the development of procedures for user-friendly access to the data, including displays in the format of the production "traveler" data sheets, standard summary reports, and a graphical interface for ad hoc queries and plots
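
    To make the correlation idea concrete, here is a minimal relational sketch in Python using sqlite3; the tables, columns, and values are illustrative assumptions, not the actual SSCL schema.

      import sqlite3

      # Illustrative schema only: components, which magnet used which component,
      # and a test result per magnet.
      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE component     (id TEXT PRIMARY KEY, kind TEXT, property REAL);
          CREATE TABLE configuration (magnet_id TEXT, component_id TEXT REFERENCES component(id));
          CREATE TABLE test_result   (magnet_id TEXT, quench_current REAL);
      """)
      con.executemany("INSERT INTO component VALUES (?,?,?)",
                      [("CBL-1", "cable", 1.02), ("CBL-2", "cable", 0.97)])
      con.executemany("INSERT INTO configuration VALUES (?,?)",
                      [("MAG-A", "CBL-1"), ("MAG-B", "CBL-2")])
      con.executemany("INSERT INTO test_result VALUES (?,?)",
                      [("MAG-A", 7.1), ("MAG-B", 6.8)])

      # Correlate a measured test quantity with a property of the cable used in each magnet.
      for row in con.execute("""
          SELECT t.magnet_id, c.property, t.quench_current
          FROM test_result t
          JOIN configuration cfg ON cfg.magnet_id = t.magnet_id
          JOIN component c ON c.id = cfg.component_id
          WHERE c.kind = 'cable'
      """):
          print(row)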

  5. The Danish Testicular Cancer database

    Directory of Open Access Journals (Sweden)

    Daugaard G

    2016-10-01

    Full Text Available Gedske Daugaard,1 Maria Gry Gundgaard Kier,1 Mikkel Bandak,1 Mette Saksø Mortensen,1 Heidi Larsson,2 Mette Søgaard,2 Birgitte Groenkaer Toft,3 Birte Engvad,4 Mads Agerbæk,5 Niels Vilstrup Holm,6 Jakob Lauritsen1 1Department of Oncology 5073, Copenhagen University Hospital, Rigshospitalet, Copenhagen, 2Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, 3Department of Pathology, Copenhagen University Hospital, Rigshospitalet, Copenhagen, 4Department of Pathology, Odense University Hospital, Odense, 5Department of Oncology, Aarhus University Hospital, Aarhus, 6Department of Oncology, Odense University Hospital, Odense, Denmark Aim: The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. Study population: All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data collection has been performed from 1984 to 2007 and from 2013 onward, respectively. Main variables and descriptive data: The retrospective DaTeCa database contains detailed information with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function, lung function, etc. A questionnaire related to late effects has been conducted, which includes questions regarding social relationships, life situation, general health status, family background, diseases, symptoms, use of medication, marital status, psychosocial issues, fertility, and sexuality. TC survivors alive in October 2014 were invited to fill in this questionnaire including 160 validated questions

  6. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 10 figs

  7. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. The data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 9 figs

  8. Dietary Supplement Ingredient Database

    Science.gov (United States)

    ... and US Department of Agriculture Dietary Supplement Ingredient Database. ... values can be saved to build a small database or add to an existing database for national, ...

  9. Energy Consumption Database

    Science.gov (United States)

    The California Energy Commission has created this on-line database for informal reporting of energy consumption data by ... classifications. The database also provides easy downloading of energy consumption data into Microsoft Excel (XLSX

  10. Subscribers and newspaper subscriptions in Spain at the beginning of the XX century. Notes from an asturian perspective

    Directory of Open Access Journals (Sweden)

    Víctor Rodríguez Infiesta

    2008-12-01

    Full Text Available In the first decades of the XX century, a high number of subscribers was, from the point of view of newspaper publishers, the best guarantee of a newspaper's stability. The system was not without drawbacks, however: it gave rise to numerous complaints and created a peculiar relationship with the subscriber. Through its various channels (delivery men, the post and public establishments, principally), the subscription service continually generated situations illustrative of the level of development of the Spanish press in years in which progress accelerated and deficiencies and limitations also became apparent. These limitations, particularly in an outlying region with mostly inadequate means of communication such as Asturias, could become one of the main factors delaying the establishment of a large press with mass readership.

  11. Responsibility Towards The Customers Of Subscription-Based Software Solutions In The Context Of Using The Cloud Computing Technology

    Directory of Open Access Journals (Sweden)

    Bogdan Ștefan Ionescu

    2003-12-01

    Full Text Available The continuous transformation of contemporary society and of the IT environment circumscribed to it has led to the emergence of cloud computing technology, which provides access to infrastructure as well as to subscription-based software services. In the context of a growing number of providers of cloud software services, the paper aims to identify the perception of some current or potential users of cloud solutions, selected from among students enrolled in the accounting master programs (professional or research) organized by the Bucharest University of Economic Studies, in terms of their expectations for cloud services, as well as the extent to which SaaS providers are responsible for the services provided.

  12. The GABA[subscript A] Receptor Agonist Muscimol Induces an Age- and Region-Dependent Form of Long-Term Depression in the Mouse Striatum

    Science.gov (United States)

    Zhang, Xiaoqun; Yao, Ning; Chergui, Karima

    2016-01-01

    Several forms of long-term depression (LTD) of glutamatergic synaptic transmission have been identified in the dorsal striatum and in the nucleus accumbens (NAc). Such experience-dependent synaptic plasticity might play important roles in reward-related learning. The GABA[subscript A] receptor agonist muscimol was recently found to trigger a…

  13. A Pictorial Visualization of Normal Mode Vibrations of the Fullerene (C[subscript 60]) Molecule in Terms of Vibrations of a Hollow Sphere

    Science.gov (United States)

    Dunn, Janette L.

    2010-01-01

    Understanding the normal mode vibrations of a molecule is important in the analysis of vibrational spectra. However, the complicated 3D motion of large molecules can be difficult to interpret. We show how images of normal modes of the fullerene molecule C[subscript 60] can be made easier to understand by superimposing them on images of the normal…

  14. Using the "K[subscript 5]Connected Cognition Diagram" to Analyze Teachers' Communication and Understanding of Regions in Three-Dimensional Space

    Science.gov (United States)

    Moore-Russo, Deborah; Viglietti, Janine M.

    2012-01-01

    This paper reports on a study that introduces and applies the "K[subscript 5]Connected Cognition Diagram" as a lens to explore video data showing teachers' interactions related to the partitioning of regions by axes in a three-dimensional geometric space. The study considers "semiotic bundles" (Arzarello, 2006), introduces "semiotic connections,"…

  15. Alterations in CNS Activity Induced by Botulinum Toxin Treatment in Spasmodic Dysphonia: An H[subscript 2][superscript 15]O PET Study

    Science.gov (United States)

    Ali, S. Omar; Thomassen, Michael; Schulz, Geralyn M.; Hosey, Lara A.; Varga, Mary; Ludlow, Christy L.; Braun, Allen R.

    2006-01-01

    Speech-related changes in regional cerebral blood flow (rCBF) were measured using H[subscript 2][superscript 15]O positron-emission tomography in 9 adults with adductor spasmodic dysphonia (ADSD) before and after botulinum toxin (BTX) injection and 10 age- and gender-matched volunteers without neurological disorders. Scans were acquired at rest…

  16. Experimental Determination of pK[subscript a] Values and Metal Binding for Biomolecular Compounds Using [superscript 31]P NMR Spectroscopy

    Science.gov (United States)

    Swartz, Mason A.; Tubergen, Philip J.; Tatko, Chad D.; Baker, Rachael A.

    2018-01-01

    This lab experiment uses [superscript 31]P NMR spectroscopy of biomolecules to determine pK[subscript a] values and the binding energies of metal/biomolecule complexes. Solutions of adenosine nucleotides are prepared, and a series of [superscript 31]P NMR spectra are collected as a function of pH and in the absence and presence of magnesium or…

  17. PI[subscript 3]-Kinase Cascade Has a Differential Role in Acquisition and Extinction of Conditioned Fear Memory in Juvenile and Adult Rats

    Science.gov (United States)

    Slouzkey, Ilana; Maroun, Mouna

    2016-01-01

    The basolateral amygdala (BLA)-medial prefrontal cortex (mPFC) circuit plays a crucial role in acquisition and extinction of fear memory. Extinction of aversive memories is mediated, at least in part, by the phosphoinositide-3 kinase (PI[subscript 3]K)/Akt pathway in adult rats. There is recent interest in the neural mechanisms that mediate fear…

  18. Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals.

    Science.gov (United States)

    Wicherts, Jelte M

    2016-01-01

    Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure quality of peer-review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer-review, and develop and validate a tool enabling different stakeholders to assess transparency of the peer-review process. Based on editorial guidelines and best practices, I developed a 14-item tool to rate transparency of the peer-review process on the basis of journals' websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated transparency of the journals that published their work. Authors' ratings of the transparency were positively associated with quality of the peer-review process but unrelated to journal's impact factors. In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well. In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar. The tool to assess transparency of the peer-review process at academic journals shows promising reliability and validity. The transparency of the peer-review process can be seen as an indicator of peer-review quality allowing the tool to be used to predict academic quality in new journals.
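
    For readers unfamiliar with the reliability statistic quoted above, Cronbach's alpha for a rater-by-item matrix can be computed as in the following sketch; the ratings are randomly generated for illustration and are not the study's data.

      import numpy as np

      def cronbach_alpha(ratings: np.ndarray) -> float:
          """Cronbach's alpha for a matrix of shape (raters, items)."""
          k = ratings.shape[1]                          # number of items
          item_vars = ratings.var(axis=0, ddof=1)       # variance of each item
          total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the sum score
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      rng = np.random.default_rng(0)
      signal = rng.normal(size=(20, 1))                 # shared "transparency" signal
      items = signal + 0.5 * rng.normal(size=(20, 14))  # 14 correlated item ratings
      print(round(cronbach_alpha(items), 2))            # high alpha for correlated items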

  19. Database of Low-E Storm Window Energy Performance across U.S. Climate Zones (Task ET-WIN-PNNL-FY13-01_5.3)

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Culp, Thomas D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    This report describes the process, assumptions, and modeling results produced in support of the Emerging Technologies Low-e Storm Windows Task 5.3: Create a Database of U.S. Climate-Based Analysis for Low-E Storm Windows. The scope of the overall effort is to develop a database of energy savings and cost effectiveness of low-E storm windows in residential homes across a broad range of U.S. climates using the National Energy Audit Tool (NEAT) and RESFEN model calculations. This report includes a summary of the results, NEAT and RESFEN background, methodology, and input assumptions, and an appendix with detailed results and assumptions by climate zone. Both sets of calculation results will be made publicly available through the Building America Solution Center.

  20. USAID Anticorruption Projects Database

    Data.gov (United States)

    US Agency for International Development — The Anticorruption Projects Database (Database) includes information about USAID projects with anticorruption interventions implemented worldwide between 2007 and...

  1. NoSQL databases

    OpenAIRE

    Mrozek, Jakub

    2012-01-01

    This thesis deals with database systems referred to as NoSQL databases. In the second chapter, I explain basic terms and the theory of database systems. A short explanation is dedicated to database systems based on the relational data model and the SQL standardized query language. Chapter Three explains the concept and history of the NoSQL databases, and also presents database models, major features and the use of NoSQL databases in comparison with traditional database systems. In the fourth ...

  2. Fundamentals of the NEA Thermochemical Database and its influence over national nuclear programs on the performance assessment of deep geological repositories.

    Science.gov (United States)

    Ragoussi, Maria-Eleni; Costa, Davide

    2017-03-14

    For the last 30 years, the NEA Thermochemical Database (TDB) Project (www.oecd-nea.org/dbtdb/) has been developing a chemical thermodynamic database for elements relevant to the safety of radioactive waste repositories, providing data that are vital to support the geochemical modeling of such systems. The recommended data are selected on the basis of strict review procedures and are characterized by their consistency. The results of these efforts are freely available, and have become an international point of reference in the field. As a result, a number of important national initiatives with regard to waste management programs have used the NEA TDB as their basis, both in terms of recommended data and guidelines. In this article we describe the fundamentals and achievements of the project together with the characteristics of some databases developed in national nuclear waste disposal programs that have been influenced by the NEA TDB. We also give some insights on how this work could be seen as an approach to be used in broader areas of environmental interest. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Dansk kolorektal Cancer Database

    DEFF Research Database (Denmark)

    Harling, Henrik; Nickelsen, Thomas

    2005-01-01

    The Danish Colorectal Cancer Database was established in 1994 with the purpose of monitoring whether diagnostic and surgical principles specified in the evidence-based national guidelines of good clinical practice were followed. Twelve clinical indicators have been listed by the Danish Colorectal Cancer Group, and the performance of each hospital surgical department with respect to these indicators is reported annually. In addition, the register contains a large collection of data that provide valuable information on the influence of comorbidity and lifestyle factors on disease outcome...

  4. Usability in Scientific Databases

    Directory of Open Access Journals (Sweden)

    Ana-Maria Suduc

    2012-07-01

    Full Text Available Usability, most often defined as the ease of use and acceptability of a system, affects the users' performance and their job satisfaction when working with a machine. Therefore, usability is a very important aspect which must be considered in the process of system development. The paper presents numerical data related to the history of scientific research on the usability of information systems, as reflected in the information provided by three important scientific databases, Science Direct, ACM Digital Library and IEEE Xplore Digital Library, for different queries related to this field.

  5. PrimateLit Database

    Science.gov (United States)

    PrimateLit: a bibliographic database for primatology (related databases: Primate Info Net). The PrimateLit database is no longer being updated. Support was provided by the National Center for Research Resources (NCRR), National Institutes of Health. The database is a collaborative project of the Wisconsin Primate ...

  6. Audit Database and Information Tracking System

    Data.gov (United States)

    Social Security Administration — This database contains information about the Social Security Administration's audits regarding SSA agency performance and compliance. These audits can be requested...

  7. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D program. IOC is a linkage control system between sub-projects for sharing and integrating the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage various documents and reports produced since project accomplishment.

  8. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R and D program. IOC is a linkage control system between sub-projects for sharing and integrating the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage various documents and reports produced since project accomplishment.

  9. Logical database design principles

    CERN Document Server

    Garmany, John; Clark, Terry

    2005-01-01

    INTRODUCTION TO LOGICAL DATABASE DESIGN: Understanding a Database; Database Architectures; Relational Databases; Creating the Database; System Development Life Cycle (SDLC); Systems Planning: Assessment and Feasibility; System Analysis: Requirements; System Analysis: Requirements Checklist; Models Tracking and Schedules; Design Modeling; Functional Decomposition Diagram; Data Flow Diagrams; Data Dictionary; Logical Structures and Decision Trees; System Design: Logical. SYSTEM DESIGN AND IMPLEMENTATION: The ER Approach; Entities and Entity Types; Attribute Domains; Attributes; Set-Valued Attributes; Weak Entities; Constraint

  10. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  11. Software listing: CHEMTOX database

    International Nuclear Information System (INIS)

    Moskowitz, P.D.

    1993-01-01

    Initially launched in 1983, the CHEMTOX Database was among the first microcomputer databases containing hazardous chemical information. The database is used in many industries and government agencies in more than 17 countries. Updated quarterly, the CHEMTOX Database provides detailed environmental and safety information on 7500-plus hazardous substances covered by dozens of regulatory and advisory sources. This brief listing describes the method of accessing data and provides ordering information for those wishing to obtain the CHEMTOX Database

  12. Asbestos Exposure Assessment Database

    Science.gov (United States)

    Arcot, Divya K.

    2010-01-01

    Exposure to particular hazardous materials in a work environment is dangerous to the employees who work directly with or around the materials as well as those who come in contact with them indirectly. In order to maintain a national standard for safe working environments and protect worker health, the Occupational Safety and Health Administration (OSHA) has set forth numerous precautionary regulations. NASA has been proactive in adhering to these regulations by implementing standards which are often stricter than regulation limits and administering frequent health risk assessments. The primary objective of this project is to create the infrastructure for an Asbestos Exposure Assessment Database specific to NASA Johnson Space Center (JSC) which will compile all of the exposure assessment data into a well-organized, navigable format. The data includes Sample Types, Sample Durations, Crafts of those from whom samples were collected, Job Performance Requirements (JPR) numbers, Phased Contrast Microscopy (PCM) and Transmission Electron Microscopy (TEM) results and qualifiers, Personal Protective Equipment (PPE), and names of industrial hygienists who performed the monitoring. This database will allow NASA to provide OSHA with specific information demonstrating that JSC's work procedures are protective enough to minimize the risk of future disease from the exposures. The data has been collected by the NASA contractors Computer Sciences Corporation (CSC) and Wyle Laboratories. The personal exposure samples were collected from devices worn by laborers working at JSC and by building occupants located in asbestos-containing buildings.

  13. Determination of the Antibiotic Oxytetracycline in Commercial Milk by Solid-Phase Extraction: A High-Performance Liquid Chromatography (HPLC) Experiment for Quantitative Instrumental Analysis

    Science.gov (United States)

    Mei-Ratliff, Yuan

    2012-01-01

    Trace levels of oxytetracycline spiked into commercial milk samples are extracted, cleaned up, and preconcentrated using a C[subscript 18] solid-phase extraction column. The extract is then analyzed by a high-performance liquid chromatography (HPLC) instrument equipped with a UV detector and a C[subscript 18] column (150 mm x 4.6 mm x 3.5 [mu]m).…
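
    Quantification in an experiment of this kind is typically done against an external calibration curve; the numbers below are invented and only show the arithmetic of back-calculating an unknown from a linear fit.

      import numpy as np

      # Invented calibration standards: concentration (ug/mL) vs. HPLC-UV peak area.
      conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
      area = np.array([1210, 2390, 4850, 9640, 19300])

      slope, intercept = np.polyfit(conc, area, 1)      # linear calibration: area = m*c + b
      sample_area = 3100                                # peak area of the spiked milk extract
      sample_conc = (sample_area - intercept) / slope   # back-calculated concentration
      print(round(float(sample_conc), 2), "ug/mL")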

  14. The Danish Fetal Medicine Database

    Directory of Open Access Journals (Sweden)

    Ekelund CK

    2016-10-01

    Full Text Available Charlotte Kvist Ekelund,1 Tine Iskov Kopp,2 Ann Tabor,1 Olav Bjørn Petersen3 1Department of Obstetrics, Center of Fetal Medicine, Rigshospitalet, University of Copenhagen, Copenhagen, Denmark; 2Registry Support Centre (East) – Epidemiology and Biostatistics, Research Centre for Prevention and Health, Glostrup, Denmark; 3Fetal Medicine Unit, Aarhus University Hospital, Aarhus Nord, Denmark Aim: The aim of this study is to set up a database in order to monitor the detection rates and false-positive rates of first-trimester screening for chromosomal abnormalities and prenatal detection rates of fetal malformations in Denmark. Study population: Pregnant women with a first or second trimester ultrasound scan performed at all public hospitals in Denmark are registered in the database. Main variables/descriptive data: Data on maternal characteristics, ultrasonic, and biochemical variables are continuously sent from the fetal medicine units' Astraia databases to the central database via web service. Information about outcome of pregnancy (miscarriage, termination, live birth, or stillbirth) is received from the National Patient Register and National Birth Register and linked via the Danish unique personal registration number. Furthermore, results of all pre- and postnatal chromosome analyses are sent to the database. Conclusion: It has been possible to establish a fetal medicine database, which monitors first-trimester screening for chromosomal abnormalities and second-trimester screening for major fetal malformations with the input from already collected data. The database is valuable to assess the performance at a regional level and to compare Danish performance with international results at a national level. Keywords: prenatal screening, nuchal translucency, fetal malformations, chromosomal abnormalities

  15. Database Description - PSCDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: PSCDB. Creator: Takayuki Amemiya, National Institute of Advanced Industrial Science and Technology (AIST). Database classification: Structure Databases - Protein structure. Database maintenance site: Graduate School of Informat... Need for user registration: Not available.

  16. Directory of IAEA databases

    International Nuclear Information System (INIS)

    1991-11-01

    The first edition of the Directory of IAEA Databases is intended to describe the computerized information sources available to IAEA staff members. It contains a listing of all databases produced at the IAEA, together with information on their availability

  17. Native Health Research Database

    Science.gov (United States)

    (... Indian Health Board) Welcome to the Native Health Database. The database offers basic and advanced search and a tutorial video on how to search the database. The NHD has made ...

  18. Cell Centred Database (CCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Cell Centered Database (CCDB) is a web accessible database for high resolution 2D, 3D and 4D data from light and electron microscopy, including correlated imaging.

  19. E3 Staff Database

    Data.gov (United States)

    US Agency for International Development — The E3 Staff database is maintained by the E3 PDMS (Professional Development & Management Services) office. The database is MySQL. It is manually updated by E3 staff as...

  20. JT-60 database system, 2

    International Nuclear Information System (INIS)

    Itoh, Yasuhiro; Kurihara, Kenichi; Kimura, Toyoaki.

    1987-07-01

    The JT-60 central control system, "ZENKEI", collects the control and instrumentation data relevant to discharge, together with device status data for plant monitoring. The former, the engineering data, amounts to about 3 Mbytes per shot of discharge. The "ZENKEI" control system, which consists of seven minicomputers for on-line real-time control, has little capacity for handling such a large amount of data for physical and engineering analysis. To solve this problem, it was planned to establish the experimental database on the Front-end Processor (FEP) of the general-purpose large computer in the JAERI Computer Center. A database management system (DBMS) has therefore been developed for creating the database during the shot interval. The engineering data are shipped from "ZENKEI" to the FEP through a dedicated communication line after each shot. A hierarchical data model has been adopted in this database, which consists of data files with a tree structure on three keys: system, discharge type and shot number. The JT-60 DBMS provides packages of data-handling subroutines for interfacing the database with users' application programs. Subroutine packages supporting graphic processing and access control functions for database security are also provided in this DBMS. (author)
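
    The three-key hierarchical organisation described above (system, discharge type, shot number) can be sketched as a nested mapping; this is only an illustration of the keying scheme, not the actual JT-60 DBMS interface.

      from collections import defaultdict

      # database[system][discharge_type][shot_number] -> per-shot payload
      database = defaultdict(lambda: defaultdict(dict))

      def store(system, discharge_type, shot, payload):
          database[system][discharge_type][shot] = payload

      def fetch(system, discharge_type, shot):
          return database[system][discharge_type][shot]

      store("poloidal_field", "ohmic", 1234, {"size_bytes": 3_000_000})
      print(fetch("poloidal_field", "ohmic", 1234))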

  1. Database on wind characteristics - Contents of database bank

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    and Denmark, with Denmark as the Operating Agent. The reporting of the continuation of Annex XVII falls in two separate parts. Part one accounts in detail for the available data in the established database bank, and part two describes various data analyses performed with the overall purpose of improving...

  2. NIRS database of the original research database

    International Nuclear Information System (INIS)

    Morita, Kyoko

    1991-01-01

    Recently, library staff arranged and compiled the original research papers that have been written by researchers in the 33 years since the National Institute of Radiological Sciences (NIRS) was established. This paper describes how the internal database of original research papers has been created. It is a small example of a hand-made database, which has been accumulated by staff who have some knowledge of computers or computer programming. (author)

  3. The Effects of the Rope Jump Training Program in Physical Education Lessons on Strength, Speed and VO[subscript 2] Max in Children

    Science.gov (United States)

    Eler, Nebahat; Acar, Hakan

    2018-01-01

    The aim of this study is to examine the effects of a rope-jump training program in physical education lessons on strength, speed and VO[subscript 2] max in 10-12 year old boys. A total of 240 male students participated in the study: a rope-jump group (n = 120) and a control group (n = 120). The rope-jump group continued 10 weeks of regular physical education and sport…

  4. Scopus database: a review.

    Science.gov (United States)

    Burnham, Judy F

    2006-03-08

    The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive, but they complement each other. If a library can only afford one, the choice must be based on institutional needs.

  5. Aviation Safety Issues Database

    Science.gov (United States)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both nationally and internationally. Several aviation safety stakeholders such as the Commercial Aviation Safety Team (CAST) have already used the database. This broader interest was the genesis for making the database publicly accessible and writing this report.

  6. Inleiding database-systemen

    NARCIS (Netherlands)

    Pels, H.J.; Lans, van der R.F.; Pels, H.J.; Meersman, R.A.

    1993-01-01

    This article introduces the main concepts that play a role around databases and gives an overview of the objectives, functions and components of database systems. Although the function of a database is intuitively fairly clear, it is nevertheless, from a technological point of view, a complex

  7. 75 FR 5513 - Determination of Rates and Terms for Preexisting Subscription Services and Satellite Digital...

    Science.gov (United States)

    2010-02-03

    ...), 114, and 801(b)(1). 0 2. Section 382.12 is revised to read as follows: Sec. 382.12 Royalty fees for... monthly royalty fee to be paid by a Licensee for the public performance of sound recordings pursuant to 17... LIBRARY OF CONGRESS Copyright Royalty Board 37 CFR Part 382 [Docket No. 2006-1 CRB DSTRA...

  8. 78 FR 31842 - Determination of Rates and Terms for Preexisting Subscription Services and Satellite Digital...

    Science.gov (United States)

    2013-05-28

    ... the profit on the consolidated financial statements of Music Choice over the past five years, 2007.... 1572:3-1576:2 (Del Beccaro), masks the financial performance of the PSS business. As a consolidated... unprofitability under the existing PSS rate come from the oblique presentation of its financial data and a...

  9. Database Description - RMOS | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: RMOS. Creator: Shoshi Kikuchi (... Research Unit). Database classification: Plant databases - Rice; Microarray Data and other Gene Expression Databases. Organism: Oryza sativa (Taxonomy ID: 4530). Referenced databases: Rice Expression Database (RED), Rice full-length cDNA Database (KOME), Rice Genome Integrated Map Database (INE), Rice Mutant Panel Database (Tos17), Rice Genome Annotation Database.

  10. Ageing Management Program Database

    International Nuclear Information System (INIS)

    Basic, I.; Vrbanic, I.; Zabric, I.; Savli, S.

    2008-01-01

    The aspects of plant ageing management (AM) have gained increasing attention over the last ten years. Numerous technical studies have been performed to study the impact of ageing mechanisms on the safe and reliable operation of nuclear power plants. National research activities have been initiated or are in progress to provide the technical basis for decision making processes. The long-term operation of nuclear power plants is influenced by economic considerations, the socio-economic environment including public acceptance, developments in research and the regulatory framework, the availability of technical infrastructure to maintain and service the systems, structures and components, as well as qualified personnel. Besides national activities there are a number of international activities, in particular under the umbrella of the IAEA, the OECD and the EU. The paper discusses the process, procedure and database developed for the Slovenian Nuclear Safety Administration (SNSA) surveillance of the ageing process of the Krsko Nuclear Power Plant. (author)

  11. Integral Representation of the Pictorial Proof of Sum of [superscript n][subscript k=1]k[superscript 2] = 1/6n(n+1)(2n+1)

    Science.gov (United States)

    Kobayashi, Yukio

    2011-01-01

    The pictorial proof of the sum of [superscript n][subscript k=1] k[superscript 2] = 1/6n(n+1)(2n+1) is represented in the form of an integral. The integral representations are also applicable to the sum of [superscript n][subscript k=1] k[superscript m] (m greater than or equal to 3). These representations reveal that the sum of [superscript…
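
    For readability, the identity referred to in this record is, in standard notation (the article's particular integral representation is not reproduced here):

      \[
        \sum_{k=1}^{n} k^{2} = \frac{1}{6}\, n(n+1)(2n+1),
      \]

      with the article extending the same approach to \(\sum_{k=1}^{n} k^{m}\) for \(m \ge 3\).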

  12. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, definition and visualization of the database content in the form of a map on the Internet.
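
    A very reduced sketch of the idea of keeping cartographic content in a relational store and pulling out only the features needed for a given map window is shown below (Python with sqlite3); the table layout and data are invented for illustration.

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE feature (
          id INTEGER PRIMARY KEY, layer TEXT, wkt TEXT,
          minx REAL, miny REAL, maxx REAL, maxy REAL)""")
      con.executemany("INSERT INTO feature VALUES (?,?,?,?,?,?,?)", [
          (1, "road",  "LINESTRING(0 0, 5 5)",  0, 0, 5, 5),
          (2, "river", "LINESTRING(8 8, 12 9)", 8, 8, 12, 9),
      ])

      # Crude spatial filter: keep features whose bounding box intersects the map window.
      minx, miny, maxx, maxy = -1, -1, 6, 6
      rows = con.execute("""
          SELECT id, layer, wkt FROM feature
          WHERE NOT (maxx < ? OR minx > ? OR maxy < ? OR miny > ?)
      """, (minx, maxx, miny, maxy)).fetchall()
      print(rows)   # only the road intersects this window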

  13. Keyword Search in Databases

    CERN Document Server

    Yu, Jeffrey Xu; Chang, Lijun

    2009-01-01

    It has become highly desirable to provide users with flexible ways to query/search information over databases that are as simple as a Google-style keyword search. This book surveys the recent developments on keyword search over databases, and focuses on finding structural information among objects in a database using a set of keywords. Such structural information to be returned can be either trees or subgraphs representing how the objects that contain the required keywords are interconnected in a relational database or in an XML database. The structural keyword search is completely different from
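
    As a rough illustration of what "structural information among objects" means here, the sketch below treats tuples as nodes of a graph (edges standing in for foreign-key references) and looks for a root tuple close to at least one match of every keyword, a common heuristic in this literature; the data, names, and scoring are invented for the example and are not the book's algorithms.

      from collections import deque

      # Toy "database": tuple ids, their text, and foreign-key style links.
      text = {
          "author:1": "Jeffrey Yu",
          "paper:7":  "keyword search over relational databases",
          "conf:3":   "VLDB",
      }
      edges = [("author:1", "paper:7"), ("paper:7", "conf:3")]

      graph = {}
      for a, b in edges:
          graph.setdefault(a, set()).add(b)
          graph.setdefault(b, set()).add(a)

      def distances(source):
          """BFS distances from one tuple to all reachable tuples."""
          dist, queue = {source: 0}, deque([source])
          while queue:
              u = queue.popleft()
              for v in graph.get(u, ()):
                  if v not in dist:
                      dist[v] = dist[u] + 1
                      queue.append(v)
          return dist

      def best_root(keywords):
          """Root tuple minimising the summed distance to a match of each keyword."""
          matches = {k: [t for t, s in text.items() if k in s.lower()] for k in keywords}
          best = None
          for root in text:
              d = distances(root)
              try:
                  cost = sum(min(d[m] for m in matches[k] if m in d) for k in keywords)
              except ValueError:      # some keyword has no reachable match from this root
                  continue
              if best is None or cost < best[0]:
                  best = (cost, root)
          return best

      print(best_root(["search", "vldb"]))   # -> (1, 'paper:7')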

  14. Nuclear power economic database

    International Nuclear Information System (INIS)

    Ding Xiaoming; Li Lin; Zhao Shiping

    1996-01-01

    The nuclear power economic database (NPEDB), based on ORACLE V6.0, consists of three parts, i.e., an economic database of nuclear power stations, an economic database of the nuclear fuel cycle, and an economic database of nuclear power planning and the nuclear environment. The economic database of nuclear power stations includes data on general economics, technology, capital cost and benefit, etc. The economic database of the nuclear fuel cycle includes data on technology and nuclear fuel prices. The economic database of nuclear power planning and the nuclear environment includes data on energy history, forecasts, energy balance, electric power and energy facilities.

  15. Thermal barrier coatings: Coating methods, performance, and heat engine applications. (Latest citations from the EI Compendex*plus database). Published Search

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    The bibliography contains citations concerning conference proceedings on coating methods, performance evaluations, and applications of thermal barrier coatings as protective coatings for heat engine components against high temperature corrosions and chemical erosions. The developments of thermal barrier coating techniques for high performance and reliable gas turbines, diesel engines, jet engines, and internal combustion engines are presented. Topics include plasma sprayed coating methods, yttria stabilized zirconia coatings, coating life models, coating failure and durability, thermal shock and cycling, and acoustic emission analysis of coatings. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  16. Thermal barrier coatings: Coating methods, performance, and heat engine applications. (Latest citations from the EI Compendex*plus database). Published Search

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    The bibliography contains citations concerning conference proceedings on coating methods, performance evaluations, and applications of thermal barrier coatings as protective coatings for heat engine components against high temperature corrosions and chemical erosions. The developments of thermal barrier coating techniques for high performance and reliable gas turbines, diesel engines, jet engines, and internal combustion engines are presented. Topics include plasma sprayed coating methods, yttria stabilized zirconia coatings, coating life models, coating failure and durability, thermal shock and cycling, and acoustic emission analysis of coatings. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  17. Database Description - RPD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: RPD; alternative name: Rice Proteome Database. Creator: Setsuko Komatsu, Institute of Crop Science, National Agriculture and Food Research Organization. Database classification: Proteomics Resources; Plant databases - Rice. Organism: Oryza sativa (Taxonomy ID: 4530). Database description: The Rice Proteome Database contains information on proteins ... entered in the Rice Proteome Database. The database is searchable by keyword,

  18. Database Description - JSNP | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: JSNP. Creator affiliation: ... Japan Science and Technology Agency. Organism: Homo sapiens (Taxonomy ID: 9606). Database description: A database of about 197,000 polymorphisms in the Japanese population ... Database maintenance site: Institute of Medical Scien... Need for user registration: Not available.

  19. Database Description - ASTRA | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: ASTRA. Database classification: Nucleotide Sequence Databases - Gene structure. Organism: Taxonomy ID 3702; Oryza sativa (Taxonomy ID: 4530). Database description: The database represents classified p... Database maintenance site: National Institute of Advanced Industrial Science and Technology. Need for user registration: Not available.

  20. Database Description - RED | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: RED; alternative name: Rice Expression Database. Creator: Shoshi Kikuchi (... Genome Research Unit). Database classification: Plant databases - Rice; Microarray, Gene Expression. Organism: Oryza sativa (Taxonomy ID: 4530). Reference: Rice Expression Database: the gateway to rice functional genomics, ... Plant Science (2002) Dec; 7(12):563-564.

  1. Database Description - PLACE | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: PLACE. Creator: National Institute of Agrobiological Sciences, Kannondai, Tsukuba, Ibaraki 305-8602, Japan. Database classification: Plant databases. Organism: Tracheophyta (Taxonomy ID: 58023). Reference: ... 1999, Vol. 27, No. 1: 297-300. Database maintenance site: National Institute of Agrobiological Sciences. Need for user registration: Not available.

  2. Optimizing the Quality of Dynamic Context Subscriptions for Scarce Network Resources

    DEFF Research Database (Denmark)

    Shawky, Ahmed; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2012-01-01

    Scalable access to dynamic context information is a key challenge for future context-sensitive systems. When increasing the access frequency, the information accuracy can improve, but at the same time the additional context management traffic may reduce network performance, which creates the opposite effect on information reliability. In order to understand and control this trade-off, this paper develops a model that allows one to calculate context reliability, captured by the so-called mismatch probability, in relation to the network load. The model is subsequently used for a real time algorithm...
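
    The trade-off can be made concrete with a toy simulation: the context value changes as a Poisson process, the subscriber refreshes its cached copy at a fixed period, and the mismatch probability is estimated as the fraction of time the cache disagrees with the source. This is only an illustrative model, not the mismatch-probability model developed in the paper.

      import random

      def mismatch_probability(change_rate, period, horizon=10_000.0, seed=1):
          """Fraction of time a cache refreshed every `period` seconds lags a source
          whose value changes as a Poisson process with rate `change_rate`."""
          rng = random.Random(seed)
          t, next_refresh = 0.0, 0.0
          source = cache = 0
          mismatched = 0.0
          while t < horizon:
              change_at = t + rng.expovariate(change_rate)   # next source change
              # handle all cache refreshes that happen before the next change
              while next_refresh <= min(change_at, horizon):
                  if cache != source:
                      mismatched += next_refresh - t
                  cache, t = source, next_refresh
                  next_refresh += period
              end = min(change_at, horizon)
              if cache != source:
                  mismatched += end - t
              t = end
              if change_at < horizon:
                  source += 1                                # the context value changed
          return mismatched / horizon

      for period in (0.1, 1.0, 10.0):     # less frequent access -> higher mismatch probability
          print(period, round(mismatch_probability(change_rate=0.5, period=period), 3))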

  3. Database Description - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database: Database name: Arabidopsis Phenome Database. Creator: Hiroshi Masuya (... BioResource Center). Database classification: Plant databases - Arabidopsis thaliana. Organism: Arabidopsis thaliana (Taxonomy ID: 3702). Database description: The Arabidopsis thaliana phenome ... their effective application. We developed the new Arabidopsis Phenome Database integrating two novel databases ... useful materials for their experimental research. The other, the “Database of Curated Plant Phenome”, focusing

  4. Development of a PSA information database system

    International Nuclear Information System (INIS)

    Kim, Seung Hwan

    2005-01-01

    The need to develop a PSA information database for performing a PSA has been growing rapidly. For example, performing a PSA requires a lot of data to analyze, to evaluate the risk, to trace the process of results, and to verify the results. A PSA information database is a system that stores all PSA-related information in a database and file system, with cross links to jump to the physical documents whenever they are needed. The Korea Atomic Energy Research Institute is developing a PSA information database system, AIMS (Advanced Information Management System for PSA). The objective is to integrate and computerize all the distributed information of a PSA into one system and to enhance the accessibility to PSA information for all PSA-related activities. This paper describes how we implemented such a database-centered application in the view of two areas, database design and data (document) service

  5. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  6. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  7. Danish Colorectal Cancer Group Database.

    Science.gov (United States)

    Ingeholm, Peter; Gögenur, Ismail; Iversen, Lene H

    2016-01-01

    The aim of the database, which has existed for registration of all patients with colorectal cancer in Denmark since 2001, is to improve the prognosis for this patient group. The study population is all Danish patients with newly diagnosed colorectal cancer who are either diagnosed or treated in a surgical department of a public Danish hospital. The database comprises an array of surgical, radiological, oncological, and pathological variables. The surgeons record data such as diagnostics performed, including type and results of radiological examinations, lifestyle factors, comorbidity and performance, treatment including the surgical procedure, urgency of surgery, and intra- and postoperative complications within 30 days after surgery. The pathologists record data such as tumor type, number of lymph nodes and metastatic lymph nodes, surgical margin status, and other pathological risk factors. The database has had >95% completeness in including patients with colorectal adenocarcinoma, with >54,000 patients registered so far, approximately one-third rectal cancers and two-thirds colon cancers, and an overrepresentation of men among rectal cancer patients. The stage distribution was more or less constant until 2014, with a tendency toward a lower rate of stage IV and a higher rate of stage I after introduction of the national screening program in 2014. The 30-day mortality rate after elective surgery has been reduced from >7% in 2001-2003 to ... The database is a national population-based clinical database with high patient and data completeness for the perioperative period. The resolution of data is high for description of the patient at the time of diagnosis, including comorbidities, and for characterizing diagnosis, surgical interventions, and short-term outcomes. The database does not have high-resolution oncological data and does not register recurrences after primary surgery. The Danish Colorectal Cancer Group provides high-quality data and has been documenting an increase in short- and long

  8. Structural Basis of Multifunctionality in a Vitamin B[subscript 12]-processing Enzyme

    Energy Technology Data Exchange (ETDEWEB)

    Koutmos, Markos; Gherasim, Carmen; Smith, Janet L.; Banerjee, Ruma (Michigan)

    2012-07-11

    An early step in the intracellular processing of vitamin B{sub 12} involves CblC, which exhibits dual reactivity, catalyzing the reductive decyanation of cyanocobalamin (vitamin B{sub 12}), and the dealkylation of alkylcobalamins (e.g. methylcobalamin; MeCbl). Insights into how the CblC scaffold supports this chemical dichotomy have been unavailable despite it being the most common locus of patient mutations associated with inherited cobalamin disorders that manifest in both severe homocystinuria and methylmalonic aciduria. Herein, we report structures of human CblC, with and without bound MeCbl, which provide novel biochemical insights into its mechanism of action. Our results reveal that CblC is the most divergent member of the NADPH-dependent flavin reductase family and can use FMN or FAD as a prosthetic group to catalyze reductive decyanation. Furthermore, CblC is the first example of an enzyme with glutathione transferase activity that has a sequence and structure unrelated to the GST superfamily. CblC thus represents an example of evolutionary adaptation of a common structural platform to perform diverse chemistries. The CblC structure allows us to rationalize the biochemical basis of a number of pathological mutations associated with severe clinical phenotypes.

  9. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  10. Database Optimizing Services

    Directory of Open Access Journals (Sweden)

    Adrian GHENCEA

    2010-12-01

    Full Text Available Almost every organization has a database at its centre. The database provides support for conducting different activities, whether it is production, sales and marketing, or internal operations. Every day, a database is accessed for help in strategic decisions. Meeting such needs therefore requires high-quality security and availability. These needs can be met using a DBMS (Database Management System), which is the software that manages a database. Technically speaking, it is software that uses standard methods for cataloguing, recovering, and querying data. A DBMS manages the input data, organizes it, and provides ways for its users or other programs to modify or extract the data. Managing a database requires periodic updates, optimization, and monitoring.
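
    To make the kind of routine maintenance mentioned above concrete, here is a minimal sketch using Python's built-in sqlite3 module: a periodic update, an optimization pass (statistics refresh and space reclamation), and a simple monitoring query. The table, column, and file names are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of routine DBMS maintenance (periodic update, optimization,
# monitoring) with Python's built-in sqlite3. Table, column, and file names
# are illustrative only.
import sqlite3

conn = sqlite3.connect("example.db", isolation_level=None)  # autocommit mode
cur = conn.cursor()

# One-time setup: a small table plus an index to speed up frequent lookups.
cur.execute("CREATE TABLE IF NOT EXISTS orders "
            "(id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer)")

# Periodic update: new rows arrive and are inserted.
cur.execute("INSERT INTO orders (customer, total) VALUES (?, ?)", ("acme", 99.5))

# Optimization: refresh planner statistics and reclaim unused space.
cur.execute("ANALYZE")
cur.execute("VACUUM")

# Monitoring: inspect the plan of a frequent query to confirm the index is used.
plan = cur.execute("EXPLAIN QUERY PLAN SELECT total FROM orders "
                   "WHERE customer = ?", ("acme",)).fetchall()
print(plan)
conn.close()
```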

  11. National Database of Geriatrics

    DEFF Research Database (Denmark)

    Kannegaard, Pia Nimann; Vinding, Kirsten L; Hare-Bruun, Helle

    2016-01-01

    AIM OF DATABASE: The aim of the National Database of Geriatrics is to monitor the quality of interdisciplinary diagnostics and treatment of patients admitted to a geriatric hospital unit. STUDY POPULATION: The database population consists of patients who were admitted to a geriatric hospital unit....... Geriatric patients cannot be defined by specific diagnoses. A geriatric patient is typically a frail multimorbid elderly patient with decreasing functional ability and social challenges. The database includes 14-15,000 admissions per year, and the database completeness has been stable at 90% during the past......, percentage of discharges with a rehabilitation plan, and the part of cases where an interdisciplinary conference has taken place. Data are recorded by doctors, nurses, and therapists in a database and linked to the Danish National Patient Register. DESCRIPTIVE DATA: Descriptive patient-related data include...

  12. Tradeoffs in distributed databases

    OpenAIRE

    Juntunen, R. (Risto)

    2016-01-01

    Abstract In a distributed database, data is spread throughout the network into separate nodes with different DBMS systems (Date, 2000). According to the CAP theorem, the three database properties of consistency, availability, and partition tolerance cannot all be achieved simultaneously in distributed database systems. Two of these properties can be achieved, but not all three at the same time (Brewer, 2000). Since this theorem there has b...
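
    As a toy illustration of the trade-off described by the CAP theorem (not code from the thesis), the sketch below shows a replica that, during a simulated partition, either rejects writes to stay consistent or accepts them to stay available at the risk of divergence. The class and node names are invented.

```python
# Toy illustration of the CAP trade-off: during a network partition a replica
# must either refuse writes (stay Consistent, give up Availability) or accept
# them (stay Available, risk divergence between replicas).
class Replica:
    def __init__(self, name, mode):
        self.name = name
        self.mode = mode          # "CP" or "AP"
        self.data = {}
        self.partitioned = False  # True while cut off from its peer

    def write(self, key, value):
        if self.partitioned and self.mode == "CP":
            # Consistency over availability: reject the write.
            raise RuntimeError(f"{self.name}: unavailable during partition")
        # Availability over consistency: accept, replicas may now diverge.
        self.data[key] = value
        return "ok"

cp, ap = Replica("node-1", "CP"), Replica("node-2", "AP")
for node in (cp, ap):
    node.partitioned = True
print(ap.write("x", 1))          # succeeds, but node-1 will not see it
try:
    cp.write("x", 2)
except RuntimeError as err:
    print(err)                   # node-1 sacrifices availability
```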

  13. Visualizing molecular juggling within a B12-dependent methyltransferase complex

    Energy Technology Data Exchange (ETDEWEB)

    Kung, Yan; Ando, Nozomi; Doukov, Tzanko I.; Blasiak, Leah C.; Bender, Güneş; Seravalli, Javier; Ragsdale, Stephen W.; Drennan, Catherine L. (MIT); (Michigan); (UNL)

    2013-04-08

    Derivatives of vitamin B12 are used in methyl group transfer in biological processes as diverse as methionine synthesis in humans and CO2 fixation in acetogenic bacteria. This seemingly straightforward reaction requires large, multimodular enzyme complexes that adopt multiple conformations to alternately activate, protect and perform catalysis on the reactive B12 cofactor. Crystal structures determined thus far have provided structural information for only fragments of these complexes, inspiring speculation about the overall protein assembly and conformational movements inherent to activity. Here we present X-ray crystal structures of a complete 220 kDa complex that contains all enzymes responsible for B12-dependent methyl transfer, namely the corrinoid iron-sulphur protein and its methyltransferase from the model acetogen Moorella thermoacetica. These structures provide the first three-dimensional depiction of all protein modules required for the activation, protection and catalytic steps of B12-dependent methyl transfer. In addition, the structures capture B12 at multiple locations between its 'resting' and catalytic positions, allowing visualization of the dramatic protein rearrangements that enable methyl transfer and identification of the trajectory for B12 movement within the large enzyme scaffold. The structures are also presented alongside in crystallo spectroscopic data, which confirm enzymatic activity within crystals and demonstrate the largest known conformational movements of proteins in a crystalline state. Taken together, this work provides a model for the molecular juggling that accompanies turnover and helps explain why such an elaborate protein framework is required for such a simple, yet biologically essential reaction.

  14. Database Description - SAHG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name SAHG Alternative nam...h: Contact address Chie Motono Tel : +81-3-3599-8067 E-mail : Database classification Structure Databases - ...e databases - Protein properties Organism Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database description... Links: Original website information Database maintenance site The Molecular Profiling Research Center for D...stration Not available About This Database Database Description Download License Update History of This Database Site Policy | Contact Us Database Description - SAHG | LSDB Archive ...

  15. Intermodal Passenger Connectivity Database -

    Data.gov (United States)

    Department of Transportation — The Intermodal Passenger Connectivity Database (IPCD) is a nationwide data table of passenger transportation terminals, with data on the availability of connections...

  16. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  17. Residency Allocation Database

    Data.gov (United States)

    Department of Veterans Affairs — The Residency Allocation Database is used to determine allocation of funds for residency programs offered by Veterans Affairs Medical Centers (VAMCs). Information...

  18. Smart Location Database - Service

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  19. Veterans Administration Databases

    Science.gov (United States)

    The Veterans Administration Information Resource Center provides database and informatics experts, customer service, expert advice, information products, and web technology to VA researchers and others.

  20. IVR EFP Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database contains trip-level reports submitted by vessels participating in Exempted Fishery projects with IVR reporting requirements.

  1. Towards Sensor Database Systems

    DEFF Research Database (Denmark)

    Bonnet, Philippe; Gehrke, Johannes; Seshadri, Praveen

    2001-01-01

    . These systems lack flexibility because data is extracted in a predefined way; also, they do not scale to a large number of devices because large volumes of raw data are transferred regardless of the queries that are submitted. In our new concept of sensor database system, queries dictate which data is extracted...... from the sensors. In this paper, we define the concept of sensor databases mixing stored data represented as relations and sensor data represented as time series. Each long-running query formulated over a sensor database defines a persistent view, which is maintained during a given time interval. We...... also describe the design and implementation of the COUGAR sensor database system....
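
    The idea that a long-running query defines a persistent view maintained over incoming sensor readings can be sketched as follows. This is a simplified illustration in Python, not the COUGAR implementation; the sensor-reading function and the averaging view are assumptions made for the example.

```python
# Minimal sketch (not the COUGAR implementation): a long-running query over
# sensor data maintained as a persistent view, so only the data the query
# needs is pulled from each (simulated) sensor.
import random
import time

def read_temperature(sensor_id):
    """Stand-in for sampling a physical sensor."""
    return 20.0 + random.uniform(-5, 5) + sensor_id

class PersistentView:
    """Keeps the running average per sensor for the lifetime of the query."""
    def __init__(self, sensor_ids):
        self.state = {s: (0.0, 0) for s in sensor_ids}   # (sum, count)

    def update(self, sensor_id, value):
        total, count = self.state[sensor_id]
        self.state[sensor_id] = (total + value, count + 1)

    def result(self):
        return {s: total / count
                for s, (total, count) in self.state.items() if count}

# Conceptually: "SELECT AVG(temp) FROM sensors GROUP BY sensor_id" kept alive
# for a given time interval, updated as readings arrive.
view = PersistentView(sensor_ids=[1, 2, 3])
for _ in range(3):                      # three sampling rounds
    for sensor in [1, 2, 3]:
        view.update(sensor, read_temperature(sensor))
    time.sleep(0.01)                    # stand-in for the sampling interval
print(view.result())
```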

  2. Database Publication Practices

    DEFF Research Database (Denmark)

    Bernstein, P.A.; DeWitt, D.; Heuer, A.

    2005-01-01

    There has been a growing interest in improving the publication processes for database research papers. This panel reports on recent changes in those processes and presents an initial cut at historical data for the VLDB Journal and ACM Transactions on Database Systems.

  3. Smart Location Database - Download

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Smart Location Database (SLD) summarizes over 80 demographic, built environment, transit service, and destination accessibility attributes for every census block...

  4. Database Description - RMG | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ase Description General information of database Database name RMG Alternative name ...raki 305-8602, Japan National Institute of Agrobiological Sciences E-mail : Database... classification Nucleotide Sequence Databases Organism Taxonomy Name: Oryza sativa Japonica Group Taxonomy ID: 39947 Database...rnal: Mol Genet Genomics (2002) 268: 434–445 External Links: Original website information Database...available URL of Web services - Need for user registration Not available About This Database Database Descri

  5. Database Description - KOME | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available base Description General information of database Database name KOME Alternative nam... Sciences Plant Genome Research Unit Shoshi Kikuchi E-mail : Database classification Plant databases - Rice ...Organism Taxonomy Name: Oryza sativa Taxonomy ID: 4530 Database description Information about approximately ...Hayashizaki Y, Kikuchi S. Journal: PLoS One. 2007 Nov 28; 2(11):e1235. External Links: Original website information Database...OS) Rice mutant panel database (Tos17) A Database of Plant Cis-acting Regulatory

  6. Update History of This Database - Arabidopsis Phenome Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Arabidopsis Phenome Database Update History of This Database Date Update contents 2017/02/27 Arabidopsis Phenome Data...base English archive site is opened. - Arabidopsis Phenome Database (http://jphenom...e.info/?page_id=95) is opened. About This Database Database Description Download License Update History of This Database... Site Policy | Contact Us Update History of This Database - Arabidopsis Phenome Database | LSDB Archive ...

  7. Update History of This Database - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SKIP Stemcell Database Update History of This Database Date Update contents 2017/03/13 SKIP Stemcell Database... English archive site is opened. 2013/03/29 SKIP Stemcell Database ( https://www.skip.med.k...eio.ac.jp/SKIPSearch/top?lang=en ) is opened. About This Database Database Description Download License Update History of This Databa...se Site Policy | Contact Us Update History of This Database - SKIP Stemcell Database | LSDB Archive ...

  8. Update History of This Database - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Yeast Interacting Proteins Database Update History of This Database Date Update contents 201...0/03/29 Yeast Interacting Proteins Database English archive site is opened. 2000/12/4 Yeast Interacting Proteins Database...( http://itolab.cb.k.u-tokyo.ac.jp/Y2H/ ) is released. About This Database Database Description... Download License Update History of This Database Site Policy | Contact Us Update History of This Database... - Yeast Interacting Proteins Database | LSDB Archive ...

  9. Artificial Neural Networks for differential diagnosis of breast lesions in MR-Mammography: A systematic approach addressing the influence of network architecture on diagnostic performance using a large clinical database

    International Nuclear Information System (INIS)

    Dietzel, Matthias; Baltzer, Pascal A.T.; Dietzel, Andreas; Zoubi, Ramy; Gröschel, Tobias; Burmeister, Hartmut P.; Bogdan, Martin; Kaiser, Werner A.

    2012-01-01

    Rationale and objectives: Differential diagnosis of lesions in MR-Mammography (MRM) remains a complex task. The aim of this MRM study was to design and to test robustness of Artificial Neural Network architectures to predict malignancy using a large clinical database. Materials and methods: For this IRB-approved investigation standardized protocols and study design were applied (T1w-FLASH; 0.1 mmol/kgBW Gd-DTPA; T2w-TSE; histological verification after MRM). All lesions were evaluated by two experienced (>500 MRM) radiologists in consensus. In every lesion, 18 previously published descriptors were assessed and documented in the database. An Artificial Neural Network (ANN) was developed to process this database (The-MathWorks/Inc., feed-forward-architecture/resilient back-propagation-algorithm). All 18 descriptors were set as input variables, whereas histological results (malignant vs. benign) was defined as classification variable. Initially, the ANN was optimized in terms of “Training Epochs” (TE), “Hidden Layers” (HL), “Learning Rate” (LR) and “Neurons” (N). Robustness of the ANN was addressed by repeated evaluation cycles (n: 9) with receiver operating characteristics (ROC) analysis of the results applying 4-fold Cross Validation. The best network architecture was identified comparing the corresponding Area under the ROC curve (AUC). Results: Histopathology revealed 436 benign and 648 malignant lesions. Enhancing the level of complexity could not increase diagnostic accuracy of the network (P: n.s.). The optimized ANN architecture (TE: 20, HL: 1, N: 5, LR: 1.2) was accurate (mean-AUC 0.888; P: <0.001) and robust (CI: 0.885–0.892; range: 0.880–0.898). Conclusion: The optimized neural network showed robust performance and high diagnostic accuracy for prediction of malignancy on unknown data.

  10. Artificial Neural Networks for differential diagnosis of breast lesions in MR-Mammography: a systematic approach addressing the influence of network architecture on diagnostic performance using a large clinical database.

    Science.gov (United States)

    Dietzel, Matthias; Baltzer, Pascal A T; Dietzel, Andreas; Zoubi, Ramy; Gröschel, Tobias; Burmeister, Hartmut P; Bogdan, Martin; Kaiser, Werner A

    2012-07-01

    Differential diagnosis of lesions in MR-Mammography (MRM) remains a complex task. The aim of this MRM study was to design and to test robustness of Artificial Neural Network architectures to predict malignancy using a large clinical database. For this IRB-approved investigation standardized protocols and study design were applied (T1w-FLASH; 0.1 mmol/kgBW Gd-DTPA; T2w-TSE; histological verification after MRM). All lesions were evaluated by two experienced (>500 MRM) radiologists in consensus. In every lesion, 18 previously published descriptors were assessed and documented in the database. An Artificial Neural Network (ANN) was developed to process this database (The-MathWorks/Inc., feed-forward-architecture/resilient back-propagation-algorithm). All 18 descriptors were set as input variables, whereas histological results (malignant vs. benign) was defined as classification variable. Initially, the ANN was optimized in terms of "Training Epochs" (TE), "Hidden Layers" (HL), "Learning Rate" (LR) and "Neurons" (N). Robustness of the ANN was addressed by repeated evaluation cycles (n: 9) with receiver operating characteristics (ROC) analysis of the results applying 4-fold Cross Validation. The best network architecture was identified comparing the corresponding Area under the ROC curve (AUC). Histopathology revealed 436 benign and 648 malignant lesions. Enhancing the level of complexity could not increase diagnostic accuracy of the network (P: n.s.). The optimized ANN architecture (TE: 20, HL: 1, N: 5, LR: 1.2) was accurate (mean-AUC 0.888; P: <0.001) and robust (CI: 0.885-0.892; range: 0.880-0.898). The optimized neural network showed robust performance and high diagnostic accuracy for prediction of malignancy on unknown data. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
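
    A hedged sketch of the evaluation pipeline described in the two records above is shown below, using scikit-learn instead of the authors' MATLAB setup. Resilient backpropagation is not offered by MLPClassifier, so its default 'adam' solver stands in for it, and the data is random placeholder data rather than the clinical database; only the 18 input descriptors, the single hidden layer of 5 neurons, 4-fold cross-validation, and ROC-AUC scoring follow the abstract.

```python
# Hedged sketch of the evaluation pipeline described above, using scikit-learn
# rather than the authors' MATLAB setup. Resilient backpropagation is not
# available in MLPClassifier, so the 'adam' solver is used as a stand-in, and
# the data here is random placeholder data, not the clinical database.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1084, 18))          # 18 lesion descriptors per case
y = rng.integers(0, 2, size=1084)        # 0 = benign, 1 = malignant

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(5,),   # one hidden layer, 5 neurons
                  max_iter=2000, random_state=0),
)

# 4-fold cross-validation scored by area under the ROC curve, as in the study.
auc = cross_val_score(model, X, y, cv=4, scoring="roc_auc")
print(auc.mean(), auc.std())
```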

  11. Download - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us Trypanosomes Database Download First of all, please read the license of this database. Data ...1.4 KB) Simple search and download Downlaod via FTP FTP server is sometimes jammed. If it is, access [here]. About This Database Data...base Description Download License Update History of This Database Site Policy | Contact Us Download - Trypanosomes Database | LSDB Archive ...

  12. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with creation of database design for a standard kindergarten, installation of the designed database into the database system Oracle Database 10g Express Edition and demonstration of the administration tasks in this database system. The verification of the database was proved by a developed access application.

  13. Database Cancellation: The "Hows" and "Whys"

    Science.gov (United States)

    Shapiro, Steven

    2012-01-01

    Database cancellation is one of the most difficult tasks performed by a librarian. This may seem counter-intuitive but, psychologically, it is certainly true. When a librarian or a team of librarians has invested a great deal of time doing research, talking to potential users, and conducting trials before deciding to subscribe to a database, they…

  14. Oracle database 12c the complete reference

    CERN Document Server

    Bryla, Bob

    2014-01-01

    Maintain a scalable, highly available enterprise platform and reduce complexity by leveraging the powerful new tools and cloud enhancements of Oracle Database 12c. This authoritative Oracle Press guide offers complete coverage of installation, configuration, tuning, and administration. Find out how to build and populate Oracle databases, perform effective queries, design applications, and secure your enterprise data

  15. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  16. Directory of IAEA databases

    International Nuclear Information System (INIS)

    1992-12-01

    This second edition of the Directory of IAEA Databases has been prepared within the Division of Scientific and Technical Information (NESI). Its main objective is to describe the computerized information sources available to staff members. This directory contains all databases produced at the IAEA, including databases stored on the mainframe, LANs and PCs. All IAEA Division Directors have been requested to register the existence of their databases with NESI. For the second edition, database owners were requested to review the existing entries for their databases and answer four additional questions. The four additional questions concerned the type of database (e.g. Bibliographic, Text, Statistical etc.), the category of database (e.g. Administrative, Nuclear Data etc.), the available documentation and the type of media used for distribution. In the individual entries on the following pages the answers to the first two questions (type and category) are always listed, but the answers to the second two questions (documentation and media) are only listed when information has been made available

  17. HIV Structural Database

    Science.gov (United States)

    SRD 102 HIV Structural Database (Web, free access)   The HIV Protease Structural Database is an archive of experimentally determined 3-D structures of Human Immunodeficiency Virus 1 (HIV-1), Human Immunodeficiency Virus 2 (HIV-2) and Simian Immunodeficiency Virus (SIV) Proteases and their complexes with inhibitors or products of substrate cleavage.

  18. Balkan Vegetation Database

    NARCIS (Netherlands)

    Vassilev, Kiril; Pedashenko, Hristo; Alexandrova, Alexandra; Tashev, Alexandar; Ganeva, Anna; Gavrilova, Anna; Gradevska, Asya; Assenov, Assen; Vitkova, Antonina; Grigorov, Borislav; Gussev, Chavdar; Filipova, Eva; Aneva, Ina; Knollová, Ilona; Nikolov, Ivaylo; Georgiev, Georgi; Gogushev, Georgi; Tinchev, Georgi; Pachedjieva, Kalina; Koev, Koycho; Lyubenova, Mariyana; Dimitrov, Marius; Apostolova-Stoyanova, Nadezhda; Velev, Nikolay; Zhelev, Petar; Glogov, Plamen; Natcheva, Rayna; Tzonev, Rossen; Boch, Steffen; Hennekens, Stephan M.; Georgiev, Stoyan; Stoyanov, Stoyan; Karakiev, Todor; Kalníková, Veronika; Shivarov, Veselin; Russakova, Veska; Vulchev, Vladimir

    2016-01-01

    The Balkan Vegetation Database (BVD; GIVD ID: EU-00-019; http://www.givd.info/ID/EU-00- 019) is a regional database that consists of phytosociological relevés from different vegetation types from six countries on the Balkan Peninsula (Albania, Bosnia and Herzegovina, Bulgaria, Kosovo, Montenegro

  19. World Database of Happiness

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    1995-01-01

    ABSTRACT The World Database of Happiness is an ongoing register of research on subjective appreciation of life. Its purpose is to make the wealth of scattered findings accessible, and to create a basis for further meta-analytic studies. The database involves four sections:
    1.

  20. Dictionary as Database.

    Science.gov (United States)

    Painter, Derrick

    1996-01-01

    Discussion of dictionaries as databases focuses on the digitizing of The Oxford English dictionary (OED) and the use of Standard Generalized Mark-Up Language (SGML). Topics include the creation of a consortium to digitize the OED, document structure, relational databases, text forms, sequence, and discourse. (LRW)

  1. Children's Culture Database (CCD)

    DEFF Research Database (Denmark)

    Wanting, Birgit

    a Dialogue inspired database with documentation, network (individual and institutional profiles) and current news, paper presented at the research seminar: Electronic access to fiction, Copenhagen, November 11-13, 1996

  2. Atomic Spectra Database (ASD)

    Science.gov (United States)

    SRD 78 NIST Atomic Spectra Database (ASD) (Web, free access)   This database provides access and search capability for NIST critically evaluated data on atomic energy levels, wavelengths, and transition probabilities that are reasonably up-to-date. The NIST Atomic Spectroscopy Data Center has carried out these critical compilations.

  3. Consumer Product Category Database

    Science.gov (United States)

    The Chemical and Product Categories database (CPCat) catalogs the use of over 40,000 chemicals and their presence in different consumer products. The chemical use information is compiled from multiple sources while product information is gathered from publicly available Material Safety Data Sheets (MSDS). EPA researchers are evaluating the possibility of expanding the database with additional product and use information.

  4. Trends in performance indicators of neuroimaging anatomy research publications: a bibliometric study of major neuroradiology journal output over four decades based on web of science database.

    Science.gov (United States)

    Wing, Louise; Massoud, Tarik F

    2015-01-01

    Quantitative, qualitative, and innovative application of bibliometric research performance indicators to anatomy and radiology research and education can enhance cross-fertilization between the two disciplines. We aim to use these indicators to identify long-term trends in dissemination of publications in neuroimaging anatomy (including both productivity and citation rates), which has subjectively waned in prestige during recent years. We examined publications over the last 40 years in two neuroradiological journals, AJNR and Neuroradiology, and selected and categorized all neuroimaging anatomy research articles according to theme and type. We studied trends in their citation activity over time, and mathematically analyzed these trends for 1977, 1987, and 1997 publications. We created a novel metric, "citation half-life at 10 years postpublication" (CHL-10), and used this to examine trends in the skew of citation numbers for anatomy articles each year. We identified 367 anatomy articles amongst a total of 18,110 in these journals: 74.2% were original articles, with study of normal anatomy being the commonest theme (46.7%). We recorded a mean of 18.03 citations for each anatomy article, 35% higher than for general neuroradiology articles. Graphs summarizing the rise (upslope) in citation rates after publication revealed similar trends spanning two decades. CHL-10 trends demonstrated that more recently published anatomy articles were likely to take longer to reach peak citation rate. Bibliometric analysis suggests that anatomical research in neuroradiology is not languishing. This novel analytical approach can be applied to other aspects of neuroimaging research, and within other subspecialties in radiology and anatomy, and also to foster anatomical education. © 2014 Wiley Periodicals, Inc.
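
    The abstract names the CHL-10 metric but does not spell out its formula here. As a loudly hedged sketch, the code below assumes one plausible reading: the number of years after publication needed to accumulate half of the citations received in the first 10 years. The citation counts are made up for illustration.

```python
# The abstract names CHL-10 but does not define it here; this sketch assumes
# one plausible reading: the number of years after publication needed to
# accumulate half of all citations received in the first 10 years.
# The citation counts below are made up for illustration.
def chl_10(citations_per_year):
    """citations_per_year: citations in years 1..10 after publication."""
    first_decade = citations_per_year[:10]
    half = sum(first_decade) / 2.0
    running = 0
    for year, count in enumerate(first_decade, start=1):
        running += count
        if running >= half:
            return year
    return None

print(chl_10([0, 1, 3, 5, 6, 4, 3, 2, 2, 1]))   # -> 5
```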

  5. YMDB: the Yeast Metabolome Database

    Science.gov (United States)

    Jewison, Timothy; Knox, Craig; Neveu, Vanessa; Djoumbou, Yannick; Guo, An Chi; Lee, Jacqueline; Liu, Philip; Mandal, Rupasri; Krishnamurthy, Ram; Sinelnikov, Igor; Wilson, Michael; Wishart, David S.

    2012-01-01

    The Yeast Metabolome Database (YMDB, http://www.ymdb.ca) is a richly annotated ‘metabolomic’ database containing detailed information about the metabolome of Saccharomyces cerevisiae. Modeled closely after the Human Metabolome Database, the YMDB contains >2000 metabolites with links to 995 different genes/proteins, including enzymes and transporters. The information in YMDB has been gathered from hundreds of books, journal articles and electronic databases. In addition to its comprehensive literature-derived data, the YMDB also contains an extensive collection of experimental intracellular and extracellular metabolite concentration data compiled from detailed Mass Spectrometry (MS) and Nuclear Magnetic Resonance (NMR) metabolomic analyses performed in our lab. This is further supplemented with thousands of NMR and MS spectra collected on pure, reference yeast metabolites. Each metabolite entry in the YMDB contains an average of 80 separate data fields including comprehensive compound description, names and synonyms, structural information, physico-chemical data, reference NMR and MS spectra, intracellular/extracellular concentrations, growth conditions and substrates, pathway information, enzyme data, gene/protein sequence data, as well as numerous hyperlinks to images, references and other public databases. Extensive searching, relational querying and data browsing tools are also provided that support text, chemical structure, spectral, molecular weight and gene/protein sequence queries. Because of S. cerevisiae's importance as a model organism for biologists and as a biofactory for industry, we believe this kind of database could have considerable appeal not only to metabolomics researchers, but also to yeast biologists, systems biologists, the industrial fermentation industry, as well as the beer, wine and spirit industry. PMID:22064855

  6. Toward An Unstructured Mesh Database

    Science.gov (United States)

    Rezaei Mahdiraji, Alireza; Baumann, Peter Peter

    2014-05-01

    -incidence relationships. We instrument the ImG model with sets of optional and application-specific constraints which can be used to check the validity of meshes for a specific class of objects such as manifolds, pseudo-manifolds, and simplicial manifolds. We conducted experiments to measure the performance of the graph database solution in processing mesh queries and compared it with the GrAL mesh library and the PostgreSQL database on synthetic and real mesh datasets. The experiments show that each system performs well on specific types of mesh queries, e.g., graph databases perform well on global path-intensive queries. In the future, we will investigate database operations for the ImG model and design a mesh query language.
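
    As a minimal illustration of the incidence-graph idea (not the ImG implementation), the sketch below stores a single triangle's vertices, edges, and face as graph nodes, with incidence recorded as graph edges, and answers a simple one-hop incidence query. All names are invented for the example.

```python
# Minimal sketch of a mesh stored as an incidence graph (vertices, edges,
# faces as nodes; incidence as graph edges). This only illustrates the general
# idea behind an incidence-graph model; it is not the ImG implementation.
from collections import defaultdict

incidence = defaultdict(set)   # node -> set of incident nodes

def connect(a, b):
    incidence[a].add(b)
    incidence[b].add(a)

# One triangle: vertices v1..v3, edges e1..e3, one face f1.
for edge, (u, v) in {"e1": ("v1", "v2"),
                     "e2": ("v2", "v3"),
                     "e3": ("v3", "v1")}.items():
    connect(edge, u)
    connect(edge, v)
    connect("f1", edge)

def incident(node, prefix):
    """Query: all nodes of a given kind incident to `node` (one hop away)."""
    return {n for n in incidence[node] if n.startswith(prefix)}

print(incident("f1", "e"))     # edges bounding the face
print(incident("v1", "e"))     # edges touching vertex v1
```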

  7. The LHCb configuration database

    CERN Document Server

    Abadie, L; Van Herwijnen, Eric; Jacobsson, R; Jost, B; Neufeld, N

    2005-01-01

    The aim of the LHCb configuration database is to store information about all the controllable devices of the detector. The experiment's control system (that uses PVSS) will configure, start up and monitor the detector from the information in the configuration database. The database will contain devices with their properties, connectivity and hierarchy. The ability to store and rapidly retrieve huge amounts of data, and the navigability between devices are important requirements. We have collected use cases to ensure the completeness of the design. Using the entity relationship modelling technique we describe the use cases as classes with attributes and links. We designed the schema for the tables using relational diagrams. This methodology has been applied to the TFC (switches) and DAQ system. Other parts of the detector will follow later. The database has been implemented using Oracle to benefit from central CERN database support. The project also foresees the creation of tools to populate, maintain, and co...
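
    A hedged sketch of the kind of schema described above, with devices, properties, hierarchy, and connectivity, is given below using SQLite; it is not the actual Oracle schema, and the device names are invented.

```python
# Hedged sketch (not the actual Oracle schema): devices with properties,
# parent/child hierarchy, and connectivity captured in three small tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE device (
    id        INTEGER PRIMARY KEY,
    name      TEXT UNIQUE NOT NULL,
    parent_id INTEGER REFERENCES device(id)   -- hierarchy
);
CREATE TABLE device_property (
    device_id INTEGER REFERENCES device(id),
    key       TEXT,
    value     TEXT
);
CREATE TABLE link (                            -- connectivity between devices
    from_id INTEGER REFERENCES device(id),
    to_id   INTEGER REFERENCES device(id)
);
""")
conn.executemany("INSERT INTO device (id, name, parent_id) VALUES (?, ?, ?)",
                 [(1, "tfc_switch_0", None), (2, "daq_node_0", 1)])
conn.execute("INSERT INTO link (from_id, to_id) VALUES (1, 2)")

# Navigate connectivity: which devices is tfc_switch_0 connected to?
rows = conn.execute("""
    SELECT d2.name FROM link
    JOIN device d1 ON d1.id = link.from_id
    JOIN device d2 ON d2.id = link.to_id
    WHERE d1.name = ?""", ("tfc_switch_0",)).fetchall()
print(rows)
```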

  8. Mycobacteriophage genome database.

    Science.gov (United States)

    Joseph, Jerrine; Rajendran, Vasanthi; Hassan, Sameer; Kumar, Vanaja

    2011-01-01

    Mycobacteriophage genome database (MGDB) is an exclusive repository of the 64 completely sequenced mycobacteriophages with annotated information. It is a comprehensive compilation of the various gene parameters captured from several databases pooled together to empower mycobacteriophage researchers. The MGDB (Version No.1.0) comprises 6086 genes from 64 mycobacteriophages classified into 72 families based on the ACLAME database. Manual curation was aided by information available from public databases, which was enriched further by analysis. Its web interface allows browsing as well as querying the classification. The main objective is to collect and organize the complexity inherent to mycobacteriophage protein classification in a rational way. The other objective is to browse the existing and new genomes and describe their functional annotation. The database is available for free at http://mpgdb.ibioinformatics.org/mpgdb.php.

  9. Database Description - SKIP Stemcell Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us SKIP Stemcell Database Database Description General information of database Database name SKIP Stemcell Database...rsity Journal Search: Contact address http://www.skip.med.keio.ac.jp/en/contact/ Database classification Human Genes and Diseases Dat...abase classification Stemcell Article Organism Taxonomy Name: Homo sapiens Taxonomy ID: 9606 Database...ks: Original website information Database maintenance site Center for Medical Genetics, School of medicine, ...lable Web services Not available URL of Web services - Need for user registration Not available About This Database Database

  10. ZAGRADA - A New Radiocarbon Database

    International Nuclear Information System (INIS)

    Portner, A.; Obelic, B.; Krajcar Bornic, I.

    2008-01-01

    In the Radiocarbon and Tritium Laboratory at the Rudjer Boskovic Institute, three different techniques for 14C dating have been used: Gas Proportional Counting (GPC), Liquid Scintillation Counting (LSC), and preparation of milligram-sized samples for AMS dating (Accelerator Mass Spectrometry). The use of several measurement techniques created the need for a new relational database, ZAGRADA (Zagreb Radiocarbon Database), since the existing software package CARBO could not satisfy the requirements for parallel use of several techniques. Using SQL procedures and constraints defined by primary and foreign keys, ZAGRADA enforces high data integrity and provides better performance in data filtering and sorting. Additionally, the new database for 14C samples is a multi-user application that can be accessed from remote computers in the work group, thus improving the efficiency of laboratory activities. To facilitate data handling and processing in ZAGRADA, the graphical user interface is designed to be user-friendly and to perform various actions on the data, such as input, corrections, searching, sorting, and output to a printer. All invalid actions performed in the user interface are registered with a short textual description of the error, which appears on screen in a message box. Unauthorized access is prevented by login control, and each application window tracks the last changes made by the user. The implementation of a new database for 14C samples makes a significant contribution to the scientific research performed in the Radiocarbon and Tritium Laboratory and will provide better and easier communication with customers. (author)
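
    The integrity enforcement described above can be sketched with SQLite primary and foreign keys, as below. The table and column names are invented for illustration and are not ZAGRADA's actual schema.

```python
# Sketch of integrity enforcement with primary and foreign keys in SQLite.
# Table and column names are invented and are not ZAGRADA's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")       # enforce FK constraints
conn.executescript("""
CREATE TABLE technique (
    code TEXT PRIMARY KEY                      -- 'GPC', 'LSC', 'AMS'
);
CREATE TABLE sample (
    lab_number     TEXT PRIMARY KEY,
    technique_code TEXT NOT NULL REFERENCES technique(code),
    c14_age_bp     REAL
);
""")
conn.executemany("INSERT INTO technique (code) VALUES (?)",
                 [("GPC",), ("LSC",), ("AMS",)])
conn.execute("INSERT INTO sample VALUES (?, ?, ?)", ("Z-1001", "LSC", 2450.0))

# An invalid action is rejected by the database itself:
try:
    conn.execute("INSERT INTO sample VALUES (?, ?, ?)", ("Z-1002", "XYZ", 100.0))
except sqlite3.IntegrityError as err:
    print("rejected:", err)                    # FOREIGN KEY constraint failed
```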

  11. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  12. Database for propagation models

    Science.gov (United States)

    Kantak, Anil V.

    1991-07-01

    A propagation researcher or a systems engineer who intends to use the results of a propagation experiment is generally faced with various database tasks such as the selection of the computer software, the hardware, and the writing of the programs to pass the data through the models of interest. This task is repeated every time a new experiment is conducted or the same experiment is carried out at a different location generating different data. Thus the users of this data have to spend a considerable portion of their time learning how to implement the computer hardware and the software towards the desired end. This situation may be facilitated considerably if an easily accessible propagation database is created that has all the accepted (standardized) propagation phenomena models approved by the propagation research community. Also, the handling of data will become easier for the user. Such a database construction can only stimulate the growth of propagation research if it is available to all the researchers, so that the results of the experiment conducted by one researcher can be examined independently by another, without different hardware and software being used. The database may be made flexible so that the researchers need not be confined only to the contents of the database. Another way in which the database may help the researchers is by the fact that they will not have to document the software and hardware tools used in their research since the propagation research community will know the database already. The following sections show a possible database construction, as well as properties of the database for the propagation research.

  13. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  14. Database automation of accelerator operation

    International Nuclear Information System (INIS)

    Casstevens, B.J.; Ludemann, C.A.

    1982-01-01

    The Oak Ridge Isochronous Cyclotron (ORIC) is a variable energy, multiparticle accelerator that produces beams of energetic heavy ions which are used as probes to study the structure of the atomic nucleus. To accelerate and transmit a particular ion at a specified energy to an experimenter's apparatus, the electrical currents in up to 82 magnetic field producing coils must be established to accuracies of from 0.1 to 0.001 percent. Mechanical elements must also be positioned by means of motors or pneumatic drives. A mathematical model of this complex system provides a good approximation of operating parameters required to produce an ion beam. However, manual tuning of the system must be performed to optimize the beam quality. The database system was implemented as an on-line query and retrieval system running at a priority lower than the cyclotron real-time software. It was designed for matching beams recorded in the database with beams specified for experiments. The database is relational and permits searching on ranges of any subset of the eleven beam categorizing attributes. A beam file selected from the database is transmitted to the cyclotron general control software which handles the automatic slewing of power supply currents and motor positions to the file values, thereby replicating the desired parameters
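
    The ability to search on ranges of any subset of attributes can be sketched by assembling a parameterized WHERE clause from only the ranges supplied, as below. The attribute and table names are invented; this is not the ORIC control software.

```python
# Sketch of range searching on any subset of beam attributes by building a
# parameterized WHERE clause from only the ranges supplied. Attribute names
# are invented for illustration; this is not the ORIC software.
def build_beam_query(ranges):
    """ranges: {attribute: (low, high)} for any subset of attributes.
    Attribute names should be validated against a whitelist in real code."""
    clauses, params = [], []
    for attr, (low, high) in ranges.items():
        clauses.append(f"{attr} BETWEEN ? AND ?")
        params.extend([low, high])
    where = " AND ".join(clauses) if clauses else "1=1"
    return f"SELECT * FROM beam WHERE {where}", params

sql, params = build_beam_query({"energy_mev": (50, 60), "charge_state": (6, 8)})
print(sql)     # SELECT * FROM beam WHERE energy_mev BETWEEN ? AND ? AND ...
print(params)  # [50, 60, 6, 8]
```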

  15. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

    The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as provide a system that would simplify the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  16. LandIT Database

    DEFF Research Database (Denmark)

    Iftikhar, Nadeem; Pedersen, Torben Bach

    2010-01-01

    and reporting purposes. This paper presents the LandIT database, which is the result of the LandIT project, an industrial collaboration project that developed technologies for communication and data integration between farming devices and systems. The LandIT database in principle is based...... on the ISOBUS standard; however, the standard is extended with additional requirements, such as gradual data aggregation and flexible exchange of farming data. This paper describes the conceptual and logical schemas of the proposed database based on a real-life farming case study....
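
    Gradual data aggregation, one of the extensions mentioned above, can be sketched as rolling raw device readings up to hourly and daily averages so that older data is kept at coarser resolution. The field names and readings below are invented; this is not the LandIT schema.

```python
# Sketch of "gradual data aggregation": raw device readings are rolled up to
# hourly averages, and hourly averages to daily ones, so old data is kept at
# progressively coarser resolution. Field names are illustrative, not LandIT's.
from collections import defaultdict
from datetime import datetime

raw = [  # (device_id, timestamp, value) -- made-up readings
    ("tractor-1", datetime(2010, 6, 1, 10, 5), 4.2),
    ("tractor-1", datetime(2010, 6, 1, 10, 35), 4.8),
    ("tractor-1", datetime(2010, 6, 1, 11, 10), 5.1),
]

def roll_up(rows, key_fn):
    """Average values grouped by key_fn(device_id, timestamp)."""
    groups = defaultdict(list)
    for device, ts, value in rows:
        groups[key_fn(device, ts)].append(value)
    return {k: sum(v) / len(v) for k, v in groups.items()}

hourly = roll_up(raw, lambda d, ts: (d, ts.replace(minute=0, second=0)))
daily = roll_up(raw, lambda d, ts: (d, ts.date()))
print(hourly)
print(daily)
```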

  17. JICST Factual Database(2)

    Science.gov (United States)

    Araki, Keisuke

    A computer program that builds atom-bond connection tables from nomenclatures has been developed. Chemical substances are input together with their nomenclature and a variety of trivial names or experimental code numbers. The chemical structures in the database are stored stereospecifically and can be searched and displayed according to stereochemistry. Source data come from the laws and regulations of Japan, the US RTECS, and so on. The database plays a central role within the integrated fact database service of JICST and makes interrelational retrieval possible.

  18. Dansk Hjerteregister--en klinisk database

    DEFF Research Database (Denmark)

    Abildstrøm, Steen Zabell; Kruse, Marie; Rasmussen, Søren

    2008-01-01

    INTRODUCTION: The Danish Heart Registry (DHR) keeps track of all coronary angiographies (CATH), percutaneous coronary interventions (PCI), coronary artery bypass grafting (CABG), and adult heart valve surgery performed in Denmark. DHR is a clinical database established in order to follow the acti...

  19. Livestock Anaerobic Digester Database

    Science.gov (United States)

    The Anaerobic Digester Database provides basic information about anaerobic digesters on livestock farms in the United States, organized in Excel spreadsheets. It includes projects that are under construction, operating, or shut down.

  20. Toxicity Reference Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Toxicity Reference Database (ToxRefDB) contains approximately 30 years and $2 billion worth of animal studies. ToxRefDB allows scientists and the interested...

  1. Dissolution Methods Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — For a drug product that does not have a dissolution test method in the United States Pharmacopeia (USP), the FDA Dissolution Methods Database provides information on...

  2. OTI Activity Database

    Data.gov (United States)

    US Agency for International Development — OTI's worldwide activity database is a simple and effective information system that serves as a program management, tracking, and reporting tool. In each country,...

  3. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M. [Calm (James M.), Great Falls, VA (United States)

    1994-05-27

    The Refrigerant Database consolidates and facilitates access to information to assist industry in developing equipment using alternative refrigerants. The underlying purpose is to accelerate phase out of chemical compounds of environmental concern.

  4. Marine Jurisdictions Database

    National Research Council Canada - National Science Library

    Goldsmith, Roger

    1998-01-01

    The purpose of this project was to take the data gathered for the Maritime Claims chart and create a Maritime Jurisdictions digital database suitable for use with oceanographic mission planning objectives...

  5. Medicaid CHIP ESPC Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Environmental Scanning and Program Characteristic (ESPC) Database is in a Microsoft (MS) Access format and contains Medicaid and CHIP data, for the 50 states and...

  6. Records Management Database

    Data.gov (United States)

    US Agency for International Development — The Records Management Database is tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  7. Reach Address Database (RAD)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Reach Address Database (RAD) stores the reach address of each Water Program feature that has been linked to the underlying surface water features (streams,...

  8. Household Products Database: Pesticides

    Science.gov (United States)

  9. Mouse Phenome Database (MPD)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Mouse Phenome Database (MPD) has characterizations of hundreds of strains of laboratory mice to facilitate translational discoveries and to assist in selection...

  10. Consumer Product Category Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Chemical and Product Categories database (CPCat) catalogs the use of over 40,000 chemicals and their presence in different consumer products. The chemical use...

  11. Drycleaner Database - Region 7

    Data.gov (United States)

    U.S. Environmental Protection Agency — THIS DATA ASSET NO LONGER ACTIVE: This is metadata documentation for the Region 7 Drycleaner Database (R7DryClnDB) which tracks all Region7 drycleaners who notify...

  12. National Assessment Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — The National Assessment Database stores and tracks state water quality assessment decisions, Total Maximum Daily Loads (TMDLs) and other watershed plans designed to...

  13. IVR RSA Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database contains trip-level reports submitted by vessels participating in Research Set-Aside projects with IVR reporting requirements.

  14. Rat Genome Database (RGD)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Rat Genome Database (RGD) is a collaborative effort between leading research institutions involved in rat genetic and genomic research to collect, consolidate,...

  15. The CAPEC Database

    DEFF Research Database (Denmark)

    Nielsen, Thomas Lund; Abildskov, Jens; Harper, Peter Mathias

    2001-01-01

    The Computer-Aided Process Engineering Center (CAPEC) database of measured data was established with the aim to promote greater data exchange in the chemical engineering community. The target properties are pure component properties, mixture properties, and special drug solubility data. The database divides pure component properties into primary, secondary, and functional properties. Mixture properties are categorized in terms of the number of components in the mixture and the number of phases present. The compounds in the database have been classified on the basis of the functional groups in the compound. This classification makes the CAPEC database a very useful tool, for example, in the development of new property models, since properties of chemically similar compounds are easily obtained. A program with efficient search and retrieval functions of properties has been developed.

  16. The Danish Urogynaecological Database

    DEFF Research Database (Denmark)

    Guldberg, Rikke; Brostrøm, Søren; Hansen, Jesper Kjær

    2013-01-01

    INTRODUCTION AND HYPOTHESIS: The Danish Urogynaecological Database (DugaBase) is a nationwide clinical database established in 2006 to monitor, ensure and improve the quality of urogynaecological surgery. We aimed to describe its establishment and completeness and to validate selected variables. This is the first study based on data from the DugaBase. METHODS: The database completeness was calculated as a comparison between urogynaecological procedures reported to the Danish National Patient Registry and to the DugaBase. Validity was assessed for selected variables from a random sample of 200 women in the DugaBase from 1 January 2009 to 31 October 2010, using medical records as a reference. RESULTS: A total of 16,509 urogynaecological procedures were registered in the DugaBase by 31 December 2010. The database completeness has increased by calendar time, from 38.2 % in 2007 to 93.2 % in 2010 for public...
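
    The completeness calculation described under METHODS can be sketched as the share of nationally reported procedures that are also found in the clinical database. The denominators below are invented so that the output reproduces the percentages quoted above; they are not the real counts.

```python
# Sketch of the completeness calculation described in METHODS: procedures
# registered in the clinical database divided by procedures reported to the
# national patient registry. The counts below are invented for illustration.
def completeness(in_clinical_db, in_national_registry):
    """Share of nationally reported procedures also found in the database."""
    return 100.0 * in_clinical_db / in_national_registry

for year, db_count, registry_count in [(2007, 382, 1000), (2010, 932, 1000)]:
    print(year, f"{completeness(db_count, registry_count):.1f} %")
```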

  17. Danish Pancreatic Cancer Database

    DEFF Research Database (Denmark)

    Fristrup, Claus; Detlefsen, Sönke; Palnæs Hansen, Carsten

    2016-01-01

    AIM OF DATABASE: The Danish Pancreatic Cancer Database aims to prospectively register the epidemiology, diagnostic workup, diagnosis, treatment, and outcome of patients with pancreatic cancer in Denmark at an institutional and national level. STUDY POPULATION: Since May 1, 2011, all patients with microscopically verified ductal adenocarcinoma of the pancreas have been registered in the database. As of June 30, 2014, the total number of patients registered was 2,217. All data are cross-referenced with the Danish Pathology Registry and the Danish Patient Registry to ensure the completeness of registrations. Death is monitored using data from the Danish Civil Registry. This registry monitors the survival status of the Danish population, and the registration is virtually complete. All data in the database are audited by all participating institutions, with respect to baseline characteristics, key indicators...

  18. Food Habits Database (FHDBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NEFSC Food Habits Database has two major sources of data. The first, and most extensive, is the standard NEFSC Bottom Trawl Surveys Program. During these...

  19. Functionally Graded Materials Database

    Science.gov (United States)

    Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki

    2008-02-01

    The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to the public via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database included 1,703 research information entries, together with data on 2,429 researchers, 509 institutions, and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and 2005, respectively. The English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. This paper explains the FGMs Database, describing the research information data, the sitemap, and how to use it. User access results and users' interests, derived from an access analysis, are also discussed.

  20. Tethys Acoustic Metadata Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Tethys database houses the metadata associated with the acoustic data collection efforts by the Passive Acoustic Group. These metadata include dates, locations...

  1. NLCD 2011 database

    Data.gov (United States)

    U.S. Environmental Protection Agency — National Land Cover Database 2011 (NLCD 2011) is the most recent national land cover product created by the Multi-Resolution Land Characteristics (MRLC) Consortium....

  2. Medicare Coverage Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Coverage Database (MCD) contains all National Coverage Determinations (NCDs) and Local Coverage Determinations (LCDs), local articles, and proposed NCD...

  3. Household Products Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — This database links over 4,000 consumer brands to health effects from Material Safety Data Sheets (MSDS) provided by the manufacturers and allows scientists and...

  4. Global Volcano Locations Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC maintains a database of over 1,500 volcano locations obtained from the Smithsonian Institution Global Volcanism Program, Volcanoes of the World publication. The...

  5. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  6. Uranium Location Database

    Data.gov (United States)

    U.S. Environmental Protection Agency — A GIS compiled locational database in Microsoft Access of ~15,000 mines with uranium occurrence or production, primarily in the western United States. The metadata...

  7. INIST: databases reorientation

    International Nuclear Information System (INIS)

    Bidet, J.C.

    1995-01-01

    INIST is a CNRS (Centre National de la Recherche Scientifique) laboratory devoted to the processing of scientific and technical information and to the management of this information compiled in a database. A reorientation of the database content was proposed in 1994 to increase the transfer of research towards enterprises and services, to develop more automated access to the information, and to create a quality assurance plan. The catalog of publications comprises 5800 periodical titles (1300 for fundamental research and 4500 for applied research). A multi-thematic science and technology database will be created in 1995 for the retrieval of applied and technical information. ''Grey literature'' (reports, theses, proceedings...) and human and social sciences data will be added to the base through information selected from the existing GRISELI and Francis databases. Strong modifications are also planned in the thematic coverage of Earth sciences and will considerably reduce the geological information content. (J.S.). 1 tab

  8. Fine Arts Database (FAD)

    Data.gov (United States)

    General Services Administration — The Fine Arts Database records information on federally owned art in the control of the GSA; this includes the location, current condition and information on artists.

  9. Kansas Cartographic Database (KCD)

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas Cartographic Database (KCD) is an exact digital representation of selected features from the USGS 7.5 minute topographic map series. Features that are...

  10. Developments in diffraction databases

    International Nuclear Information System (INIS)

    Jenkins, R.

    1999-01-01

    Full text: There are a number of databases available to the diffraction community. Two of the more important of these are the Powder Diffraction File (PDF) maintained by the International Centre for Diffraction Data (ICDD), and the Inorganic Crystal Structure Database (ICSD) maintained by Fachinformationszentrum (FIZ, Karlsruhe). In application, the PDF has been used as an indispensable tool in phase identification and identification of unknowns. The ICSD database has extensive and explicit reference to the structures of compounds: atomic coordinates, space group and even thermal vibration parameters. A similar database, but for organic compounds, is maintained by the Cambridge Crystallographic Data Centre. These databases are often used as independent sources of information. However, little thought has been given to how to exploit the combined properties of structural database tools. A recently completed agreement between ICDD and FIZ, plus ICDD and Cambridge, provides a first step in complementary use of the PDF and the ICSD databases. The focus of this paper (as indicated below) is to examine ways of exploiting the combined properties of both databases. In 1996, there were approximately 76,000 entries in the PDF and approximately 43,000 entries in the ICSD database. The ICSD database has now been used to calculate entries in the PDF. Thus, to derive d-spacing and peak intensity data requires the synthesis of full diffraction patterns, i.e., we use the structural data in the ICSD database and then add instrumental resolution information. The combined data from PDF and ICSD can be effectively used in many ways. For example, we can calculate PDF data for an ideally random crystal distribution and also in the absence of preferred orientation. Again, we can use systematic studies of intermediate members in solid solution series to help produce reliable quantitative phase analyses. In some cases, we can study how solid solution properties vary with composition and
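
    The geometric part of calculating a powder pattern from structural data can be sketched as computing d-spacings and 2-theta positions for a cubic cell via Bragg's law, as below. Intensities, systematic absences, and instrumental resolution are omitted, and the lattice parameter is an approximate illustrative value.

```python
# Minimal sketch of the geometric part of calculating a powder pattern from
# structural data: d-spacings and 2-theta positions for a cubic cell via
# Bragg's law. Intensities, systematic absences from the space group, and
# instrumental resolution are all omitted; the lattice parameter is only an
# approximate illustrative value.
import math

wavelength = 1.5406          # Cu K-alpha, in angstroms
a = 4.0493                   # cubic lattice parameter, approx. aluminium

def cubic_reflections(a, wavelength, max_index=3):
    lines = []
    for h in range(max_index + 1):
        for k in range(max_index + 1):
            for l in range(max_index + 1):
                if (h, k, l) == (0, 0, 0):
                    continue
                d = a / math.sqrt(h * h + k * k + l * l)
                s = wavelength / (2 * d)        # sin(theta) from Bragg's law
                if s <= 1:
                    two_theta = 2 * math.degrees(math.asin(s))
                    lines.append((round(two_theta, 2), (h, k, l), round(d, 4)))
    return sorted(set(lines))

for two_theta, hkl, d in cubic_reflections(a, wavelength)[:5]:
    print(two_theta, hkl, d)
```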

  11. Database on Wind Characteristics

    DEFF Research Database (Denmark)

    Højstrup, J.; Ejsing Jørgensen, Hans; Lundtang Petersen, Erik

    1999-01-01

    This report describes the work and results of the project Database on Wind Characteristics, which was sponsored partly by the European Commission within the framework of the JOULE III programme under contract JOR3-CT95-0061.

  12. ORACLE DATABASE SECURITY

    OpenAIRE

    Cristina-Maria Titrade

    2011-01-01

    This paper presents some security issues, namely database system-level security, data-level security, user-level security, user management, resource management and password management. Security is a constant concern in database design and development. Usually the question is not whether security should exist, but rather how extensive it should be. A typical DBMS has several levels of security, in addition to those offered by the operating system or network. Typically, a DBMS has user a...

  13. Update History of This Database - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Update history of the Trypanosomes Database: 2014/05/07 - the contact information was corrected; the features and manner of utilization of the database were corrected. 2014/02/04 - the Trypanosomes Database English archive site was opened. 2011/04/04 - the Trypanosomes Database ( http://www.tanpaku.org/tdb/ ) was opened.

  14. The Danish Cardiac Rehabilitation Database.

    Science.gov (United States)

    Zwisler, Ann-Dorthe; Rossau, Henriette Knold; Nakano, Anne; Foghmar, Sussie; Eichhorst, Regina; Prescott, Eva; Cerqueira, Charlotte; Soja, Anne Merete Boas; Gislason, Gunnar H; Larsen, Mogens Lytken; Andersen, Ulla Overgaard; Gustafsson, Ida; Thomsen, Kristian K; Boye Hansen, Lene; Hammer, Signe; Viggers, Lone; Christensen, Bo; Kvist, Birgitte; Lindström Egholm, Cecilie; May, Ole

    2016-01-01

    The Danish Cardiac Rehabilitation Database (DHRD) aims to improve the quality of cardiac rehabilitation (CR) to the benefit of patients with coronary heart disease (CHD). The study population comprises hospitalized patients with CHD with stenosis on coronary angiography treated with percutaneous coronary intervention, coronary artery bypass grafting, or medication alone. Reporting is mandatory for all hospitals in Denmark delivering CR. The database was initially implemented in 2013 and was fully running from August 14, 2015, thus comprising data at a patient level from the latter date onward. Patient-level data are registered by clinicians at the time of entry to CR directly into an online system with simultaneous linkage to other central patient registers. Follow-up data are entered after 6 months. The main variables collected are related to key outcome and performance indicators of CR: referral and adherence, lifestyle, patient-related outcome measures, risk factor control, and medication. Program-level online data are collected every third year. Based on administrative data, approximately 14,000 patients with CHD are hospitalized at 35 hospitals annually, with 75% receiving one or more outpatient rehabilitation services by 2015. The database has not yet been running for a full year, which explains the use of approximations. The DHRD is an online, national quality-improvement database on CR aimed at patients with CHD. Registration of data at both the patient level and the program level is mandatory. The DHRD aims to systematically monitor the quality of CR over time, in order to improve the quality of CR throughout Denmark to the benefit of patients.

  15. OCA Oracle Database 11g database administration I : a real-world certification guide

    CERN Document Server

    Ries, Steve

    2013-01-01

    Developed as a practical book, "Oracle Database 11g Administration I Certification Guide" will show you all you need to know to effectively excel at being an Oracle DBA, for both examinations and the real world. This book is for anyone who needs the essential skills to become an Oracle DBA, pass the Oracle Database Administration I exam, and use those skills in the real world to manage secure, high-performance, and highly available Oracle databases.

  16. Database Description - TMFunction | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available ...residue (or mutant) in a protein. The experimental data are collected from the literature, both by searching the... sequence database (UniProt), the structural database (PDB), and the literature database...

  17. Database Description - RPSD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: RPSD (Rice Protein Structure Database). DOI: 10.18908/lsdba.nbdc00749-000. Creator: Toshimasa Yamazaki, National Institute of Agrobiological Sciences, Ibaraki 305-8602, Japan. Database classification: Structure Databases - Protein structure. Organism taxonomy name: Or... Database maintenance site: National Institu...

  18. Database Description - DGBY | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: DGBY. Contact: TEL +81-29-838-8066. Database classification: Microarray Data and other Gene Expression Databases. Organism: Saccharomyces cerevisiae (Taxonomy ID: 4932). Database description: ...(so-called phenomics). We uploaded these data on this website, which is designated DGBY (Database for Gene expres...). Reference: ...ma J, Ando A, Takagi H. Journal: Yeast. 2008 Mar;25(3):179-90.

  19. Solid Waste Projection Model: Database User's Guide

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established

  20. CMS experience with online and offline Databases

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The CMS experiment is made of many detectors which in total sum up to more than 75 million channels. The online database stores the configuration data used to configure the various parts of the detector and bring it into all possible running states. The database also stores the conditions data: detector monitoring parameters of all channels (temperatures, voltages), detector quality information, beam conditions, etc. These quantities are used by the experts to monitor the detector performance in detail; because they occupy a very large space in the online database, they cannot be used as-is for offline data reconstruction. For this, a "condensed" set of the full information, the "conditions data", is created and copied to a separate database used in the offline reconstruction. The offline conditions database contains the alignment and calibration data for the various detectors. Conditions data sets are accessed by a tag and an interval of validity through the offline reconstruction program CMSSW, written in C++. Pe...
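
    The tag-plus-interval-of-validity access pattern mentioned above can be illustrated with a small, hypothetical Python sketch; the class, tag and payload names are invented for illustration and this is not CMSSW code.

        # Toy conditions store keyed by tag and interval of validity (IOV).
        # Names and payloads are invented; this is not CMSSW code.
        import bisect

        class ConditionsStore:
            def __init__(self):
                self._starts = {}    # tag -> sorted list of first-valid run numbers
                self._payloads = {}  # tag -> payloads aligned with _starts

            def add(self, tag, first_valid_run, payload):
                starts = self._starts.setdefault(tag, [])
                payloads = self._payloads.setdefault(tag, [])
                pos = bisect.bisect_left(starts, first_valid_run)
                starts.insert(pos, first_valid_run)
                payloads.insert(pos, payload)

            def get(self, tag, run):
                """Return the payload whose IOV covers the given run number."""
                starts = self._starts[tag]
                idx = bisect.bisect_right(starts, run) - 1
                if idx < 0:
                    raise KeyError(f"no IOV for tag {tag!r} covering run {run}")
                return self._payloads[tag][idx]

        store = ConditionsStore()
        store.add("ecal_calib_v1", first_valid_run=1, payload={"gain": 1.00})
        store.add("ecal_calib_v1", first_valid_run=500, payload={"gain": 1.02})
        print(store.get("ecal_calib_v1", run=750))  # -> {'gain': 1.02}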

  1. Evolution of the Configuration Database Design

    International Nuclear Information System (INIS)

    Salnikov, A.

    2006-01-01

    The BABAR experiment at SLAC has been successfully collecting physics data since 1999. One of the major parts of its on-line system is the configuration database, which provides other parts of the system with the configuration data necessary for data taking. Originally the configuration database was implemented in the Objectivity/DB ODBMS. Recently BABAR performed a successful migration of its event store from Objectivity/DB to ROOT, and this prompted a complete phase-out of Objectivity/DB in all other BABAR databases. It required a complete redesign of the configuration database to hide any implementation details and to support multiple storage technologies. In this paper we describe the process of the migration of the configuration database, its new design, implementation strategy and details

  2. The CUTLASS database facilities

    International Nuclear Information System (INIS)

    Jervis, P.; Rutter, P.

    1988-09-01

    The enhancement of the CUTLASS database management system to provide improved facilities for data handling is seen as a prerequisite to its effective use for future power station data processing and control applications. This particularly applies to the larger projects such as AGR data processing system refurbishments, and the data processing systems required for the new Coal Fired Reference Design stations. In anticipation of the need for improved data handling facilities in CUTLASS, the CEGB established a User Sub-Group in the early 1980s to define the database facilities required by users. Following the endorsement of the resulting specification and a detailed design study, the database facilities have been implemented as an integral part of the CUTLASS system. This paper provides an introduction to the range of CUTLASS database facilities, and emphasises the role of the database as the central facility around which future Kit 1 and (particularly) Kit 6 CUTLASS-based data processing and control systems will be designed and implemented. (author)

  3. ADANS database specification

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-16

    The purpose of the Air Mobility Command (AMC) Deployment Analysis System (ADANS) Database Specification (DS) is to describe the database organization and storage allocation and to provide the detailed data model of the physical design and information necessary for the construction of the parts of the database (e.g., tables, indexes, rules, defaults). The DS includes entity relationship diagrams, table and field definitions, reports on other database objects, and a description of the ADANS data dictionary. ADANS is the automated system used by Headquarters AMC and the Tanker Airlift Control Center (TACC) for airlift planning and scheduling of peacetime and contingency operations as well as for deliberate planning. ADANS also supports planning and scheduling of Air Refueling Events by the TACC and the unit-level tanker schedulers. ADANS receives input in the form of movement requirements and air refueling requests. It provides a suite of tools for planners to manipulate these requirements/requests against mobility assets and to develop, analyze, and distribute schedules. Analysis tools are provided for assessing the products of the scheduling subsystems, and editing capabilities support the refinement of schedules. A reporting capability provides formatted screen, print, and/or file outputs of various standard reports. An interface subsystem handles message traffic to and from external systems. The database is an integral part of the functionality summarized above.
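
    As an illustration of the kinds of physical-design objects such a database specification describes (tables, indexes, defaults), here is a generic sketch using Python's built-in sqlite3 module; the table and column names are invented and do not come from the ADANS DS.

        # Generic sketch of physical database objects (table, index, default);
        # the schema below is invented for illustration and is not the ADANS schema.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE movement_requirement (
                req_id        INTEGER PRIMARY KEY,
                origin        TEXT NOT NULL,
                destination   TEXT NOT NULL,
                cargo_tons    REAL DEFAULT 0.0,                 -- column default
                created_at    TEXT DEFAULT CURRENT_TIMESTAMP
            );
            CREATE INDEX idx_req_origin_dest
                ON movement_requirement (origin, destination);  -- index for lookups
        """)
        conn.execute(
            "INSERT INTO movement_requirement (req_id, origin, destination) VALUES (?, ?, ?)",
            (1, "ORIGIN_A", "DEST_B"),
        )
        for row in conn.execute("SELECT * FROM movement_requirement"):
            print(row)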

  4. Database Description - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: Trypanosomes Database. Maintainer: ...stitute of Genetics, Research Organization of Information and Systems, Yata 1111, Mishima, Shizuoka 411-8540, Japan. Organisms: Trypanosoma (Taxonomy ID: 5690); Homo sapiens (Taxonomy ID: 9606). Related resources: PDB (Protein Data Bank), KEGG PATHWAY Database, DrugPort. Entry list: available. Query search: available. Web servic...

  5. Testing the performance of NoSQL databases via the Database Benchmark tool / Тестирование возможностей базы данных NoSQL с помощью Database Benchmark инструмента / Testiranje performansi NoSQL baza podataka pomoću database benchmark alata

    Directory of Open Access Journals (Sweden)

    Lazar J. Krstić

    2018-07-01

    Full Text Available NoSQL is often used as a successful alternative to relational databases, especially when it is necessary to provide adequate system dimensioning, usage of a variety of data types, and high efficiency at a low cost for maintaining consistency. The work is conceived in a manner that covers the general concept of a database, i.e. the concepts of relational and non-relational databases, which are substantiated by all important aspects and in an appropriate context. After analyzing the types of NoSQL databases, emphasis is placed on explaining their advantages and disadvantages, as well as on an overview of the NoSQL and SQL database comparison. The final part of the paper presents the results of testing the performance of NoSQL databases, obtained through the Database Benchmark tool. The aim of the paper is to highlight all the details of NoSQL databases in order to establish the justification of their application in practice.

  6. Open Geoscience Database

    Science.gov (United States)

    Bashev, A.

    2012-04-01

    Currently there is an enormous number of geoscience databases. Unfortunately, the only users of the majority of these databases are their elaborators. There are several reasons for that: incompatibility, specificity of tasks and objects, and so on. However, the main obstacles to wide usage of geoscience databases are complexity for elaborators and complication for users. The complexity of architecture leads to high costs that block public access. The complication prevents users from understanding when and how to use the database. Only databases associated with GoogleMaps lack these drawbacks, but they can hardly be called "geoscience" databases. Nevertheless, an open and simple geoscience database is necessary, at least for educational purposes (see our abstract for ESSI20/EOS12). We developed a database and a web interface to work with it, and it is now accessible at maps.sch192.ru. In this database a result is a value of a parameter (no matter which) at a station with a certain position, associated with metadata: the date when the result was obtained, the type of station (lake, soil, etc.), and the contributor that sent the result. Each contributor has their own profile, which allows the reliability of the data to be estimated. The results can be represented on a GoogleMaps space image as a point at a certain position, coloured according to the value of the parameter. There are default colour scales, and each registered user can create their own scale. The results can also be extracted to a *.csv file. For both types of representation one can select the data by date, object type, parameter type, area and contributor. The data are uploaded in *.csv format: Name of the station; Latitude (dd.dddddd); Longitude (ddd.dddddd); Station type; Parameter type; Parameter value; Date (yyyy-mm-dd). The contributor is recognised at data entry. This is the minimal set of features required to connect a value of a parameter with a position and see the results. All the complicated data
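
    A minimal sketch of how one record in the semicolon-separated upload format described above might be parsed; the field names follow the listing in the abstract, and the validation is illustrative rather than the project's actual code.

        # Parse one upload line in the semicolon-separated format described above.
        # The checks are illustrative only; this is not the project's actual code.
        import csv
        from datetime import date
        from io import StringIO

        FIELDS = ["station_name", "latitude", "longitude", "station_type",
                  "parameter_type", "parameter_value", "date"]

        def parse_row(line):
            row = next(csv.reader(StringIO(line), delimiter=";"))
            record = dict(zip(FIELDS, (v.strip() for v in row)))
            record["latitude"] = float(record["latitude"])        # dd.dddddd
            record["longitude"] = float(record["longitude"])      # ddd.dddddd
            record["parameter_value"] = float(record["parameter_value"])
            record["date"] = date.fromisoformat(record["date"])   # yyyy-mm-dd
            return record

        print(parse_row("Lake A-3; 55.751244; 037.618423; lake; pH; 7.4; 2011-08-15"))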

  7. The LHCb configuration database

    CERN Document Server

    Abadie, Lana; Gaspar, Clara; Jacobsson, Richard; Jost, Beat; Neufeld, Niko

    2005-01-01

    The Experiment Control System (ECS) will handle the monitoring, configuration and operation of all the LHCb experimental equipment. All parameters required to configure electronics equipment under the control of the ECS will reside in a configuration database. The database will contain two kinds of information: 1.\tConfiguration properties about devices such as hardware addresses, geographical location, and operational parameters associated with particular running modes (dynamic properties). 2.\tConnectivity between devices : this consists of describing the output and input connections of a device (static properties). The representation of these data using tables must be complete so that it can provide all the required information to the ECS and must cater for all the subsystems. The design should also guarantee a fast response time, even if a query results in a large volume of data being loaded from the database into the ECS. To fulfil these constraints, we apply the following methodology: Determine from the d...
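
    A schematic illustration of the two kinds of information described above, as they might be laid out in tables; the schema is invented for illustration and is not the actual LHCb configuration database design.

        # Invented, schematic layout of the two kinds of configuration information:
        # per-device properties and device-to-device connectivity.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE device (
                device_id   INTEGER PRIMARY KEY,
                name        TEXT NOT NULL,
                hw_address  TEXT,
                location    TEXT,
                run_mode    TEXT,
                parameters  TEXT      -- e.g. a blob of operational parameters
            );
            CREATE TABLE connection (
                from_device INTEGER REFERENCES device(device_id),
                from_port   INTEGER,
                to_device   INTEGER REFERENCES device(device_id),
                to_port     INTEGER
            );
        """)
        conn.execute("INSERT INTO device VALUES (1, 'readout_board_42', '0xA4:01', 'rack7', 'PHYSICS', '{}')")
        conn.execute("INSERT INTO device VALUES (2, 'farm_node_007', '0xB1:07', 'rack3', 'PHYSICS', '{}')")
        conn.execute("INSERT INTO connection VALUES (1, 0, 2, 3)")
        # All devices that device 1 sends data to:
        print(conn.execute("""SELECT d.name FROM connection c
                              JOIN device d ON d.device_id = c.to_device
                              WHERE c.from_device = 1""").fetchall())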

  8. Database Application Schema Forensics

    Directory of Open Access Journals (Sweden)

    Hector Quintus Beyers

    2014-12-01

    Full Text Available The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database, or operators of the database can be altered to deliver incorrect results when used in queries. This paper discusses categories of possibilities that exist to alter the application schema, with some practical examples. Two forensic environments are introduced in which a forensic investigation can take place. Arguments are provided for why these environments are important. Methods are presented for how these environments can be achieved for the application schema layer of a DBMS. A process is proposed for how forensic evidence should be extracted from the application schema layer of a DBMS. The application schema forensic evidence identification process can be applied to a wide range of forensic settings.

  9. Tibetan Magmatism Database

    Science.gov (United States)

    Chapman, James B.; Kapp, Paul

    2017-11-01

    A database containing previously published geochronologic, geochemical, and isotopic data on Mesozoic to Quaternary igneous rocks in the Himalayan-Tibetan orogenic system is presented. The database is intended to serve as a repository for new and existing igneous rock data and is publicly accessible through a web-based platform that includes an interactive map and data table interface with search, filtering, and download options. To illustrate the utility of the database, the age, location, and εHf(t) composition of magmatism from the central Gangdese batholith in the southern Lhasa terrane are compared. The data identify three high-flux events, which peak at 93, 50, and 15 Ma. They are characterized by inboard arc migration and a temporal and spatial shift to more evolved isotopic compositions.

  10. Database Vs Data Warehouse

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Data warehouse technology includes a set of concepts and methods that offer users useful information for decision making. The necessity to build a data warehouse arises from the necessity to improve the quality of information in the organization. Data coming from different sources and having a variety of forms - both structured and unstructured - are filtered according to business rules and integrated into a single large data collection. Using informatics solutions, managers have understood that the data stored in operational systems - including databases - are an informational gold mine that must be exploited. Data warehouses have been developed to answer the increasing demands for complex analysis, which could not be properly achieved with operational databases. The present paper emphasizes some of the criteria that information application developers can use in order to choose between a database solution and a data warehouse one.

  11. Danish Gynecological Cancer Database

    DEFF Research Database (Denmark)

    Sørensen, Sarah Mejer; Bjørn, Signe Frahm; Jochumsen, Kirsten Marie

    2016-01-01

    AIM OF DATABASE: The Danish Gynecological Cancer Database (DGCD) is a nationwide clinical cancer database and its aim is to monitor the treatment quality of Danish gynecological cancer patients, and to generate data for scientific purposes. DGCD also records detailed data on the diagnostic measures...... data forms as follows: clinical data, surgery, pathology, pre- and postoperative care, complications, follow-up visits, and final quality check. DGCD is linked with additional data from the Danish "Pathology Registry", the "National Patient Registry", and the "Cause of Death Registry" using the unique...... Danish personal identification number (CPR number). DESCRIPTIVE DATA: Data from DGCD and registers are available online in the Statistical Analysis Software portal. The DGCD forms cover almost all possible clinical variables used to describe gynecological cancer courses. The only limitation...

  12. RODOS database adapter

    International Nuclear Information System (INIS)

    Xie Gang

    1995-11-01

    Integrated data management is an essential aspect of many automated information systems such as RODOS, a real-time on-line decision support system for nuclear emergency management. In particular, the application software must provide access management to different commercial database systems. This report presents the tools necessary for adapting embedded SQL applications to both HP-ALLBASE/SQL and CA-Ingres/SQL databases. The design of the database adapter and the concept of the RODOS embedded SQL syntax are discussed by considering some of the most important features of SQL functions and by identifying significant differences between SQL implementations. Finally, the software developed and the administrator's and installation guides are described. (orig.) [de
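
    The idea of a thin adapter layer that lets the same application code talk to different SQL back ends can be sketched as follows; this is a generic Python illustration, not the actual RODOS tooling, and the driver shown is merely a stand-in.

        # Generic sketch of a database adapter that hides vendor differences behind
        # one interface; this is illustrative and not the actual RODOS adapter.
        class DatabaseAdapter:
            def __init__(self, driver, dsn):
                self._conn = driver.connect(dsn)   # any DB-API 2.0 style driver

            def query(self, sql, params=()):
                cur = self._conn.cursor()
                cur.execute(sql, params)
                return cur.fetchall()

            def close(self):
                self._conn.close()

        # Example with the built-in sqlite3 driver standing in for a vendor driver:
        import sqlite3
        db = DatabaseAdapter(sqlite3, ":memory:")
        db.query("CREATE TABLE release (nuclide TEXT, activity_bq REAL)")
        db.query("INSERT INTO release VALUES (?, ?)", ("I-131", 1.2e9))
        print(db.query("SELECT * FROM release"))
        db.close()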

  13. The Danish Depression Database

    DEFF Research Database (Denmark)

    Videbech, Poul Bror Hemming; Deleuran, Anette

    2016-01-01

    AIM OF DATABASE: The purpose of the Danish Depression Database (DDD) is to monitor and facilitate the improvement of the quality of the treatment of depression in Denmark. Furthermore, the DDD has been designed to facilitate research. STUDY POPULATION: Inpatients as well as outpatients...... with depression, aged above 18 years, and treated in the public psychiatric hospital system were enrolled. MAIN VARIABLES: Variables include whether the patient has been thoroughly somatically examined and has been interviewed about the psychopathology by a specialist in psychiatry. The Hamilton score as well...... as an evaluation of the risk of suicide are measured before and after treatment. Whether psychiatric aftercare has been scheduled for inpatients and the rate of rehospitalization are also registered. DESCRIPTIVE DATA: The database was launched in 2011. Every year since then ~5,500 inpatients and 7,500 outpatients...

  14. 600 MW nuclear power database

    International Nuclear Information System (INIS)

    Cao Ruiding; Chen Guorong; Chen Xianfeng; Zhang Yishu

    1996-01-01

    The 600 MW nuclear power database, based on ORACLE 6.0, consists of three parts: a nuclear power plant database, a nuclear power position database and a nuclear power equipment database. The database holds a large amount of technical data and images related to nuclear power, provided by engineering design organizations and individuals, and it is intended to assist the designers of nuclear power plants

  15. Aging management database

    International Nuclear Information System (INIS)

    Vidican, Dan

    2003-01-01

    As operating time accumulates, the overall safety and performance of an NPP tend to decrease. The reasons for potential non-availability of Structures, Systems and Components (SSC) in operation are various, but in different ways they all represent the end result of ageing phenomena. In order to understand the ageing phenomena and to be able to take adequate countermeasures, it is necessary to accumulate a large amount of information, both from worldwide experience and from one's own plant. These data have to be organized in a systematic form that is easy to retrieve and use. General requirements and structure of an ageing database: activities related to ageing evaluation have to allow (1) identification and evaluation of degradation phenomena, potential malfunctions and failure modes of typical plant components; and (2) trend analyses (on selected critical components), prediction of future performance and of the remaining service life. To perform these activities, it is necessary to have information on the behaviour of similar components in different NPPs (in different environments and under different operating conditions) and also the results from different pilot studies. Knowledge of worldwide experience is worthwhile. It is also necessary to know very well the operating and environmental conditions in one's own NPP and to analyze in detail the failure mode and root cause for components removed from the plant due to extended degradation. Based on the above aspects, a proposal for the structure of an ageing database is presented. It has three main sections. Section A: general knowledge about ageing phenomena. It contains all the information collected from worldwide experience. It could have a general part with raw information and a synthetic one, structured by typical components (if possible by manufacturer). The synthetic part has to consider different ageing aspects and different monitoring and evaluation methods (e. g. component, function, environment condition, specific

  16. The Neotoma Paleoecology Database

    Science.gov (United States)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community
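
    As an illustration of programmatic access through web services of the kind described above, here is a hedged Python sketch; the endpoint URL and parameter names are placeholders chosen for illustration and may not match the current Neotoma API.

        # Hedged sketch of querying a paleoecology web service; the endpoint and
        # parameter names below are assumptions, not a documented Neotoma API call.
        import json
        import urllib.parse
        import urllib.request

        BASE_URL = "https://api.example.org/v2.0/data/datasets"  # placeholder endpoint

        def fetch_datasets(**params):
            url = BASE_URL + "?" + urllib.parse.urlencode(params)
            with urllib.request.urlopen(url, timeout=30) as resp:
                return json.load(resp)

        if __name__ == "__main__":
            # e.g. pollen datasets inside a bounding box (illustrative parameters)
            result = fetch_datasets(datasettype="pollen",
                                    loc="-98.0,42.0,-88.0,49.0",
                                    limit=10)
            print(result)

    The point of such services, as the abstract notes, is that client code always sees the most current data without downloading the whole database.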

  17. C# Database Basics

    CERN Document Server

    Schmalz, Michael

    2012-01-01

    Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C#-basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started

  18. Danish Palliative Care Database

    DEFF Research Database (Denmark)

    Grønvold, Mogens; Adsersen, Mathilde; Hansen, Maiken Bang

    2016-01-01

    Aims: The aim of the Danish Palliative Care Database (DPD) is to monitor, evaluate, and improve the clinical quality of specialized palliative care (SPC) (ie, the activity of hospital-based palliative care teams/departments and hospices) in Denmark. Study population: The study population is all...... patients were registered in DPD during the 5 years 2010–2014. Of those registered, 96% had cancer. Conclusion: DPD is a national clinical quality database for SPC having clinically relevant variables and high data and patient completeness....

  19. The CATH database

    Directory of Open Access Journals (Sweden)

    Knudsen Michael

    2010-02-01

    Full Text Available Abstract The CATH database provides hierarchical classification of protein domains based on their folding patterns. Domains are obtained from protein structures deposited in the Protein Data Bank and both domain identification and subsequent classification use manual as well as automated procedures. The accompanying website http://www.cathdb.info provides an easy-to-use entry to the classification, allowing for both browsing and downloading of data. Here, we give a brief review of the database, its corresponding website and some related tools.

  20. The Danish Anaesthesia Database

    DEFF Research Database (Denmark)

    Antonsen, Kristian; Rosenstock, Charlotte Vallentin; Lundstrøm, Lars Hyldborg

    2016-01-01

    AIM OF DATABASE: The aim of the Danish Anaesthesia Database (DAD) is the nationwide collection of data on all patients undergoing anesthesia. Collected data are used for quality assurance, quality development, and serve as a basis for research projects. STUDY POPULATION: The DAD was founded in 2004....... In addition, an annual DAD report is a benchmark for departments nationwide. CONCLUSION: The DAD is covering the anesthetic process for the majority of patients undergoing anesthesia in Denmark. Data in the DAD are increasingly used for both quality and research projects....

  1. MARKS ON ART database

    DEFF Research Database (Denmark)

    van Vlierden, Marieke; Wadum, Jørgen; Wolters, Margreet

    2016-01-01

    Master marks, monograms and quality marks are often found embossed or stamped on works of art from 1300-1700. An illustrated database of these types of marks is being established at the Netherlands Institute for Art History (RKD) in The Hague.

  2. Yucca Mountain digital database

    International Nuclear Information System (INIS)

    Daudt, C.R.; Hinze, W.J.

    1992-01-01

    This paper discusses the Yucca Mountain Digital Database (DDB) which is a digital, PC-based geographical database of geoscience-related characteristics of the proposed high-level waste (HLW) repository site of Yucca Mountain, Nevada. It was created to provide the US Nuclear Regulatory Commission's (NRC) Advisory Committee on Nuclear Waste (ACNW) and its staff with a visual perspective of geological, geophysical, and hydrological features at the Yucca Mountain site as discussed in the Department of Energy's (DOE) pre-licensing reports

  3. ARTI refrigerant database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M.

    1998-03-15

    The Refrigerant Database is an information system on alternative refrigerants, associated lubricants, and their use in air conditioning and refrigeration. It consolidates and facilitates access to thermophysical properties, compatibility, environmental, safety, application and other information. It provides corresponding information on older refrigerants to assist manufacturers and those using alternative refrigerants to make comparisons and determine differences. The underlying purpose is to accelerate the phase-out of chemical compounds of environmental concern. The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air conditioning and refrigeration equipment. It also references documents addressing the compatibility of refrigerants and lubricants with other materials.

  4. Surgery Risk Assessment (SRA) Database

    Data.gov (United States)

    Department of Veterans Affairs — The Surgery Risk Assessment (SRA) database is part of the VA Surgical Quality Improvement Program (VASQIP). This database contains assessments of selected surgical...

  5. Accessing and using chemical databases

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Pavlov, Todor; Niemelä, Jay Russell

    2013-01-01

    Computer-based representation of chemicals makes it possible to organize data in chemical databases-collections of chemical structures and associated properties. Databases are widely used wherever efficient processing of chemical information is needed, including search, storage, retrieval......, and dissemination. Structure and functionality of chemical databases are considered. The typical kinds of information found in a chemical database are considered-identification, structural, and associated data. Functionality of chemical databases is presented, with examples of search and access types. More details...... are included about the OASIS database and platform and the Danish (Q)SAR Database online. Various types of chemical database resources are discussed, together with a list of examples....

  6. Aerobic Fitness and Cognitive Development: Event-Related Brain Potential and Task Performance Indices of Executive Control in Preadolescent Children

    Science.gov (United States)

    Hillman, Charles H.; Buck, Sarah M.; Themanson, Jason R.; Pontifex, Matthew B.; Castelli, Darla M.

    2009-01-01

    The relationship between aerobic fitness and executive control was assessed in 38 higher- and lower-fit children (M[subscript age] = 9.4 years), grouped according to their performance on a field test of aerobic capacity. Participants performed a flanker task requiring variable amounts of executive control while event-related brain potential…

  7. International Nuclear Safety Center (INSC) database

    International Nuclear Information System (INIS)

    Sofu, T.; Ley, H.; Turski, R.B.

    1997-01-01

    As an integral part of DOE's International Nuclear Safety Center (INSC) at Argonne National Laboratory, the INSC Database has been established to provide an interactively accessible information resource for the world's nuclear facilities and to promote free and open exchange of nuclear safety information among nations. The INSC Database is a comprehensive resource database aimed at a scope and level of detail suitable for safety analysis and risk evaluation for the world's nuclear power plants and facilities. It also provides an electronic forum for international collaborative safety research for the Department of Energy and its international partners. The database is intended to provide plant design information, material properties, computational tools, and results of safety analysis. Initial emphasis in data gathering is given to Soviet-designed reactors in Russia, the former Soviet Union, and Eastern Europe. The implementation is performed under the Oracle database management system, and the World Wide Web serves as the access path for remote users. An interface between the Oracle database and the Web server is established through a custom-designed Web-Oracle gateway, which is used mainly to perform queries on the stored data in the database tables
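
    The general idea of a web-to-database query gateway of the kind mentioned above can be sketched generically in Python; the table, fields and endpoint are invented for illustration and this is not the INSC gateway.

        # Toy illustration of a web-to-database query gateway (not the INSC gateway):
        # an HTTP request parameter is turned into a parameterized SQL query.
        import json
        import sqlite3
        from urllib.parse import parse_qs
        from wsgiref.simple_server import make_server

        conn = sqlite3.connect(":memory:", check_same_thread=False)
        conn.execute("CREATE TABLE plant (name TEXT, reactor_type TEXT, country TEXT)")
        conn.execute("INSERT INTO plant VALUES ('Example NPP', 'VVER-1000', 'N/A')")

        def app(environ, start_response):
            qs = parse_qs(environ.get("QUERY_STRING", ""))
            rtype = qs.get("reactor_type", ["%"])[0]
            rows = conn.execute(
                "SELECT name, reactor_type, country FROM plant WHERE reactor_type LIKE ?",
                (rtype,),
            ).fetchall()
            start_response("200 OK", [("Content-Type", "application/json")])
            return [json.dumps(rows).encode()]

        if __name__ == "__main__":
            make_server("localhost", 8080, app).serve_forever()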

  8. Influences of Developmental Contexts and Gender Differences on School Performance of Children and Adolescents

    Science.gov (United States)

    Diniz, Eva; da Rosa Piccolo, Luciane; de Paula Couto, Maria Clara Pinheiro; Salles, Jerusa Fumagalli; Helena Koller, Silvia

    2014-01-01

    This study investigated children and adolescents' school performance over time focusing on two variables that may influence it: developmental context and gender. The sample comprised 627 participants (M[subscript age] = 11.13, SD = 1.8), 51% of them female, from grade one to eight, living either with family (n = 474) or in care institutions…

  9. The Danish national quality database for births

    DEFF Research Database (Denmark)

    Andersson, Charlotte Brix; Flems, Christina; Kesmodel, Ulrik Schiøler

    2016-01-01

    Aim of the database: The aim of the Danish National Quality Database for Births (DNQDB) is to measure the quality of the care provided during birth through specific indicators. Study population: The database includes all hospital births in Denmark. Main variables: Anesthesia/pain relief, continuous...... Medical Birth Registry. Registration to the Danish Medical Birth Registry is mandatory for all maternity units in Denmark. During the 5 years, performance has improved in the areas covered by the process indicators and for some of the outcome indicators. Conclusion: Measuring quality of care during...

  10. Concurrency control in distributed database systems

    CERN Document Server

    Cellary, W; Gelenbe, E

    1989-01-01

    Distributed Database Systems (DDBS) may be defined as integrated database systems composed of autonomous local databases, geographically distributed and interconnected by a computer network.The purpose of this monograph is to present DDBS concurrency control algorithms and their related performance issues. The most recent results have been taken into consideration. A detailed analysis and selection of these results has been made so as to include those which will promote applications and progress in the field. The application of the methods and algorithms presented is not limited to DDBSs but a
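
    For readers unfamiliar with the topic, the flavour of one classical concurrency-control technique, two-phase locking, can be sketched as follows; this is a simplified illustration, not an algorithm taken from the monograph.

        # Simplified sketch of (strict) two-phase locking: a transaction only acquires
        # locks while it is running and releases them all at commit time.
        import threading

        class LockManager:
            def __init__(self):
                self._locks = {}              # data item -> threading.Lock
                self._guard = threading.Lock()

            def lock_for(self, item):
                with self._guard:
                    return self._locks.setdefault(item, threading.Lock())

        class Transaction:
            def __init__(self, manager):
                self._manager = manager
                self._held = []

            def acquire(self, item):          # growing phase: only acquire
                lock = self._manager.lock_for(item)
                lock.acquire()
                self._held.append(lock)

            def commit(self):                 # shrinking phase: release everything
                while self._held:
                    self._held.pop().release()

        mgr = LockManager()
        t1 = Transaction(mgr)
        t1.acquire("account:A")
        t1.acquire("account:B")
        # ... read/write items A and B ...
        t1.commit()

    In a distributed setting the lock manager itself may be partitioned across sites, which is where much of the complexity treated in the monograph arises.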

  11. Solutions for medical databases optimal exploitation.

    Science.gov (United States)

    Branescu, I; Purcarea, V L; Dobrescu, R

    2014-03-15

    The paper discusses methods to apply OLAP techniques to multidimensional databases that leverage the existing performance-enhancing technique known as practical pre-aggregation, by making this technique relevant to a much wider range of medical applications, as logistic support to data warehousing techniques. The transformations have low computational complexity in practice and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies into current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases.
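
    The basic idea of pre-aggregation can be shown with a minimal sketch: detail rows are rolled up once into a summary keyed by the dimensions of interest, and later queries read only the summary. The data and dimension names below are invented for illustration.

        # Minimal illustration of pre-aggregation: detail rows are rolled up once into
        # a summary keyed by (region, month); queries then hit the smaller summary.
        from collections import defaultdict

        detail_rows = [
            {"region": "North", "month": "2013-01", "admissions": 12},
            {"region": "North", "month": "2013-01", "admissions": 7},
            {"region": "South", "month": "2013-01", "admissions": 9},
            {"region": "North", "month": "2013-02", "admissions": 4},
        ]

        # One pass over the detail data builds the pre-aggregate ...
        summary = defaultdict(int)
        for row in detail_rows:
            summary[(row["region"], row["month"])] += row["admissions"]

        # ... and OLAP-style queries then read the (much smaller) summary.
        print(summary[("North", "2013-01")])                                      # 19
        print(sum(v for (region, _), v in summary.items() if region == "North"))  # 23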

  12. License - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available The data are licensed under Attribution-Share Alike 2.1 Japan. If you use data from this database, please be sure to attribute the database as follows: Trypanoso...

  13. Developing of database on nuclear power engineering and purchase of ORACLE system

    International Nuclear Information System (INIS)

    Liu Renkang

    1996-01-01

    This paper presents a viewpoint on the development of a database for nuclear power engineering and on the performance of the ORACLE database management system, concluding that ORACLE is a practical database system to purchase for this purpose

  14. MARC and Relational Databases.

    Science.gov (United States)

    Llorens, Jose; Trenor, Asuncion

    1993-01-01

    Discusses the use of MARC format in relational databases and addresses problems of incompatibilities. A solution is presented that is in accordance with Open Systems Interconnection (OSI) standards and is based on experiences at the library of the Universidad Politecnica de Valencia (Spain). (four references) (EA)

  15. Teaching Historians with Databases.

    Science.gov (United States)

    Burton, Vernon

    1993-01-01

    Asserts that, although pressures to publish have detracted from the quality of teaching at the college level, recent innovations in educational technology have created opportunities for instructional improvement. Describes the use of computer-assisted instruction and databases in college-level history courses. (CFR)

  16. Literature database aid

    International Nuclear Information System (INIS)

    Wanderer, J.A.

    1991-01-01

    The booklet is intended to help with the acquisition of original literature, either after a conventional literature search or, in particular, after a database search. It bridges the gap between abbreviated (short) and original (long) titles. This, together with information on the holdings of technical/scientific libraries, facilitates document delivery. 1500 short titles are listed alphabetically. (orig.) [de

  17. Oversigt over databaser

    DEFF Research Database (Denmark)

    Krogh Graversen, Brian

    This is an overview of registers that can be used to shed light on the situation and developments in the social services area. The overview is the second phase of a data project whose aim is to establish a database that can form the basis for ongoing monitoring, analysis, evaluation and research in the...

  18. Database Programming Languages

    DEFF Research Database (Denmark)

    This volume contains the proceedings of the 11th International Symposium on Database Programming Languages (DBPL 2007), held in Vienna, Austria, on September 23-24, 2007. DBPL 2007 was one of 15 meetings co-located with VLDB (the International Conference on Very Large Data Bases). DBPL continues...

  19. From database to normbase

    NARCIS (Netherlands)

    Stamper, R.K.; Liu, Kecheng; Liu, K.; Kolkman, M.; Kolkman, M.; Klarenberg, P.; Ades, Y.; van Slooten, C.; van Slooten, F.; Ades, Y.

    1991-01-01

    After the database concept, we are ready for the normbase concept. The object is to decouple organizational and technical knowledge that are now mixed inextricably together in the application programs we write today. The underlying principle is to find a way of specifying a social system as a system

  20. Database on wind characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, K.S. [The Technical Univ. of Denmark (Denmark); Courtney, M.S. [Risoe National Lab., (Denmark)

    1999-08-01

    The organisations that participated in the project consist of five research organisations: MIUU (Sweden), ECN (The Netherlands), CRES (Greece), DTU (Denmark), Risoe (Denmark), and one wind turbine manufacturer: Vestas Wind System A/S (Denmark). The overall goal was to build a database consisting of a large number of wind speed time series and to create tools for efficiently searching through the data to select interesting data. The project resulted in a database located at DTU, Denmark, with online access through the Internet. The database contains more than 50,000 hours of wind speed measurements. A wide range of wind climates and terrain types are represented with significant amounts of time series. Data have been chosen selectively with a deliberate over-representation of high wind and complex terrain cases. This makes the database ideal for wind turbine design needs but completely unsuitable for resource studies. Diversity has also been an important aim, and this is realised with data from a large range of terrain types; everything from offshore to mountain, from Norway to Greece. (EHS)

  1. Database Description - SSBD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: SSBD. Creator: Shuichi Onami, RIKEN Quantitative Biology Center, 2-2-3 Minatojima-minamimachi, Chuo-ku, Kobe 650-0047, Japan. Database classification: Other Molecular Biology Databases; dynamic databa... Organisms: ...elegans (Taxonomy ID: 6239); Escherichia coli (Taxonomy ID: 562). Database description: Systems Scie... Reference: ...i Onami, Bioinformatics, April 2015, Volume 31, Issue 7.

  2. Database Description - GETDB | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: GETDB (Gal4 Enhancer Trap Insertion Database). DOI: 10.18908/lsdba.nbdc00236-000. Creator: Shigeo Haya... Contact: Chuo-ku, Kobe 650-0047, Tel +81-78-306-3185, Fax +81-78-306-3183. Database classification: Expression... / Invertebrate genome database. Organism: Drosophila melanogaster (Taxonomy ID: 7227). Database maintenance site: Drosophila Genetic Resource...

  3. The AMMA database

    Science.gov (United States)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

    The AMMA project includes aircraft, ground-based and ocean measurements, an intensive use of satellite data, and diverse modelling studies. The AMMA database therefore aims at storing a large amount and a large variety of data, and at providing the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: (1) AMMA field campaign datasets; (2) historical data in West Africa from 1850 (operational networks and previous scientific programs); (3) satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention); (4) model outputs from atmosphere or ocean operational (re-)analyses and forecasts, and from research simulations; the model outputs are processed in the same way as the satellite products. Before accessing the data, any user has to sign the AMMA data and publication policy. This charter only covers the use of data in the framework of scientific objectives and categorically excludes the redistribution of data to third parties and usage for commercial applications. Some collaboration between data producers and users, and mention of the AMMA project in any publication, is also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris, and OMP, Toulouse). Users can access data from both data centres through a single web portal. The website is composed of different modules: (1) Registration: forms to register and to read and sign the data use charter when a user visits for the first time; (2) Data access interface: a friendly tool for building a data extraction request by selecting various criteria like location, time, parameters... The request can

  4. JDD, Inc. Database

    Science.gov (United States)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). The company provides facilities support for Fort Riley in Fort Riley, Kansas and for the NASA John H. Glenn Research Center at Lewis Field in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working there for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division (LTID) at Glenn Research Center. The LTID provides all transportation, secretarial and security needs and the contract management of these various services for the center. As a safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues relating to job safety (Environmental Protection Agency issues, workers' compensation, safety and health training). My summer assignment was not considered "groundbreaking research" like the work of many other summer interns in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using Microsoft Excel to classify and categorize data pertaining to the numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted only of data from the training certification courses (training field index, employees who were present at these training courses and who was absent). Once I completed this phase of the database, I decided to expand the database and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day-to-day operations and been adding the

  5. Database Description - Open TG-GATEs Pathological Image Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: Open TG-GATEs Pathological Image Database. Alternative name: -. DOI: 10.18908/lsdba.nbdc00954-0... Contact: ...iomedical Innovation, 7-6-8 Saito-asagi, Ibaraki-city, Osaka 567-0085, Japan, Tel +81-72-641-9826. Database classification: Toxicogenomics Database. Organism: Rattus norvegi...

  6. Database Description - Yeast Interacting Proteins Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Database name: Yeast Interacting Proteins Database. Alternative name: -. DOI: 10.18908/lsdba.nbdc00742-000. Contact: ...-ken 277-8561, Tel +81-4-7136-3989, Fax +81-4-7136-3979. Organism: Saccharomyces cerevisiae (Taxonomy ID: 4932). Database description: information on interactions and related information obta... Reference: Proc Natl Acad Sci U S A. 2001 Apr 10;98(8):4569-74. Epub 2001 Mar 13.

  7. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. The discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy...

  8. Development of a personalized training system using the Lung Image Database Consortium and Image Database resource Initiative Database.

    Science.gov (United States)

    Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong

    2014-12-01

    The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability of dynamically selecting suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database, which provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool to enable trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful to enhance the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
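
    A rough sketch of the idea behind content-boosted collaborative filtering, blending a content-based estimate with ratings from similar users, is given below; the data, weights and names are invented for illustration and this is not the algorithm from the paper.

        # Toy illustration of content-boosted collaborative filtering (CBCF):
        # a content-based estimate fills in missing ratings, and predictions blend
        # user-user similarity with that estimate. Data and weights are invented.
        import math

        # difficulty ratings given by trainees to cases (1 = easy ... 5 = hard)
        ratings = {
            "trainee_a": {"case1": 2, "case2": 4},
            "trainee_b": {"case1": 3, "case2": 5, "case3": 4},
        }
        # content-based estimate from case features (e.g. nodule size, subtlety)
        content_estimate = {"case1": 2.5, "case2": 4.0, "case3": 3.5}

        def filled(user):
            """Pseudo profile: observed ratings, content estimates elsewhere."""
            return {c: ratings[user].get(c, content_estimate[c]) for c in content_estimate}

        def similarity(u, v):
            a, b = filled(u), filled(v)
            dot = sum(a[c] * b[c] for c in a)
            na = math.sqrt(sum(x * x for x in a.values()))
            nb = math.sqrt(sum(x * x for x in b.values()))
            return dot / (na * nb)

        def predict(user, case):
            sims = [(similarity(user, other), filled(other)[case])
                    for other in ratings if other != user]
            collab = sum(s * r for s, r in sims) / sum(s for s, _ in sims)
            return 0.5 * collab + 0.5 * content_estimate[case]   # blend weights: invented

        print(round(predict("trainee_a", "case3"), 2))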

  9. Danish clinical databases: An overview

    DEFF Research Database (Denmark)

    Green, Anders

    2011-01-01

    Clinical databases contain data related to diagnostic procedures, treatments and outcomes. In 2001, a scheme was introduced for the approval, supervision and support of clinical databases in Denmark.

  10. Group Theory and Crystal Field Theory: A Simple and Rigorous Derivation of the Spectroscopic Terms Generated by the t[subscript 2g][superscript 2] Electronic Configuration in a Strong Octahedral Field

    Science.gov (United States)

    Morpurgo, Simone

    2007-01-01

    The principles of symmetry and group theory are applied to the zero-order wavefunctions associated with the strong-field t[subscript 2g][superscript 2] configuration, and the symmetry-adapted linear combinations (SALCs) associated with the generated energy terms are derived. This approach will enable students to better understand the use of…
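
    For orientation, the standard group-theoretical result being derived can be summarized as follows; this is a well-known textbook statement given here for context, not text taken from the article.

        t_{2g} \otimes t_{2g} = A_{1g} \oplus E_{g} \oplus T_{1g} \oplus T_{2g}

        [t_{2g} \otimes t_{2g}]_{\mathrm{sym}} = A_{1g} \oplus E_{g} \oplus T_{2g}
        \qquad
        \{t_{2g} \otimes t_{2g}\}_{\mathrm{antisym}} = T_{1g}

        \Rightarrow \; {}^{1}A_{1g},\ {}^{1}E_{g},\ {}^{1}T_{2g},\ {}^{3}T_{1g}

    The symmetric square carries the spin singlets and the antisymmetric square the spin triplet, giving the four terms of the strong-field t[subscript 2g][superscript 2] configuration.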

  11. The GLIMS Glacier Database

    Science.gov (United States)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER or glacier outlines from 2002 only, or from Autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), Map
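
    To illustrate what requesting a map layer from an OGC-compliant Web Map Server involves, here is a generic Python sketch; the server URL and layer name are placeholders, not the actual GLIMS service configuration, and only standard WMS GetMap parameters are used.

        # Generic sketch of an OGC WMS GetMap request; the server URL and layer name
        # are placeholders and not the actual GLIMS service configuration.
        import urllib.parse
        import urllib.request

        WMS_URL = "https://glims.example.org/wms"   # placeholder endpoint

        params = {
            "SERVICE": "WMS",
            "VERSION": "1.1.1",
            "REQUEST": "GetMap",
            "LAYERS": "glacier_outlines",           # placeholder layer name
            "SRS": "EPSG:4326",
            "BBOX": "86.5,27.5,87.5,28.5",          # lon/lat bounding box
            "WIDTH": "512",
            "HEIGHT": "512",
            "FORMAT": "image/png",
        }

        url = WMS_URL + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url, timeout=60) as resp:
            with open("glaciers.png", "wb") as out:
                out.write(resp.read())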

  12. Online journal subscription in Argentina, 2007 (Suscripción de revistas en línea en la Argentina, 2007)

    Directory of Open Access Journals (Sweden)

    Susana Romanos de Tiratel

    2011-12-01

    Full Text Available This paper presents the findings of an investigation aimed at describing aspects of the process of subscribing to online journals in Argentina. From April to September 2007, a survey was conducted via email among Argentine special and academic libraries to inquire about the actors involved in the identification, evaluation, and selection of titles; in the negotiation and signing of the licences; the subscription methods; and the types of resources subscribed to by those information units. Librarians participated heavily throughout the process, but their involvement decreased at the stage of negotiating and signing the licensing agreements, where institutional directors and administrators had a more prominent role.

  13. Informing Evidence Based Decisions: Usage Statistics for Online Journal Databases

    Directory of Open Access Journals (Sweden)

    Alexei Botchkarev

    2017-06-01

    Full Text Available Abstract Objective – The primary objective was to examine online journal database usage statistics for a provincial ministry of health in the context of evidence based decision-making. In addition, the study highlights the implementation of the Journal Access Centre (JAC), which is housed and powered by the Ontario Ministry of Health and Long-Term Care (MOHLTC), to inform health systems policy-making. Methods – This was a prospective case study using descriptive analysis of JAC usage statistics for journal articles from January 2009 to September 2013. Results – JAC enables ministry employees to access approximately 12,000 journals with full-text articles. JAC usage statistics for the 2011-2012 calendar years demonstrate a steady level of search activity, with a monthly average of 5,129 searches. In 2009-2013, a total of 4,759 journal titles were accessed, including 1,675 journals with full text. Usage statistics demonstrate that actual consumption exceeded 12,790 downloaded full-text articles, or approximately 2,700 articles annually. Conclusion – JAC’s steady level of activity, revealed by the study, reflects continuous demand for JAC services and products. It shows that access to online journal databases has become part of routine government knowledge management processes. MOHLTC’s broad area of responsibilities, with dynamically changing priorities, translates into diverse information needs among its employees and a large set of required journals. Usage statistics indicate that MOHLTC information needs cannot be mapped to a reasonably compact set of “core” journals that could then be subscribed to.
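
    The descriptive analysis described in the Methods amounts to simple aggregation of monthly usage logs. Purely as an illustration, assuming a CSV export with hypothetical columns month, searches, and fulltext_downloads (not the actual JAC report format), figures of the kind quoted above could be reproduced along these lines.

        import pandas as pd

        # Hypothetical log format: one row per month with search and download counts.
        usage = pd.read_csv("jac_usage.csv", parse_dates=["month"])

        monthly_avg_searches = usage["searches"].mean()
        total_fulltext = usage["fulltext_downloads"].sum()
        annual_fulltext = usage.groupby(usage["month"].dt.year)["fulltext_downloads"].sum()

        print(f"Average searches per month: {monthly_avg_searches:,.0f}")
        print(f"Total full-text downloads: {total_fulltext:,}")
        print(annual_fulltext)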

  14. Nuclear materials thermo-physical property database and property analysis using the database

    International Nuclear Information System (INIS)

    Jeong, Yeong Seok

    2002-02-01

    Thermo-physical properties of nuclear materials must be understood for the evaluation and analysis of steady-state and accident conditions of commercial and research reactors. In this study, a thermo-physical property database and web page for nuclear materials were developed. Using this database, the thermal conductivity, heat capacity, enthalpy, and linear thermal expansion of fuel and cladding materials were analyzed, and the thermo-physical property models used in nuclear fuel performance evaluation codes were compared with the experimental data in the database. A comparison of the UO2 fuel and cladding property models of the major performance evaluation codes shows that both are similar.

  15. Unidirectional Replication in Heterogeneous Databases (REPLIKASI UNIDIRECTIONAL PADA HETEROGEN DATABASE)

    OpenAIRE

    Hendro Nindito; Evaristus Didik Madyatmadja; Albert Verasius Dian Sano

    2013-01-01

    The use of diverse database technologies in enterprises today cannot be avoided. Thus, technology is needed to generate information in real time. The purpose of this research is to discuss a database replication technology that can be applied in heterogeneous database environments. In this study we use a Windows-based MS SQL Server database as the source and a Linux-based Oracle database as the target. The research method used is prototyping, where development can be done quickly, and testing of working models of the...
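
    As a rough sketch of what unidirectional replication between these two platforms can look like (this is not the prototype described in the record), the following script polls a source table in MS SQL Server and applies new rows to Oracle; connection strings, table names, and column names are placeholders.

        import pyodbc       # source: MS SQL Server
        import cx_Oracle    # target: Oracle

        # Placeholder connection details; replace with real hosts and credentials.
        src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                             "SERVER=sqlserver-host;DATABASE=SalesDB;UID=user;PWD=secret")
        dst = cx_Oracle.connect("user/secret@oracle-host:1521/ORCLPDB1")

        def replicate_new_rows(last_id: int) -> int:
            """One unidirectional pass: copy rows with id greater than last_id
            from the source table to the target table. Table and column names
            are hypothetical."""
            src_cur = src.cursor()
            dst_cur = dst.cursor()
            src_cur.execute(
                "SELECT id, customer, amount FROM dbo.orders WHERE id > ? ORDER BY id",
                last_id)
            rows = src_cur.fetchall()
            if rows:
                dst_cur.executemany(
                    "INSERT INTO orders (id, customer, amount) VALUES (:1, :2, :3)",
                    [tuple(r) for r in rows])
                dst.commit()
                last_id = rows[-1][0]
            return last_id

        # Poll periodically (e.g., from a scheduler) to keep the target in sync.
        last_seen = replicate_new_rows(last_id=0)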

  16. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor design technology development using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds the research results of phase II of the mid- and long-term nuclear R&D program for Liquid Metal Reactor design technology development. IOC is a linkage control system between sub-projects for sharing and integrating research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation activities and meetings. Finally, KALIMER Reserved Documents manages the data and documents collected since project completion. This report describes the hardware and software features and the database design methodology for KALIMER.

  17. Database on aircraft accidents

    International Nuclear Information System (INIS)

    Nishio, Masahide; Koriyama, Tamio

    2013-11-01

    The Reactor Safety Subcommittee of the Nuclear Safety and Preservation Committee published 'The criteria on assessment of probability of aircraft crash into light water reactor facilities' as the standard method for evaluating the probability of aircraft crash into nuclear reactor facilities in July 2002. In response, the Japan Nuclear Energy Safety Organization has, every year since 2003, been collecting open information on accidents involving commercial airplanes, self-defense force (SDF) airplanes and US force airplanes, sorting it, and maintaining a database of aircraft accidents covering the latest 20 years in order to evaluate the probability of aircraft crash into nuclear reactor facilities. In this report the database was revised by adding aircraft accidents from 2011 and deleting those from 1991, resulting in the revised 2012 database covering the latest 20 years, from 1992 to 2011. Furthermore, flight information on commercial aircraft was also collected to develop a flight database for the same 20-year period, 1992 to 2011, for evaluating the probability of aircraft crash into reactor facilities. The method for developing the database of aircraft accidents is based on the report 'The criteria on assessment of probability of aircraft crash into light water reactor facilities' described above. The 2012 revised database for the latest 20 years from 1992 to 2011 shows the following; its trends change little compared with last year's report. (1) The data on commercial aircraft accidents are based on the 'Aircraft accident investigation reports of Japan transport safety board' of the Ministry of Land, Infrastructure, Transport and Tourism. The number of commercial aircraft accidents is 4 for large fixed-wing aircraft, 58 for small fixed-wing aircraft, 5 for large bladed aircraft and 99 for small bladed aircraft. The relevant accidents

  18. ARTI Refrigerant Database

    Energy Technology Data Exchange (ETDEWEB)

    Calm, J.M.

    1992-11-09

    The database provides bibliographic citations and abstracts for publications that may be useful in research and design of air-conditioning and refrigeration equipment. The database identifies sources of specific information on R-32, R-123, R-124, R-125, R-134, R-134a, R-141b, R-142b, R-143a, R-152a, R-245ca, R-290 (propane), R-717 (ammonia), ethers, and others, as well as azeotropic and zeotropic blends of these fluids. It addresses lubricants including alkylbenzene, polyalkylene glycol, ester, and other synthetics as well as mineral oils. It also references documents on compatibility of refrigerants and lubricants with metals, plastics, elastomers, motor insulation, and other materials used in refrigerant circuits. A computerized version is available that includes retrieval software.

  19. Geologic Field Database

    Directory of Open Access Journals (Sweden)

    Katarina Hribernik

    2002-12-01

    Full Text Available The purpose of the paper is to present the field data relational database, which was compiled from data gathered during thirty years of fieldwork on the Basic Geologic Map of Slovenia at a scale of 1:100,000. The database was created using MS Access software. The MS Access environment ensures its stability and effective operation despite continual changing, searching, and updating of the data. It also enables faster, easier, and more user-friendly access to the field data. Last but not least, in the long term, once the data are transferred into the GIS environment, it will provide the basis for a sound geologic information system that will satisfy a broad spectrum of geologists’ needs.

  20. DistiLD Database

    DEFF Research Database (Denmark)

    Palleja, Albert; Horn, Heiko; Eliasson, Sabrina

    2012-01-01

    Genome-wide association studies (GWAS) have identified thousands of single nucleotide polymorphisms (SNPs) associated with the risk of hundreds of diseases. However, there is currently no database that enables non-specialists to answer the following simple questions: which SNPs associated with diseases are in linkage disequilibrium (LD) with a gene of interest? Which chromosomal regions have been associated with a given disease, and which are the potentially causal genes in each region? To answer these questions, we use data from the HapMap Project to partition each chromosome into so-called LD blocks, so that SNPs in LD with each other are preferentially in the same block, whereas SNPs not in LD are in different blocks. By projecting SNPs and genes onto LD blocks, the DistiLD database aims to increase usage of existing GWAS results by making it easy to query and visualize disease
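
    The projection described above is essentially an interval lookup: each SNP or gene is assigned to the precomputed LD block that contains its chromosomal position. A minimal sketch of that idea, using made-up block boundaries and positions rather than actual HapMap-derived data, might look like this.

        import bisect

        # Hypothetical LD blocks for one chromosome: (start, end) in base pairs,
        # sorted and non-overlapping, as produced by partitioning the chromosome.
        ld_blocks = [(1, 50_000), (50_001, 120_000), (120_001, 200_000)]
        block_starts = [start for start, _ in ld_blocks]

        def block_index(position: int):
            """Return the index of the LD block containing `position`, or None."""
            i = bisect.bisect_right(block_starts, position) - 1
            if i >= 0 and ld_blocks[i][0] <= position <= ld_blocks[i][1]:
                return i
            return None

        # Project SNPs and genes onto blocks; a disease-associated SNP and a gene
        # that land in the same block are then reported together.
        snps = {"rs0001": 45_000, "rs0002": 130_000}                 # SNP -> position (made up)
        genes = {"GENE_A": (40_000, 48_000), "GENE_B": (125_000, 180_000)}

        snp_blocks = {snp: block_index(pos) for snp, pos in snps.items()}
        gene_blocks = {g: block_index(start) for g, (start, _end) in genes.items()}  # gene start only, for simplicity

        for snp, b in snp_blocks.items():
            hits = [g for g, gb in gene_blocks.items() if gb == b]
            print(f"{snp} falls in LD block {b}; candidate genes: {hits}")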