WorldWideScience

Sample records for database systems ii

  1. Electromagnetic Systems Effects Database (EMSED). AERO 90, Phase II User's Manual

    National Research Council Canada - National Science Library

    Sawires, Kalim

    1998-01-01

    The Electromagnetic Systems Effects Database (EMSED), also called AIRBASE, is a training guide for users not familiar with the AIRBASE database and its operating platform, the Macintosh computer (Mac...

  2. Analysis and Prototyping of the United States Marine Corps Total Force Administration System (TFAS), Echelon II - A Web Enabled Database for the Small Unit Leader

    National Research Council Canada - National Science Library

    Simmons, Steven

    2002-01-01

    ...), Echelon II - A Web Enabled Database for The Small Unit Leader. The analysis consisted of researching the characteristics of the current manpower system, MCTFS, and the conceptual tenets of the TFAS program...

  3. The new area monitoring system and the fuel database of the TRIGA Mark II reactor in Vienna

    International Nuclear Information System (INIS)

    Villa, M.; Boeck, H.; Hofbauer, M.; Schwarz, V.

    2004-01-01

The 250 kW TRIGA Mark-II reactor has operated since March 1962 at the Atominstitut, Vienna, Austria. Its main tasks are nuclear education and training in the fields of neutron and solid-state physics, nuclear technology, reactor safety, radiochemistry, radiation protection and dosimetry, and low-temperature physics and fusion research. Academic research in these fields is carried out by students, coordinated and supervised by about 70 staff members, with the aim of a master's or PhD degree. After 25 years of successful operation, it was necessary to replace the old area monitoring system with a new digital one. The purpose of the new system is the permanent monitoring of the reactor hall, the primary and secondary cooling systems, and the ventilation system. The paper describes the development and implementation of the new area monitoring system. The second topic of this paper is the development of the new fuel database. Since March 7th, 1962, the TRIGA Mark II reactor in Vienna has operated at an average of 263 MWh per year, which corresponds to a uranium burn-up of 13.7 g per year. Presently there are 81 TRIGA fuel elements in the core; 55 of them are old aluminium-clad elements from the initial criticality, while the rest are stainless-steel-clad elements that were added later to compensate for the uranium consumption. Because 67 % of the elements are older than 40 years, it was necessary to record the history of every element in a database, to give easy access to all the relevant data for every element in the facility. (author)

  4. Forest Focus Monitoring Database System - Technical Report 2003 Level II Data

    OpenAIRE

    HIEDERER ROLAND; DURRANT TRACY; GRANKE O.; LAMBOTTE Michel; LORENZ M.; MIGNON B.; OEHMICHEN K.

    2007-01-01

    Forest Focus (Regulation (EC) No 2152/2003) is a Community scheme for harmonized, broad-based, comprehensive and long-term monitoring of European forest ecosystems. Under this scheme the monitoring of air pollution effects on forests is carried out by participating countries on the basis of the systematic network of observation points (Level I) and of the network of observation plots for intensive and continuous monitoring (Level II). According to Article 15(1) of the Forest Focus Regulat...

  5. Forest Focus Monitoring Database System - Technical Report 2006 Level II Data

    OpenAIRE

    HIEDERER Roland; DURRANT Tracy; GRANKE Oliver; LAMBOTTE Michel; LORENZ Martin; MIGNON Bertrand

    2008-01-01

    Forest Focus (Regulation (EC) No 2152/2003) is a Community scheme for harmonized, broadbased, comprehensive and long-term monitoring of European forest ecosystems. Under this scheme the monitoring of air pollution effects on forests is carried out by participating countries on the basis of the systematic network of observation points (Level I) and of the network of observation plots for intensive and continuous monitoring (Level II). According to Article 15(1) of the Forest Focus Regulatio...

  6. JOYO MK-II core characteristics database

    International Nuclear Information System (INIS)

    Tabuchi, Shiro; Aoyama, Takafumi; Nagasaki, Hideaki; Kato, Yuichi

    1998-12-01

The experimental fast reactor JOYO served as the MK-II irradiation bed core for testing fuel and material for FBR development for 15 years, from 1982 to 1997. During the MK-II operation, extensive data were accumulated from the core characteristics tests conducted in thirty-one duty operations and thirteen special test operations. These core management data and core characteristics data were compiled into a database. The code system MAGI was developed and used for core management of JOYO MK-II; the core characteristics and the irradiation test conditions were calculated with MAGI on the basis of three-dimensional diffusion theory with seven neutron energy groups. The extensive core management data were recorded on CD-ROM for user convenience. They comprise the specifications and configurations of the core and, for about 300 driver fuel subassemblies and about 60 uninstrumented irradiation subassemblies, the core composition before and after irradiation, neutron flux, neutron fluences, fuel and control rod burn-up, and temperature and power distributions. MK-II core characteristics and test conditions were stored in the database for post analysis. Core characteristics data include excess reactivities, control rod worths, and reactivity coefficients, e.g., of temperature, power and burn-up. Test conditions include both measured and calculated data for irradiation conditions. (author)

  7. Towards Sensor Database Systems

    DEFF Research Database (Denmark)

    Bonnet, Philippe; Gehrke, Johannes; Seshadri, Praveen

    2001-01-01

These systems lack flexibility because data is extracted in a predefined way; also, they do not scale to a large number of devices because large volumes of raw data are transferred regardless of the queries that are submitted. In our new concept of sensor database system, queries dictate which data is extracted from the sensors. In this paper, we define the concept of sensor databases mixing stored data represented as relations and sensor data represented as time series. Each long-running query formulated over a sensor database defines a persistent view, which is maintained during a given time interval. We also describe the design and implementation of the COUGAR sensor database system.
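The query-driven idea in this record can be sketched in a few lines: instead of streaming all raw readings to a central store, data is pulled only from the sensors a query actually references. This is a minimal illustrative sketch, not the COUGAR implementation; all names and values are hypothetical.

```python
import statistics

def make_sensor(reading):
    # A real sensor would sample hardware here; we return a fixed value.
    return lambda: reading

# Hypothetical fleet of sensors, keyed by id.
sensors = {"s1": make_sensor(20.0), "s2": make_sensor(30.0), "s3": make_sensor(10.0)}

def average_query(sensor_ids):
    """Sample only the sensors that this query references."""
    return statistics.mean(sensors[sid]() for sid in sensor_ids)

print(average_query(["s1", "s2"]))  # s3 is never sampled
```

A long-running query in this model would simply re-evaluate such a function over its time interval, maintaining the persistent view the abstract mentions.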

  8. Construction of a bibliographic information database and development of retrieval system for research reports in nuclear science and technology (II)

    International Nuclear Information System (INIS)

    Han, Duk Haeng; Kim, Tae Whan; Choi, Kwang; Yoo, An Na; Keum, Jong Yong; Kim, In Kwon

    1996-05-01

The major goal of this project is to construct a bibliographic information database in nuclear engineering and to develop a prototype retrieval system. To give easy access to microfiche research reports, this project accomplished the construction of a microfiche research reports database and the development of a retrieval system. The results of the project are as follows: 1. The microfiche research reports database was constructed by downloading from DOE Energy, NTIS, and INIS. 2. The retrieval system was developed in host and web versions using access points such as title, abstract, keyword, and report number. 6 tabs., 8 figs., 11 refs. (Author)

  9. Construction of a bibliographic information database and development of retrieval system for research reports in nuclear science and technology (II)

    Energy Technology Data Exchange (ETDEWEB)

    Han, Duk Haeng; Kim, Tae Whan; Choi, Kwang; Yoo, An Na; Keum, Jong Yong; Kim, In Kwon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-05-01

The major goal of this project is to construct a bibliographic information database in nuclear engineering and to develop a prototype retrieval system. To give easy access to microfiche research reports, this project accomplished the construction of a microfiche research reports database and the development of a retrieval system. The results of the project are as follows: 1. The microfiche research reports database was constructed by downloading from DOE Energy, NTIS, and INIS. 2. The retrieval system was developed in host and web versions using access points such as title, abstract, keyword, and report number. 6 tabs., 8 figs., 11 refs. (Author)

  10. JOYO MK-II core characteristics database

    International Nuclear Information System (INIS)

    Ohkawachi, Yasushi; Maeda, Shigetaka; Sekine, Takashi; Aoyama, Takafumi

    2003-04-01

The 'JOYO' MK-II core characteristics database was compiled and published in 1998. Comments and requests from many users led to the creation of a revised edition. The revisions include changes to the MAGI calculation code system to use the 70-group JFS-3-J3.2 constant set processed from the JENDL-3.2 library. Total control rod worth, reactor kinetic parameters and the MK-II core performance test results were included per users' requests. The core characteristics obtained from the 32nd to 35th operational cycles, which were conducted in the MK-III transition core, were newly added in this revised version. The MK-II core management data and core characteristics data were recorded to CD-ROM for user convenience. The 'Configuration Data' include the core arrangement and refueling record for each operational cycle. The 'Subassembly Library Data' include the atomic number density, neutron fluence, burn-up, and integral power of 362 driver fuel subassemblies and 69 irradiation test subassemblies. The 'Output Data' contain the calculated neutron flux, gamma flux, power density, linear heat rate, and coolant and fuel temperature distributions of all the fuel subassemblies at the beginning and end of each operational cycle. The 'Core Characteristics Data' include the measured excess reactivity, control rod worth calibration curves, and reactivity coefficients of temperature, power and burn-up. (author)

  11. Conditions Database for the Belle II Experiment

    Science.gov (United States)

    Wood, L.; Elsethagen, T.; Schram, M.; Stephan, E.

    2017-10-01

    The Belle II experiment at KEK is preparing for first collisions in 2017. Processing the large amounts of data that will be produced will require conditions data to be readily available to systems worldwide in a fast and efficient manner that is straightforward for both the user and maintainer. The Belle II conditions database was designed with a straightforward goal: make it as easily maintainable as possible. To this end, HEP-specific software tools were avoided as much as possible and industry standard tools used instead. HTTP REST services were selected as the application interface, which provide a high-level interface to users through the use of standard libraries such as curl. The application interface itself is written in Java and runs in an embedded Payara-Micro Java EE application server. Scalability at the application interface is provided by use of Hazelcast, an open source In-Memory Data Grid (IMDG) providing distributed in-memory computing and supporting the creation and clustering of new application interface instances as demand increases. The IMDG provides fast and efficient access to conditions data via in-memory caching.
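The caching pattern described above (a REST interface fronted by an in-memory data grid) can be sketched with a plain dictionary standing in for the IMDG layer. The endpoint URL and parameter names below are illustrative assumptions, not the actual Belle II API.

```python
from urllib.parse import urlencode

# Hypothetical REST endpoint for conditions payloads.
BASE_URL = "https://conditions.example.org/v2/iovPayloads"

_cache = {}  # query URL -> previously fetched payload (stand-in for the IMDG)

def payload_url(global_tag, experiment, run):
    """Build the query URL for the payloads valid for a given run."""
    query = urlencode({"gtName": global_tag, "expNumber": experiment,
                       "runNumber": run})
    return f"{BASE_URL}?{query}"

def get_payload(global_tag, experiment, run, fetch):
    """Return cached conditions data, calling fetch(url) only on a cache miss."""
    url = payload_url(global_tag, experiment, run)
    if url not in _cache:
        _cache[url] = fetch(url)
    return _cache[url]
```

Because the interface is plain HTTP, `fetch` could be anything from `curl` to a standard-library request; repeated lookups for the same run hit only the cache.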

  12. The Belle II VXD production database

    Energy Technology Data Exchange (ETDEWEB)

    Valentan, Manfred; Ritter, Martin [Max-Planck-Institut fuer Physik, Muenchen (Germany); Wuerkner, Benedikt; Leitl, Bernhard [Institut fuer Hochenergiephysik, Wien (Austria); Pilo, Federico [Istituto Nazionale di Fisica Nucleare, Pisa (Italy); Collaboration: Belle II-Collaboration

    2015-07-01

The construction and commissioning of the Belle II Vertex Detector (VXD) is a huge endeavor involving a large number of valuable components. Both subsystems PXD (Pixel Detector) and SVD (Silicon Vertex Detector) deploy a large number of sensors, readout electronic parts and mechanical elements. These items are scattered around the world at many institutes, where they are built, measured and assembled. One has to keep track of measurement configurations and results, know at any time the location of the sensors, their processing state, quality, where they end up in an assembly, and who is responsible. These requirements call for a flexible and extensive database which is able to reflect the processes in the laboratories and the logistics between the institutes. This talk introduces the database requirements of a physics experiment using the PXD construction workflow as a showcase, and presents an overview of the database "HephyDb", which is used by the groups constructing the Belle II VXD.

  13. The PEP-II project-wide database

    International Nuclear Information System (INIS)

    Chan, A.; Calish, S.; Crane, G.; MacGregor, I.; Meyer, S.; Wong, J.

    1995-05-01

    The PEP-II Project Database is a tool for monitoring the technical and documentation aspects of this accelerator construction. It holds the PEP-II design specifications, fabrication and installation data in one integrated system. Key pieces of the database include the machine parameter list, magnet and vacuum fabrication data. CAD drawings, publications and documentation, survey and alignment data and property control. The database can be extended to contain information required for the operations phase of the accelerator and detector. Features such as viewing CAD drawing graphics from the database will be implemented in the future. This central Oracle database on a UNIX server is built using ORACLE Case tools. Users at the three collaborating laboratories (SLAC, LBL, LLNL) can access the data remotely, using various desktop computer platforms and graphical interfaces

  14. The magnet database system

    International Nuclear Information System (INIS)

    Ball, M.J.; Delagi, N.; Horton, B.; Ivey, J.C.; Leedy, R.; Li, X.; Marshall, B.; Robinson, S.L.; Tompkins, J.C.

    1992-01-01

The Test Department of the Magnet Systems Division of the Superconducting Super Collider Laboratory (SSCL) is developing a central database of SSC magnet information that will be available to all magnet scientists at the SSCL or elsewhere, via network connections. The database contains information on the magnets' major components, configuration information (specifying which individual items were used in each cable, coil, and magnet), measurements made at major fabrication stages, and the test results on completed magnets. These data will facilitate the correlation of magnet performance with the properties of its constituents. Recent efforts have focused on the development of procedures for user-friendly access to the data, including displays in the format of the production "traveler" data sheets, standard summary reports, and a graphical interface for ad hoc queries and plots.

  15. Research Directions in Database Security, II

    Science.gov (United States)

    1990-11-01

5 Flexible Access Controls: Bill Maimone of Oracle Corporation gave a presentation of Oracle's new roles facility. The approach is apparently motivated ... (See rule 5 for substitution of DAC mechanisms.) PS4: Overclassification of data is to be avoided. PS5: Authorization to update data and create ... of designing databases as opposed to the abstract nature of operating system requirements. The primary motivation behind developing the Homework ...

  16. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

This book constitutes the refereed proceedings of the 16th International Conference on Database and Expert Systems Applications, DEXA 2005, held in Copenhagen, Denmark, in August 2005. The 92 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 390 submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, XML schemata, query evaluation, semantic processing, information retrieval, temporal and spatial databases, querying XML, organisational aspects of databases, natural language processing, ontologies, Web data extraction, semantic Web, data stream management, data extraction, and distributed database systems.

  17. An XCT image database system

    International Nuclear Information System (INIS)

    Komori, Masaru; Minato, Kotaro; Koide, Harutoshi; Hirakawa, Akina; Nakano, Yoshihisa; Itoh, Harumi; Torizuka, Kanji; Yamasaki, Tetsuo; Kuwahara, Michiyoshi.

    1984-01-01

In this paper, an expansion of the X-ray CT (XCT) examination history database to an XCT image database is discussed. The XCT examination history database has been constructed and used for daily examination and investigation in our hospital. This database consists of alphanumeric information (locations, diagnoses and so on) for more than 15,000 cases, and to some of them we add tree-structured image data, a structure flexible enough for various types of image data. This database system is written in the MUMPS database manipulation language. (author)

  18. The magnet database system

    International Nuclear Information System (INIS)

    Baggett, P.; Delagi, N.; Leedy, R.; Marshall, W.; Robinson, S.L.; Tompkins, J.C.

    1991-01-01

    This paper describes the current status of MagCom, a central database of SSC magnet information that is available to all magnet scientists via network connections. The database has been designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will help magnet scientists to track and control the production process and to correlate the performance of magnets with the properties of their constituents

  19. Nuclear database management systems

    International Nuclear Information System (INIS)

    Stone, C.; Sutton, R.

    1996-01-01

    The authors are developing software tools for accessing and visualizing nuclear data. MacNuclide was the first software application produced by their group. This application incorporates novel database management and visualization tools into an intuitive interface. The nuclide chart is used to access properties and to display results of searches. Selecting a nuclide in the chart displays a level scheme with tables of basic, radioactive decay, and other properties. All level schemes are interactive, allowing the user to modify the display, move between nuclides, and display entire daughter decay chains

  20. Database Systems - Present and Future

    Directory of Open Access Journals (Sweden)

    2009-01-01

Full Text Available The database systems nowadays have an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet tends to develop worldwide. In the current informatics context, the development of applications with databases is the work of specialists. Using databases, accessing a database from various applications, and some related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area regarding the fundamental database systems issues, which are necessary in order to train specialists in economic informatics higher education. The database systems integrate and interfere with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a set of minimum, mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, data structures. The article also presents the actual trends in the evolution of database systems, in the context of economic informatics.

  1. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy...

  2. A Sandia telephone database system

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, S.D.; Tolendino, L.F.

    1991-08-01

    Sandia National Laboratories, Albuquerque, may soon have more responsibility for the operation of its own telephone system. The processes that constitute providing telephone service can all be improved through the use of a central data information system. We studied these processes, determined the requirements for a database system, then designed the first stages of a system that meets our needs for work order handling, trouble reporting, and ISDN hardware assignments. The design was based on an extensive set of applications that have been used for five years to manage the Sandia secure data network. The system utilizes an Ingres database management system and is programmed using the Application-By-Forms tools.

  3. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 10 figs

  4. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. The data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 9 figs

  5. Management system of instrument database

    International Nuclear Information System (INIS)

    Zhang Xin

    1997-01-01

The author introduces a management system for an instrument database. This system has been developed with FoxPro on a network. The system has features such as a clear structure, easy operation, and flexible and convenient queries, as well as data safety and reliability.

  6. JT-60 database system, 2

    International Nuclear Information System (INIS)

    Itoh, Yasuhiro; Kurihara, Kenichi; Kimura, Toyoaki.

    1987-07-01

The JT-60 central control system, 'ZENKEI', collects the control and instrumentation data relevant to discharges and the device status data for plant monitoring. The former, the engineering data, amounts to about 3 Mbytes per discharge shot. The 'ZENKEI' control system, which consists of seven minicomputers for on-line real-time control, has little capacity for handling such a large amount of data for physical and engineering analysis. In order to solve this problem, it was planned to establish the experimental database on the Front-end Processor (FEP) of the general-purpose large computer in the JAERI Computer Center. A database management system (DBMS), therefore, has been developed for creating the database during the shot interval. The engineering data are transferred from 'ZENKEI' to the FEP through the dedicated communication line after each shot. A hierarchical data model has been adopted in this database, which consists of data files in a tree structure with three keys: system, discharge type and shot number. The JT-60 DBMS provides data handling packages of subroutines for interfacing the database with users' application programs. Subroutine packages for supporting graphic processing and an access control function for database security are also provided in this DBMS. (author)
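The three-key tree structure described in this record (system, discharge type, shot number) maps naturally onto nested dictionaries. This is a hypothetical sketch of the idea only; the key names and stored values are illustrative, not JT-60's actual file layout.

```python
# Tree of data files keyed by system -> discharge type -> shot number.
database = {}

def store(system, discharge_type, shot, datafile):
    """Insert a data file at its place in the three-level tree."""
    database.setdefault(system, {}).setdefault(discharge_type, {})[shot] = datafile

def lookup(system, discharge_type, shot):
    """Retrieve a data file by its three keys."""
    return database[system][discharge_type][shot]

store("diagnostics", "limiter", 1234, {"size_mb": 3})
print(lookup("diagnostics", "limiter", 1234))
```

An application program would go through interface functions like these rather than touching the file layout directly, which is the point of surrounding the data files with interfaces.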

  7. Coordinating Mobile Databases: A System Demonstration

    OpenAIRE

    Zaihrayeu, Ilya; Giunchiglia, Fausto

    2004-01-01

In this paper we present the Peer Database Management System (PDBMS). This system runs on top of a standard database management system and allows it to connect its database with other (peer) databases on the network. A particularity of our solution is that PDBMS allows conventional database technology to be effectively operational in mobile settings. We think of database mobility as a database network, where databases appear and disappear spontaneously and their network access point...

  8. Jelly Views : Extending Relational Database Systems Toward Deductive Database Systems

    Directory of Open Access Journals (Sweden)

    Igor Wojnicki

    2004-01-01

Full Text Available This paper regards the Jelly View technology, which provides a new, practical methodology for knowledge decomposition, storage, and retrieval within Relational Database Management Systems (RDBMS). Intensional Knowledge clauses (rules) are decomposed and stored in the RDBMS, forming reusable components. The results of the rule-based processing are visible as regular views, accessible through SQL. From the end-user point of view the processing capability becomes unlimited (arbitrarily complex queries can be constructed using Intensional Knowledge), while the outermost queries are expressed in standard SQL. The RDBMS functionality thus becomes extended toward that of Deductive Databases.
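The rules-as-views idea can be demonstrated with any SQL engine: a logical rule is compiled into a view, and end users query the view with ordinary SQL. The sketch below uses SQLite and a classic grandparent rule; it illustrates the general technique, not the Jelly View implementation itself.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parent (child TEXT, parent TEXT)")
conn.executemany("INSERT INTO parent VALUES (?, ?)",
                 [("ann", "bob"), ("bob", "carl")])

# Rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z)
# stored as a regular view, so the derived relation is queryable via SQL.
conn.execute("""
CREATE VIEW grandparent AS
SELECT p1.child AS child, p2.parent AS grandparent
FROM parent p1 JOIN parent p2 ON p1.parent = p2.child
""")

rows = conn.execute("SELECT * FROM grandparent").fetchall()
print(rows)
```

A non-recursive rule compiles directly to a join like this; recursive rules would need recursive common table expressions or iteration, which is where a deductive-database layer earns its keep.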

  9. JT-60 database system, 1

    International Nuclear Information System (INIS)

    Kurihara, Kenichi; Kimura, Toyoaki; Itoh, Yasuhiro.

    1987-07-01

A sufficient software environment naturally makes it possible to analyse the discharge result data effectively. JT-60 discharge result data, collected by the supervisor, are transferred to the general-purpose computer through the new linkage channel and converted into a database. The data files in the database were designed to be surrounded by various interfaces. This structure preserves data file reliability and does not require users to know the data file structure. In addition, a support system for graphic processing was developed so that users may easily obtain figures with some calculations. This paper reports on the basic concept and system design. (author)

  10. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  11. Nuclear technology databases and information network systems

    International Nuclear Information System (INIS)

    Iwata, Shuichi; Kikuchi, Yasuyuki; Minakuchi, Satoshi

    1993-01-01

This paper describes databases related to nuclear (science) technology and information networks. The following contents are collected in this paper: the databases developed by JAERI, ENERGY NET, ATOM NET, the NUCLEN nuclear information database, INIS, the NUclear Code Information Service (NUCLIS), the Social Application of Nuclear Technology Accumulation project (SANTA), the Nuclear Information Database/Communication System (NICS), the reactor materials database, the radiation effects database, the NucNet European nuclear information database, and the reactor dismantling database. (J.P.N.)

  12. ECG-ViEW II, a freely accessible electrocardiogram database

    Science.gov (United States)

    Park, Man Young; Lee, Sukhoon; Jeon, Min Seok; Yoon, Dukyong; Park, Rae Woong

    2017-01-01

    The Electrocardiogram Vigilance with Electronic data Warehouse II (ECG-ViEW II) is a large, single-center database comprising numeric parameter data of the surface electrocardiograms of all patients who underwent testing from 1 June 1994 to 31 July 2013. The electrocardiographic data include the test date, clinical department, RR interval, PR interval, QRS duration, QT interval, QTc interval, P axis, QRS axis, and T axis. These data are connected with patient age, sex, ethnicity, comorbidities, age-adjusted Charlson comorbidity index, prescribed drugs, and electrolyte levels. This longitudinal observational database contains 979,273 electrocardiograms from 461,178 patients over a 19-year study period. This database can provide an opportunity to study electrocardiographic changes caused by medications, disease, or other demographic variables. ECG-ViEW II is freely available at http://www.ecgview.org. PMID:28437484
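With a database of numeric intervals like the one above, derived quantities can be computed directly. As one worked example, the rate-corrected QT interval by Bazett's formula is QTc = QT / sqrt(RR), with RR expressed in seconds; the column names below are illustrative.

```python
import math

def bazett_qtc(qt_ms, rr_ms):
    """Rate-corrected QT (Bazett): QT in ms divided by sqrt(RR in seconds)."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

# At RR = 1000 ms (60 bpm) the correction is neutral: QTc equals QT.
print(bazett_qtc(400, 1000))  # 400.0
```

Studies of drug- or electrolyte-induced QT changes, which the abstract mentions as a use case, typically work with such corrected values rather than raw QT.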

  13. Database usage and performance for the Fermilab Run II experiments

    International Nuclear Information System (INIS)

    Bonham, D.; Box, D.; Gallas, E.; Guo, Y.; Jetton, R.; Kovich, S.; Kowalkowski, J.; Kumar, A.; Litvintsev, D.; Lueking, L.; Stanfield, N.; Trumbo, J.; Vittone-Wiersma, M.; White, S.P.; Wicklund, E.; Yasuda, T.; Maksimovic, P.

    2004-01-01

The Run II experiments at Fermilab, CDF and D0, have extensive database needs covering many areas of their online and offline operations. Delivering data to users and processing farms worldwide has represented major challenges to both experiments. The range of applications employing databases includes calibration (conditions), trigger information, run configuration, run quality, luminosity, data management, and others. Oracle is the primary database product being used for these applications at Fermilab and some of its advanced features have been employed, such as table partitioning and replication. There is also experience with open source database products such as MySQL for secondary databases used, for example, in monitoring. Tools employed for monitoring the operation and diagnosing problems are also described

  14. Interconnecting heterogeneous database management systems

    Science.gov (United States)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  15. Experimental database retrieval system 'DARTS'

    International Nuclear Information System (INIS)

    Aoyagi, Tetsuo; Tani, Keiji; Haginoya, Hirobumi; Naito, Shinjiro.

    1989-02-01

    In JT-60, a large tokamak device of the Japan Atomic Energy Research Institute (JAERI), a plasma is fired for 5 ∼ 10 seconds at intervals of about 10 minutes. Each firing is called a shot. Plasma diagnostic data are edited into the JT-60 experimental database at every shot cycle and are stored in a large-scale computer (FACOM-M780). Experimentalists look up the data for the specific shots which they want to analyze and consider. As the total number of shots increases, this look-up work becomes increasingly difficult. So that they can easily access their objective shot data or shot-group data from a computer terminal, 'DARTS' (DAtabase ReTrieval System) has been developed. This report provides enough information on DARTS handling for users. (author)

  16. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: Service-level requirements and risk management Building and evolving an architecture for operational visibility ...

  17. Column-oriented database management systems

    OpenAIRE

    Možina, David

    2013-01-01

    In the following thesis I will present column-oriented databases. Among other things, I will answer the question of why there is a need for a column-oriented database. In recent years there has been a lot of attention on column-oriented databases, even though the existence of columnar database management systems dates back to the early seventies of the last century. I will compare both systems for database management – a column-oriented database system and a row-oriented database system ...

  18. A Relational Database System for Student Use.

    Science.gov (United States)

    Fertuck, Len

    1982-01-01

    Describes an APL implementation of a relational database system suitable for use in a teaching environment in which database development and database administration are studied, and discusses the functions of the user and the database administrator. An appendix illustrating system operation and an eight-item reference list are attached. (Author/JL)

  19. The TJ-II Relational Database Access Library: A User's Guide

    International Nuclear Information System (INIS)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-01-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides in a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data, via remote procedure call, from user programs running on computers connected to the TJ-II local area networks. This document provides a general description of the database and its organization, along with a detailed description of the functions included in the library and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs

  20. Security Research on Engineering Database System

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The engine engineering database system is a CAD-oriented database management system with the capability of managing distributed data. The paper discusses the security of the engine engineering database management system (EDBMS). Through study and analysis of database security, a series of security rules is derived that reaches the B1-level security standard, comprising discretionary access control (DAC), mandatory access control (MAC) and audit. The EDBMS implements functions of DAC, ...

  1. Content And Multimedia Database Management Systems

    NARCIS (Netherlands)

    de Vries, A.P.

    1999-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. The main characteristic of the ‘database approach’ is that it increases the value of data by its emphasis on data

  2. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  6. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  7. Security aspects of database systems implementation

    OpenAIRE

    Pokorný, Tomáš

    2009-01-01

    The aim of this thesis is to provide a comprehensive overview of database systems security. The reader is introduced to the basics of information security and its development. The following chapter defines the concept of database system security using the ISO/IEC 27000 standard. The findings from this chapter form a comprehensive list of requirements on database security. One chapter also deals with the legal aspects of this domain. The second part of this thesis offers a comparison of four object-relational database systems...

  8. The PEP-II/BaBar Project-Wide Database using World Wide Web and Oracle*Case

    International Nuclear Information System (INIS)

    Chan, A.; Crane, G.; MacGregor, I.; Meyer, S.

    1995-12-01

    The PEP-II/BaBar Project Database is a tool for monitoring the technical and documentation aspects of the accelerator and detector construction. It holds the PEP-II/BaBar design specifications, fabrication and installation data in one integrated system. Key pieces of the database include the machine parameter list, components fabrication and calibration data, survey and alignment data, property control, CAD drawings, publications and documentation. This central Oracle database on a UNIX server is built using Oracle*Case tools. Users at the collaborating laboratories mainly access the data using World Wide Web (WWW). The Project Database is being extended to link to legacy databases required for the operations phase

  9. The NCBI BioSystems database.

    Science.gov (United States)

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.

  10. Reexamining Operating System Support for Database Management

    OpenAIRE

    Vasil, Tim

    2003-01-01

    In 1981, Michael Stonebraker [21] observed that database management systems written for commodity operating systems could not effectively take advantage of key operating system services, such as buffer pool management and process scheduling, due to expensive overhead and lack of customizability. The “not quite right” fit between these kernel services and the demands of database systems forced database designers to work around such limitations or re-implement some kernel functionality in user ...

  11. An event-oriented database for continuous data flows in the TJ-II environment

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E. [Asociacion Euratom/CIEMAT para Fusion Madrid, 28040 Madrid (Spain)], E-mail: edi.sanchez@ciemat.es; Pena, A. de la; Portas, A.; Pereira, A.; Vega, J. [Asociacion Euratom/CIEMAT para Fusion Madrid, 28040 Madrid (Spain); Neto, A.; Fernandes, H. [Associacao Euratom/IST, Centro de Fusao Nuclear, Avenue Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    A new database for storing data related to the TJ-II experiment has been designed and implemented. It allows the storage of raw data not acquired during plasma shots, i.e. data collected continuously or between plasma discharges while testing subsystems (e.g. during neutral beam test pulses). This new database complements already existing ones by permitting the storage of raw data that are not classified by shot number. Rather these data are indexed according to a more general entity entitled event. An event is defined as any occurrence relevant to the TJ-II environment. Such occurrences are registered thus allowing relationships to be established between data acquisition, TJ-II control-system and diagnostic control-system actions. In the new database, raw data are stored in files on the TJ-II UNIX central server disks while meta-data are stored in Oracle tables thereby permitting fast data searches according to different criteria. In addition, libraries for registering data/events in the database from different subsystems within the laboratory local area network have been developed. Finally, a Shared Data Access System has been implemented for external access to data. It permits both new event-indexed as well as old data (indexed by shot number) to be read from a common event perspective.
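The storage scheme described above (raw data in files, searchable metadata in relational tables, indexed by event rather than shot number) can be sketched as follows. SQLite stands in for the Oracle tables used at TJ-II, and the table, column and path names are assumptions for illustration only.

```python
import sqlite3

# In-memory stand-in for the metadata tables; raw data would stay in files
# on the central server, referenced by path.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE event (
    id INTEGER PRIMARY KEY,
    kind TEXT,          -- e.g. 'plasma_shot', 'nbi_test_pulse' (hypothetical)
    t_start REAL,       -- event timestamp
    raw_path TEXT       -- location of the raw data file
)""")

def register_event(kind, t_start, raw_path):
    """Register an occurrence and link it to its raw data file."""
    conn.execute(
        "INSERT INTO event (kind, t_start, raw_path) VALUES (?, ?, ?)",
        (kind, t_start, raw_path))

register_event("plasma_shot", 1000.0, "/data/shots/12345.raw")
register_event("nbi_test_pulse", 1500.0, "/data/nbi/0007.raw")

# Fast metadata search by criteria, without touching the raw files:
paths = [r[0] for r in conn.execute(
    "SELECT raw_path FROM event WHERE kind = ?", ("nbi_test_pulse",))]
```

The point of the split is that searches by arbitrary criteria run against small indexed tables, while bulk raw data are fetched only once the matching events are known.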

  12. Airports and Navigation Aids Database System -

    Data.gov (United States)

    Department of Transportation — Airport and Navigation Aids Database System is the repository of aeronautical data related to airports, runways, lighting, NAVAID and their components, obstacles, no...

  13. Development of a PSA information database system

    International Nuclear Information System (INIS)

    Kim, Seung Hwan

    2005-01-01

    The need to develop a PSA information database has been growing rapidly; performing a PSA requires a lot of data to analyze, to evaluate the risk, to trace the derivation of results, and to verify the results. A PSA information database is a system that stores all PSA-related information in a database and file system, with cross links to jump to the physical documents whenever they are needed. The Korea Atomic Energy Research Institute is developing a PSA information database system, AIMS (Advanced Information Management System for PSA). The objective is to integrate and computerize all the distributed information of a PSA into one system and to enhance the accessibility of PSA information for all PSA-related activities. This paper describes how we implemented such a database-centered application in view of two areas: database design and data (document) service

  14. Microcomputer Database Management Systems for Bibliographic Data.

    Science.gov (United States)

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  15. Performance Enhancements for Advanced Database Management Systems

    OpenAIRE

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  16. Super Natural II--a database of natural products.

    Science.gov (United States)

    Banerjee, Priyanka; Erehman, Jevgeni; Gohlke, Björn-Oliver; Wilhelm, Thomas; Preissner, Robert; Dunkel, Mathias

    2015-01-01

    Natural products play a significant role in drug discovery and development. Many topological pharmacophore patterns are common between natural products and commercial drugs. A better understanding of the specific physicochemical and structural features of natural products is important for corresponding drug development. Several encyclopedias of natural compounds have been composed, but the information remains scattered or not freely available. The first version of the Supernatural database containing ∼ 50,000 compounds was published in 2006 to face these challenges. Here we present a new, updated and expanded version of natural product database, Super Natural II (http://bioinformatics.charite.de/supernatural), comprising ∼ 326,000 molecules. It provides all corresponding 2D structures, the most important structural and physicochemical properties, the predicted toxicity class for ∼ 170,000 compounds and the vendor information for the vast majority of compounds. The new version allows a template-based search for similar compounds as well as a search for compound names, vendors, specific physical properties or any substructures. Super Natural II also provides information about the pathways associated with synthesis and degradation of the natural products, as well as their mechanism of action with respect to structurally similar drugs and their target proteins. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. Column-Oriented Database Systems (Tutorial)

    OpenAIRE

    Abadi, D.; Boncz, Peter; Harizopoulos, S.

    2009-01-01

    textabstractColumn-oriented database systems (column-stores) have attracted a lot of attention in the past few years. Column-stores, in a nutshell, store each database table column separately, with attribute values belonging to the same column stored contiguously, compressed, and densely packed, as opposed to traditional database systems that store entire records (rows) one after the other. Reading a subset of a table’s columns becomes faster, at the potential expense of excessive disk-head s...
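The row-store versus column-store layout contrast described in this abstract can be made concrete with a small sketch; the table and values are invented for illustration.

```python
# A tiny table of records (id, name, age).
rows = [
    (1, "alice", 30),
    (2, "bob",   25),
    (3, "carol", 41),
]

# Row store: entire records stored one after another.
row_store = rows

# Column store: each column stored contiguously.
col_store = {
    "id":   [r[0] for r in rows],
    "name": [r[1] for r in rows],
    "age":  [r[2] for r in rows],
}

# Reading a subset of a table's columns touches only those arrays --
# the column-store advantage for analytical scans:
ages = col_store["age"]               # one contiguous read, no names or ids
avg_age = sum(ages) / len(ages)
```

Contiguous per-column storage is also what makes the compression and dense packing mentioned in the abstract effective, since values in one column share a type and often a narrow value range.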

  18. MIMIC II: a massive temporal ICU patient database to support research in intelligent patient monitoring

    Science.gov (United States)

    Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.

    2002-01-01

    Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data is continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records including over 120 gigabytes of data and is growing. A customized archiving system was used to store continuously up to four waveforms and 30 different parameters from ICU patient monitors. An integrated user-friendly relational database was developed for browsing of patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge-discovery.

  19. E-READING II: words database for reading by students from Basic Education II.

    Science.gov (United States)

    Oliveira, Adriana Marques de; Capellini, Simone Aparecida

    2016-01-01

    To develop a database of words of high, medium and low frequency in reading for Basic Education II. The words were taken from the teaching material for Portuguese Language, used by the teaching network of the State of São Paulo in the 6th to the 9th year of Basic Education. Only nouns were selected. The frequency with which each word occurred was recorded and a single database was created. In order to classify the words as of high, medium and low frequency, the decision was taken to work with the distribution terciles, mean frequency and the cutoff point of the terciles. In order to ascertain whether the words of high, medium and low frequency corresponded to this classification, 224 students were assessed: G1 (6th year, n= 61); G2 (7th year, n= 44); G3 (8th year, n= 65); and G4 (9th year, n= 54). The lists of words were presented to the students for reading out loud, in two sessions: 1st) words of high and medium frequency and 2nd) words of low-frequency. Words which encompassed the exclusion criteria, or which caused discomfort or joking on the part of the students, were excluded. The word database was made up of 1659 words and was titled 'E - LEITURA II' ('E-READING II', in English). The E-LEITURA II database is a useful resource for the professionals, as it provides a database which can be used for research, educational and clinical purposes among students of Basic Education II. The professional can choose the words according to her objectives and criteria for elaborating evaluation or intervention procedures involving reading.
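The tercile-based frequency classification described above can be sketched as follows. The word counts here are invented, and the use of `statistics.quantiles` is one reasonable reading of "distribution terciles ... and the cutoff point of the terciles", not the authors' actual procedure.

```python
import statistics

# Hypothetical word-frequency counts (word -> occurrences in the corpus).
freqs = {"casa": 120, "rio": 80, "flor": 30, "pedra": 55, "sol": 200, "lua": 10}

# Tercile cut points of the frequency distribution.
t1, t2 = statistics.quantiles(freqs.values(), n=3)

def band(count):
    """Classify a frequency count as low, medium or high."""
    if count <= t1:
        return "low"
    elif count <= t2:
        return "medium"
    return "high"

classified = {word: band(count) for word, count in freqs.items()}
```

With real corpus counts, the same two cut points would partition the full 1659-word database into the three frequency lists used in the reading sessions.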

  20. Column-Oriented Database Systems (Tutorial)

    NARCIS (Netherlands)

    D. Abadi; P.A. Boncz (Peter); S. Harizopoulos

    2009-01-01

    textabstractColumn-oriented database systems (column-stores) have attracted a lot of attention in the past few years. Column-stores, in a nutshell, store each database table column separately, with attribute values belonging to the same column stored contiguously, compressed, and densely packed, as

  1. Development of JOYO MK-II core characteristics database

    International Nuclear Information System (INIS)

    Tabuchi, Shiro; Aoyama, Takafumi

    2000-01-01

    The MK-II core of the experimental fast reactor JOYO served for 15 years from 1982 as the irradiation bed for testing fuels and materials for FBR development. During the MK-II operation, extensive data were accumulated from the core management calculations and characteristics tests conducted in thirty-one duty operations and thirteen special test operations. These core management data and core characteristics data were compiled into a database recorded on CD-ROM for user convenience. The calculated core management data are text-format data. The 'Configuration Data' include the history of the fuel exchange and core arrangement for each cycle. The 'Subassembly Library Data' include the atomic number density, neutron fluence, burn-up and integral power of about 300 fuel subassemblies and 60 irradiation subassemblies. The 'Output Data' include the neutron fluxes, gamma fluxes, power density, linear heat rates, and coolant and fuel temperature distributions of each core position at the beginning and end of each cycle. The measured core characteristics data, such as the excess reactivity, control rod worths, temperature coefficient, power coefficient, and burn-up coefficient, are also included along with the measurement conditions. (J.P.N.)

  2. Issues in Big-Data Database Systems

    Science.gov (United States)

    2014-06-01

    that big data will not be manageable using conventional relational database technology, and it is true that alternative paradigms, such as NoSQL systems and search engines, have much to offer...scale well, and because integration with external data sources is so difficult. NoSQL systems are more open to this integration, and provide excellent

  3. Distributed Database Management Systems A Practical Approach

    CERN Document Server

    Rahimi, Saeed K

    2010-01-01

    This book addresses issues related to managing data across a distributed database system. It is unique because it covers traditional database theory and current research, explaining the difficulties in providing a unified user interface and global data dictionary. The book gives implementers guidance on hiding discrepancies across systems and creating the illusion of a single repository for users. It also includes three sample frameworks (implemented using J2SE with JMS, J2EE, and Microsoft .Net) that readers can use to learn how to implement a distributed database management system. IT and

  4. Armada, an Evolving Database System

    NARCIS (Netherlands)

    F.E. Groffen (Fabian)

    2009-01-01

    htmlabstractIn a world where data usage becomes more and more widespread, single system solutions are no longer adequate to meet the data requirements of today. No longer one monolithic system, but instead a group of smaller and cheaper ones have to manage the workload of the system, preferably

  5. The RMS program system and database

    International Nuclear Information System (INIS)

    Fisher, S.M.; Peach, K.J.

    1982-08-01

    This report describes the program system developed for the data reduction and analysis of data obtained with the Rutherford Multiparticle Spectrometer (RMS), with particular emphasis on the utility of a well structured central data-base. (author)

  6. Resource Survey Relational Database Management System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Mississippi Laboratories employ both enterprise and localized data collection systems for recording data. The databases utilized by these applications range from...

  7. Concurrency control in distributed database systems

    CERN Document Server

    Cellary, W; Gelenbe, E

    1989-01-01

    Distributed Database Systems (DDBS) may be defined as integrated database systems composed of autonomous local databases, geographically distributed and interconnected by a computer network.The purpose of this monograph is to present DDBS concurrency control algorithms and their related performance issues. The most recent results have been taken into consideration. A detailed analysis and selection of these results has been made so as to include those which will promote applications and progress in the field. The application of the methods and algorithms presented is not limited to DDBSs but a
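One classic family of concurrency-control algorithms covered in such monographs is two-phase locking (2PL). As a hedged sketch of the protocol shape only (a real DDBS must also handle distributed deadlock and commit protocols), a strict-2PL transaction acquires locks as it goes and releases them all only at commit:

```python
import threading

class Transaction:
    """Toy strict two-phase locking: no lock is released before commit."""

    def __init__(self, locks):
        self.locks = locks      # shared map: item name -> threading.Lock
        self.held = []

    def acquire(self, item):
        # Growing phase: locks are only ever added.
        self.locks[item].acquire()
        self.held.append(item)

    def commit(self):
        # Shrinking phase: every lock released at once, at commit.
        for item in reversed(self.held):
            self.locks[item].release()
        self.held.clear()

locks = {"x": threading.Lock(), "y": threading.Lock()}
data = {"x": 0, "y": 0}

t = Transaction(locks)
t.acquire("x")
t.acquire("y")
data["x"] += 1
data["y"] += 1
t.commit()
```

Holding all locks until commit is what makes the resulting schedules serializable and avoids cascading aborts, at the cost of reduced concurrency.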

  8. Database/Operating System Co-Design

    OpenAIRE

    Giceva, Jana

    2016-01-01

    We want to investigate how to improve the information flow between a database and an operating system, aiming for better scheduling and smarter resource management. We are interested in identifying the potential optimizations that can be achieved with a better interaction between a database engine and the underlying operating system, especially by allowing the application to get more control over scheduling and memory management decisions. Therefore, we explored some of the issues that arise ...

  9. Analysis of photovoltaic systems. Leadership/cooperation in Task II of the IEA Implementing Agreements Photovoltaic Power Systems, database operation. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Schreitmueller, K.; Niemann, M.; Decker, B.; Jahn, U.; Meyer, H.

    2000-02-01

    the analysis of PV systems of the different types addressed in the database are presented and discussed. (orig.) [Translated from German] Within this project, substantial contributions were made to the ongoing work in Task II of the 'Photovoltaic Power Systems Programme' (PVPS) of the International Energy Agency (IEA). A central task of Task II is to disseminate knowledge about the design and operational behaviour of photovoltaic (PV) systems and, by addressing suitable target groups, to advance the market introduction of PV technology. The project rests on the creation and updating of a decentralized database in which both detailed plant data of realized PV installations and monthly-resolved operating results of the monitored PV systems are collected and processed. The aim of the work is to provide information on the energy yields, reliability and costs of various PV systems from all over the world, and to develop guidelines for PV installations that are optimal with respect to operational behaviour and design. The decentralized, PC-based database was created and configured at the ISFH. The database software consists of the programs 'PVbase' and 'PVreport'. Extensive software tools allow the selection of PV installations, the graphical and statistical processing of the relevant plant data, and the export of calculated annual mean values. The database currently contains more than 260 PV installations with monthly data sets covering over 600 years of operation. Besides general information, it records the type of PV installation (grid-connected, stand-alone or hybrid), its mounting, the components used, and economic data. The database covers a large share of the PV installations realized worldwide with installed monitoring systems. The benefit of the continuously improved database lies in the dissemination of information

  10. The ATLAS Distributed Data Management System & Databases

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...
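The schema-less key-value idea mentioned above (HBase-style stores) can be illustrated with a plain dictionary as a stand-in: values are looked up by key, and each "row" may carry a different set of columns. The dataset keys and fields here are invented, not taken from DQ2.

```python
# key -> {column -> value}; no fixed schema is enforced.
store = {}

def put(key, **columns):
    """Insert or update columns for a key (schema-less upsert)."""
    store.setdefault(key, {}).update(columns)

def get(key, column):
    """Point lookup by key and column; None if absent."""
    return store.get(key, {}).get(column)

put("dataset:0001", site="CERN", size_gb=12.5)
put("dataset:0002", site="BNL")          # schema-less: no size recorded

site = get("dataset:0002", "site")
```

The trade-off sketched here is the one the abstract raises: point lookups by key scale easily across nodes, while ad hoc aggregation over all rows (the data-warehouse-like workload) needs separate machinery such as parallel scans over HDFS.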

  11. Database, expert systems, information retrieval

    International Nuclear Information System (INIS)

    Fedele, P.; Grandoni, G.; Mammarella, M.C.

    1989-12-01

    The great debate concerning the Italian high-school reform has induced a ferment of activity among the most interested and receptive people. This was clearly demonstrated by the course 'Innovazione metodologico-didattica e tecnologie informatiche' organized for the staff of the 'Istituto Professionale L. Einaudi' of Lamezia Terme. The course was an interesting opportunity for discussion and interaction between the world of the school and the computer technology used in the research field. This three-day course included theoretical and practical lessons, showing computer facilities that could be useful for teaching. During the practical lessons some computer tools were presented, from very simple electronic spreadsheets to more complicated interactive Information Retrieval applications on CD-ROM. The main topics, discussed later, are: Modelling, Databases, Integrated Information Systems, Expert Systems, Information Retrieval. (author)

  12. Deductive databases and P systems

    Directory of Open Access Journals (Sweden)

    Miguel A. Gutierrez-Naranjo

    2004-06-01

    Full Text Available In computational processes based on backwards chaining, a rule is seen as a procedure indicating that the problem in its head can be split into the subproblems in its body. In classical devices, the subproblems are solved sequentially. In this paper we present some questions that circulated during the Second Brainstorming Week related to applying the parallelism of P systems to computation based on backwards chaining, taking an inferential deductive process as the example.

  13. Generable PEARL-realtime-database system

    International Nuclear Information System (INIS)

    Plessmann, K.W.; Duif, V.; Angenendt, F.

    1983-06-01

    This database system has been designed with special consideration of the requirements of process-control applications. For that purpose the attribute 'time' is treated as the essential dimension of processes, affecting data treatment. In line with the varied requirements of process-control applications, the database system is generable, i.e. its size and set of functions can be adapted to each installation. The system is not tied to a single data model, so several data models can be implemented. Using PEARL for the implementation gives the system a high degree of portability. (orig.)

  14. Developing of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, J.; Ryu, W. S.

    2002-01-01

    Constructing a database from the data produced by tensile experiments increases the usefulness of the test results. Basic data can be retrieved easily from the database when a new experiment is being prepared, and higher-quality results can be produced by comparison with previous data. The analysis and design phases must be carried out in detail to construct the database; after that, the system can serve customers' various requirements with high quality. In this thesis, the tensile property database system was developed as a web application using the JSP (Java Server Pages) tool

  15. License - RED II INAHO | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available RED II INAHO License: License to Use This Database. Last updated: 2016/01/14. The license terms describe how you may use this database and the requirements you must follow in using it. The license for this database is the Creative Commons Attribution-Share Alike 4.0 International. If you use data from this database, please be sure to attribute this database. The text of the Creative Commons Attribution-Share Alike 4.0 International license is found here.

  16. Data management in the TJ-II multi-layer database

    International Nuclear Information System (INIS)

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.; Fabregas, J.A.; Herrera, R.

    2000-01-01

    The handling of TJ-II experimental data is performed by means of several software modules. These modules provide the resources for data capture, data storage and management, data access as well as general-purpose data visualisation. Here we describe the module related to data storage and management. We begin by introducing the categories in which data can be classified. Then, we describe the TJ-II data flow through the several file systems involved, before discussing the architecture of the TJ-II database. We review the concept of the 'discharge file' and identify the drawbacks that would result from a direct application of this idea to the TJ-II data. In order to overcome these drawbacks, we propose alternatives based on our concepts of signal family, user work-group and data priority. Finally, we present a model for signal storage. This model is in accordance with the database architecture and provides a proper framework for managing the TJ-II experimental data. In the model, the information is organised in layers and is distributed according to the generality of the information, from the common fields of all signals (first layer), passing through the specific records of signal families (second layer) and reaching the particular information of individual signals (third layer)
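    The three-layer organization can be illustrated with a small sketch; the attribute names and values below are invented, not TJ-II's actual fields. A signal's full description is the common layer, overridden by its family's layer, overridden by its own record.

```python
# Layer 1: fields common to all signals; layer 2: per-family records;
# layer 3: per-signal records. All names and values are illustrative.
COMMON = {"byte_order": "little", "compression": "delta"}          # layer 1
FAMILIES = {"magnetics": {"units": "T", "sample_rate_hz": 10000}}  # layer 2
SIGNALS = {"MID01": {"family": "magnetics", "calibration": 0.98}}  # layer 3

def signal_metadata(name):
    """Merge the three layers, most specific information last."""
    sig = SIGNALS[name]
    meta = dict(COMMON)
    meta.update(FAMILIES[sig["family"]])
    meta.update({k: v for k, v in sig.items() if k != "family"})
    return meta

print(signal_metadata("MID01")["units"])  # T, inherited from the family layer
```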

  17. An anomaly analysis framework for database systems

    NARCIS (Netherlands)

    Vavilis, S.; Egner, A.I.; Petkovic, M.; Zannone, N.

    2015-01-01

    Anomaly detection systems are usually employed to monitor database activities in order to detect security incidents. These systems raise an alert when anomalous activities are detected. The raised alerts have to be analyzed to timely respond to the security incidents. Their analysis, however, is

  18. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

    The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described, as are the 'books' and 'kits' level and the Universal Object Typer Management System level. The relation of the DAVID project to the Small Business Innovative Research (SBIR) program is explained.

  19. Similarity joins in relational database systems

    CERN Document Server

    Augsten, Nikolaus

    2013-01-01

    State-of-the-art database systems manage and process a variety of complex objects, including strings and trees. For such objects, equality comparisons are often not meaningful and must be replaced by similarity comparisons. This book describes the concepts and techniques needed to incorporate similarity into database systems. We start out by discussing the properties of strings and trees, and identify the edit distance as the de facto standard for comparing complex objects. Since the edit distance is computationally expensive, token-based distances have been introduced to speed up edit distance computations.
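    As a concrete illustration of the comparison the book centres on, here is a minimal Wagner-Fischer computation of the string edit distance, with a q-gram set as an example of the cheaper token-based representation used for filtering. This is a generic sketch, not code from the book.

```python
def edit_distance(a: str, b: str) -> int:
    """Wagner-Fischer dynamic programming, O(len(a) * len(b)) time."""
    prev = list(range(len(b) + 1))
    for i in range(1, len(a) + 1):
        cur = [i] + [0] * len(b)
        for j in range(1, len(b) + 1):
            cur[j] = min(prev[j] + 1,                              # deletion
                         cur[j - 1] + 1,                           # insertion
                         prev[j - 1] + (a[i - 1] != b[j - 1]))     # substitution
        prev = cur
    return prev[-1]

def qgrams(s: str, q: int = 2) -> set:
    """Token-based view: the set of overlapping q-grams of s."""
    return {s[i:i + q] for i in range(len(s) - q + 1)}

print(edit_distance("kitten", "sitting"))   # 3
print(qgrams("tree") & qgrams("three"))     # shared bigrams: {'re', 'ee'}
```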

  20. Function and organization of CPC database system

    International Nuclear Information System (INIS)

    Yoshida, Tohru; Tomiyama, Mineyoshi.

    1986-02-01

    It is very time-consuming and expensive work to develop computer programs. Therefore, it is desirable to effectively use the existing program. For this purpose, it is required for researchers and technical staffs to obtain the relevant informations easily. CPC (Computer Physics Communications) is a journal published to facilitate the exchange of physics programs and of the relevant information about the use of computers in the physics community. There are about 1300 CPC programs in JAERI computing center, and the number of programs is increasing. A new database system (CPC database) has been developed to manage the CPC programs and their information. Users obtain information about all the programs stored in the CPC database. Also users can find and copy the necessary program by inputting the program name, the catalogue number and the volume number. In this system, each operation is done by menu selection. Every CPC program is compressed and stored in the database; the required storage size is one third of the non-compressed format. Programs unused for a long time are moved to magnetic tape. The present report describes the CPC database system and the procedures for its use. (author)
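    The compress-on-store, decompress-on-retrieve scheme can be sketched in a few lines; zlib here is a stand-in for whatever compressor the JAERI system actually used, and the sample text is invented. The roughly 3:1 saving reported above depends entirely on how repetitive the stored source is.

```python
import zlib

def store(source: str) -> bytes:
    """Compress program text before writing it into the database."""
    return zlib.compress(source.encode("utf-8"), level=9)

def fetch(blob: bytes) -> str:
    """Decompress on retrieval; the round trip is lossless."""
    return zlib.decompress(blob).decode("utf-8")

source = "      PROGRAM DEMO\n      CALL SOLVE(X)\n" * 100  # repetitive source text
blob = store(source)
assert fetch(blob) == source            # nothing lost by compression
print(f"{len(source)} -> {len(blob)} bytes")
```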

  1. Nuclear integrated database and design advancement system

    International Nuclear Information System (INIS)

    Ha, Jae Joo; Jeong, Kwang Sub; Kim, Seung Hwan; Choi, Sun Young.

    1997-01-01

    The objective of NuIDEAS is to computerize design processes through an integrated database, eliminating the current work style of delivering hardcopy documents and drawings. The major research contents of NuIDEAS are the advancement of design processes by computerization, the establishment of a design database, and 3-dimensional visualization of design data. KSNP (Korea Standard Nuclear Power Plant) is the target of the legacy database and 3-dimensional model, so that they can be utilized in the next plant design. In the first year, the blueprint of NuIDEAS was proposed, and its prototype was developed by applying rapidly evolving computer technology. The major results of the first year of research were to establish the architecture of the integrated database ensuring data consistency, and to build the design database of the reactor coolant system and heavy components. Various software tools were also developed to search, share and utilize the data through networks, detailed 3-dimensional CAD models of nuclear fuel and heavy components were constructed, and walk-through simulations using the models were developed. This report contains the major additions and modifications to the object-oriented database and associated programs, using methods and JavaScript. (author). 36 refs., 1 tab., 32 figs

  2. 9th Asian Conference on Intelligent Information and Database Systems

    CERN Document Server

    Nguyen, Ngoc; Shirai, Kiyoaki

    2017-01-01

    This book presents recent research in intelligent information and database systems. The carefully selected contributions were initially accepted for presentation as posters at the 9th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2017), held in April 2017 in Kanazawa, Japan. While the contributions are of an advanced scientific level, several are accessible for non-expert readers. The book brings together 47 chapters divided into six main parts: • Part I. From Machine Learning to Data Mining, • Part II. Big Data and Collaborative Decision Support Systems, • Part III. Computer Vision Analysis, Detection, Tracking and Recognition, • Part IV. Data-Intensive Text Processing, • Part V. Innovations in Web and Internet Technologies, and • Part VI. New Methods and Applications in Information and Software Engineering. The book is an excellent resource for researchers and those working in algorithmics, artificial and computational intelligence, collaborative systems, decisio...

  3. DAD - Distributed Adamo Database system at Hermes

    International Nuclear Information System (INIS)

    Wander, W.; Dueren, M.; Ferstl, M.; Green, P.; Potterveld, D.; Welch, P.

    1996-01-01

    Software development for the HERMES experiment faces the challenges of many other experiments in modern High Energy Physics: complex data structures and relationships have to be processed at high I/O rates. Experimental control and data analysis are done in a distributed environment of CPUs with various operating systems and require access to different time-dependent databases, such as calibration and geometry. Slow control and experiment control need flexible inter-process communication. Program development is done in different programming languages, where interfaces to the libraries should not restrict the capabilities of the language. The need to handle complex data structures is fulfilled by the ADAMO entity-relationship model. Mixed-language programming is provided using the CFORTRAN package. DAD, the Distributed ADAMO Database library, was developed to provide the required I/O and database functionality. (author)

  4. LHCb Conditions Database Operation Assistance Systems

    CERN Multimedia

    Shapoval, Illya

    2012-01-01

    The Conditions Database of the LHCb experiment (CondDB) provides versioned, time dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger, reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues: - an extension to the automatic content validation done by the “Oracle Streams” replication technology, to trap cases when the replication was unsuccessful; - an automated distribution process for the S...

  5. Aerospace Systems Monitor, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Proposal Title: Aerospace Systems Monitor PHASE 1 Technical Abstract: This Phase II STTR project will continue development and commercialization of the Aerospace...

  6. Stress Testing of Transactional Database Systems

    OpenAIRE

    Meira , Jorge Augusto; Cunha De Almeida , Eduardo; Sunyé , Gerson; Le Traon , Yves; Valduriez , Patrick

    2013-01-01

    Transactional database management systems (DBMS) have been successful at supporting traditional transaction processing workloads. However, web-based applications that tend to generate huge numbers of concurrent business operations are pushing DBMS performance over their limits, thus threatening overall system availability. Then, a crucial question is how to test DBMS performance under heavy workload conditions. Answering this question requires a testing methodology to ...

  7. SPIRE Data-Base Management System

    Science.gov (United States)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. The data bases are typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. The SPIRE DBMS is designed to support data requests from interactive users as well as applications programs.

  8. Expert database system for quality control

    Science.gov (United States)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today. Markets are not homogeneous; they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs and, above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating the knowledge of experts in operations, quality management and computer systems, uses all information relevant to quality management, facts as well as rules, to determine whether a product meets quality standards. Keywords: expert system, quality control, database
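    The facts-plus-rules decision the abstract describes can be sketched as a simple rule check; the measurements and tolerances below are invented for illustration, not taken from QCEDS.

```python
def meets_standards(facts: dict, rules) -> bool:
    """A product passes only if every quality rule holds for its measured facts."""
    return all(rule(facts) for rule in rules)

# Illustrative rules: tolerance bounds and a defect count (assumed values).
rules = [
    lambda m: 9.8 <= m["diameter_mm"] <= 10.2,   # dimensional tolerance
    lambda m: m["surface_defects"] == 0,         # visual inspection result
]

print(meets_standards({"diameter_mm": 10.0, "surface_defects": 0}, rules))  # True
print(meets_standards({"diameter_mm": 10.4, "surface_defects": 0}, rules))  # False
```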

  9. Database management system for large container inspection system

    International Nuclear Information System (INIS)

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    Large Container Inspection System (LCIS) based on radiation imaging technology is a powerful tool for the Customs to check the contents inside a large container without opening it. The authors discuss a database application system, as part of the Signal and Image System (SIS), for the LCIS. A basic requirements analysis was done first. Then the computer hardware, operating system, and database management system were selected according to the state of technology and available market products. Based on the above considerations, a database application system with central management and distributed operation features has been implemented

  10. Selection of nuclear power information database management system

    International Nuclear Information System (INIS)

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    Given the present state of database technology, an important task in building the Chinese nuclear power information database (NPIDB) for the nuclear industry efficiently and from a high starting point is to select a proper database management system (DBMS), which is the key to building the database successfully. This article therefore explains how to build a practical nuclear power information database: the functions of different database management systems, the reasons for selecting a relational database management system (RDBMS), the principles of selecting an RDBMS, the recommendation of the ORACLE management system as the software with which to build the database, and so on

  11. Database specification for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB)

    Energy Technology Data Exchange (ETDEWEB)

    Faby, E.Z.; Fluker, J.; Hancock, B.R.; Grubb, J.W.; Russell, D.L. [Univ. of Tennessee, Knoxville, TN (United States); Loftis, J.P.; Shipe, P.C.; Truett, L.F. [Oak Ridge National Lab., TN (United States)

    1994-03-01

    This Database Specification for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB) describes the database organization and storage allocation, provides the detailed data model of the logical and physical designs, and provides information for the construction of parts of the database such as tables, data elements, and associated dictionaries and diagrams.

  12. The design of distributed database system for HIRFL

    International Nuclear Information System (INIS)

    Wang Hong; Huang Xinmin

    2004-01-01

    This paper focuses on the distributed database system used in the HIRFL distributed control system. The database is built with SQL Server 2000, and the application system adopts the client/server model. Visual C++ is used to develop the applications, which access the database through ODBC. (authors)

  13. Portable database driven control system for SPEAR

    Energy Technology Data Exchange (ETDEWEB)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-04-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system. 10 refs., 1 fig.
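    The idea of a design file readable by both people and programs can be sketched as follows; the file layout and element names here are invented for illustration, not SPEAR's actual syntax.

```python
# A human-readable design file drives the database structure: comments and
# whitespace are for people, the columns are parsed by the program.
design_text = """\
# name  type        position_m  length_m
QF1     quadrupole  0.0         0.5
BD1     bend        1.2         2.0
"""

def load_design(text):
    """Parse the design file into a dictionary keyed by element name."""
    db = {}
    for line in text.splitlines():
        line = line.split("#")[0].strip()   # drop comments and blank lines
        if not line:
            continue
        name, kind, pos, length = line.split()
        db[name] = {"type": kind, "position_m": float(pos), "length_m": float(length)}
    return db

db = load_design(design_text)
print(db["BD1"]["length_m"])  # 2.0
```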

  14. Portable database driven control system for SPEAR

    International Nuclear Information System (INIS)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-04-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system. 10 refs., 1 fig

  15. The ALADDIN atomic physics database system

    International Nuclear Information System (INIS)

    Hulse, R.A.

    1990-01-01

    ALADDIN is an atomic physics database system which has been developed in order to provide a broadly-based standard medium for the exchange and management of atomic data. ALADDIN consists of a data format definition together with supporting software for interactive searches as well as for access to the data by plasma modeling and other codes. The ALADDIN system is designed to offer maximum flexibility in the choice of data representations and labeling schemes, so as to support a wide range of atomic physics data types and allow natural evolution and modification of the database as needs change. Associated dictionary files are included in the ALADDIN system for data documentation. The importance of supporting the widest possible user community was also central to the ALADDIN design, leading to the use of straightforward text files with concatenated data entries for the file structure, and the adoption of strict FORTRAN 77 code for the supporting software. This allows ready access to the ALADDIN system on the widest range of scientific computers, and easy interfacing with FORTRAN modeling codes, user-developed atomic physics codes and databases, etc. The supporting software consists of the ALADDIN interactive searching and data display code, together with the ALPACK subroutine package which provides ALADDIN datafile searching and data retrieval capabilities to users' codes

  16. Development of environment radiation database management system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun [Daeduk College, Taejon (Korea, Republic of)

    1999-03-15

    In this development, we constructed a database for efficient processing and operation of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on WWW service policies on the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  17. Development of environment radiation database management system

    International Nuclear Information System (INIS)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun

    1999-03-01

    In this development, we constructed a database for efficient processing and operation of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on WWW service policies on the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation

  18. HATCHES - a thermodynamic database and management system

    International Nuclear Information System (INIS)

    Cross, J.E.; Ewart, F.T.

    1990-03-01

    The Nirex Safety Assessment Research Programme has been compiling the thermodynamic data necessary to allow simulations of the aqueous behaviour of the elements important to radioactive waste disposal to be made. These data have been obtained from the literature, when available, and validated for the conditions of interest by experiment. In order to maintain these data in an accessible form and to satisfy quality assurance on all data used for assessments, a database has been constructed which resides on a personal computer operating under MS-DOS using the Ashton-Tate dBase III program. This database contains all the input data fields required by the PHREEQE program and, in addition, a body of text which describes the source of the data and the derivation of the PHREEQE input parameters from the source data. The HATCHES system consists of this database, a suite of programs to facilitate the searching and listing of data and a further suite of programs to convert the dBase III files to PHREEQE database format. (Author)

  19. Accessing the public MIMIC-II intensive care relational database for clinical research.

    Science.gov (United States)

    Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G

    2013-01-10

    The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
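    A query in the spirit of the examples mentioned above can be sketched with Python's built-in sqlite3 module; the table and column names are illustrative stand-ins, not the actual MIMIC-II schema (MIMIC-II itself runs on a full relational server accessed via QueryBuilder or the VM).

```python
import sqlite3

# Build a tiny in-memory stand-in table of ICU stays (invented schema/data).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE icustay (subject_id INTEGER, los_hours REAL)")
con.executemany("INSERT INTO icustay VALUES (?, ?)",
                [(1, 30.5), (2, 75.0), (3, 12.25)])

# Select patients whose length of stay exceeds 24 hours.
rows = con.execute(
    "SELECT subject_id FROM icustay WHERE los_hours > ? ORDER BY subject_id",
    (24,)).fetchall()
print(rows)  # [(1,), (2,)]
```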

  20. Nuclear Criticality Information System. Database examples

    Energy Technology Data Exchange (ETDEWEB)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer.

  1. Nuclear Criticality Information System. Database examples

    International Nuclear Information System (INIS)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer

  2. JOYO MK-II core characteristics database. Update to JFS-3-J3.2R

    International Nuclear Information System (INIS)

    Ohkawachi, Yasushi; Maeda, Shigetaka; Sekine, Takashi

    2003-04-01

    The 'JOYO' MK-II core characteristics database was compiled and published in 1998. Comments and requests from many users led to the creation of a revised edition in 2001. The revisions included changes to the MAGI calculation code system to use the 70-group JFS-3-J3.2 constant set processed from the JENDL-3.2 library. After that edition was published, however, errors were found in the process of making the group constant set JFS-3-J3.2, and it was revised as JFS-3-J3.2R. The group constant set in this database was therefore updated to JFS-3-J3.2R. The MK-II core management data and core characteristics data were recorded on CD-ROM for user convenience. The structure of the database is the same as in the first edition. The 'Configuration Data' include the core arrangement and refueling record for each operational cycle. The 'Subassembly Library Data' include the atomic number density, neutron fluence, burn-up and integral power of 362 driver fuel subassemblies and 69 irradiation test subassemblies. The 'Output Data' contain the calculated neutron flux, gamma flux, power density, linear heat rate, and coolant and fuel temperature distribution of all the fuel subassemblies at the beginning and end of each operational cycle. The 'Core Characteristics Data' include the measured excess reactivity, control rod worth calibration curve, and reactivity coefficients of temperature, power and burn-up. As a result of updating the group constant set, the calculated excess reactivity decreased by about 0.15Δk/kk', and the effects on other core characteristics were negligible. (author)

  3. Integrated spent nuclear fuel database system

    International Nuclear Information System (INIS)

    Henline, S.P.; Klingler, K.G.; Schierman, B.H.

    1994-01-01

    The Distributed Information Systems software Unit at the Idaho National Engineering Laboratory has designed and developed an Integrated Spent Nuclear Fuel Database System (ISNFDS), which maintains a computerized inventory of all US Department of Energy (DOE) spent nuclear fuel (SNF). Commercial SNF is not included in the ISNFDS unless it is owned or stored by DOE. The ISNFDS is an integrated, single data source containing accurate, traceable, and consistent data and provides extensive data for each fuel, extensive facility data for every facility, and numerous data reports and queries

  4. LHCb Conditions database operation assistance systems

    International Nuclear Information System (INIS)

    Clemencic, M; Shapoval, I; Cattaneo, M; Degaudenzi, H; Santinelli, R

    2012-01-01

    The Conditions Database (CondDB) of the LHCb experiment provides versioned, time dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger (HLT), reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues. The first system is a CondDB state tracking extension to the Oracle 3D Streams replication technology, to trap cases when the CondDB replication was corrupted. Second, an automated distribution system for the SQLite-based CondDB, providing also smart backup and checkout mechanisms for the CondDB managers and LHCb users respectively. And, finally, a system to verify and monitor the internal (CondDB self-consistency) and external (LHCb physics software vs. CondDB) compatibility. The former two systems are used in production in the LHCb experiment and have achieved the desired goal of higher flexibility and robustness for the management and operation of the CondDB. The latter one has been fully designed and is currently in the implementation stage.

  5. Databases and information systems: Applications in biogeography

    International Nuclear Information System (INIS)

    Escalante E, Tania; Llorente B, Jorge; Espinoza O, David N; Soberon M, Jorge

    2000-01-01

    Some aspects of the new instrumentalization and methodological elements that make up information systems in biodiversity (ISB) are described. The use of accurately georeferenced data draws on a broad range of available sources: natural history collections and scientific literature require the use of databases and geographic information systems (GIS). The conceptualization of ISB and GIS, based on the use of extensive databases, has implied detailed modeling and the construction of authoritative archives: exhaustive catalogues of nomenclature and synonymies, complete bibliographic lists, lists of proposed names, and historical-geographic gazetteers with localities and their synonyms, united under a global positioning system which produces a geospheric conception of the earth and its biota. Certain difficulties in the development of the system and the construction of the biological databases, such as quality control of data, are explained. The use of such systems is basic in order to respond to many questions at the frontier of current studies of biodiversity and conservation. In particular, some applications in biogeography and their importance for modeling distributions, for identifying and contrasting areas of endemism and biological richness for conservation, and their use as tools in what we identify as predictive and experimental faunistics are detailed. Lastly, the relevance of the process at national and regional levels is emphasized

  6. OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM

    Science.gov (United States)

    Ackerson, T.

    1994-01-01

    The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser

  7. Establishment of Database System for Radiation Oncology

    International Nuclear Information System (INIS)

    Kim, Dae Sup; Lee, Chang Ju; Yoo, Soon Mi; Kim, Jong Min; Lee, Woo Seok; Kang, Tae Young; Back, Geum Mun; Hong, Dong Ki; Kwon, Kyung Tae

    2008-01-01

    The aim was to improve operational efficiency and lay a foundation for the development of new radiotherapy treatments by building a database that organizes and indexes radiotherapy-related records for easy user access. In this study, Microsoft Office Access was used to operate the database. Work-related data were organized into business logs, maintenance expenditure, and stock management of accessories, covering both administrative affairs and machinery management. Education and research data were classified by type into departmental training material, user manuals, and related theses. Data entry used input forms organized by subject, and stored information could be inspected through generated reports. Machine failure counts and the corresponding repair hours recorded in the maintenance-expenditure records from January 2008 to April 2009 were analyzed, comparing initial system usage with usage one year later. The radiation oncology database system was completed by separating work-related and research-related categories; data are arranged and collected by subject and class, and the required data can be retrieved by searching the descriptions under each category. The analysis of failure counts and repair hours over January 2008 to April 2009 showed a 32.3% reduction in total average repair time. By classifying and indexing current and past data by subject through the database system, information becomes easily accessible, improving operational efficiency; furthermore, the system can support improved work processes by providing, in real time, the varied information required for new radiotherapy treatments.
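
The repair-hours comparison described above amounts to a simple before/after calculation. As a hedged illustration (the hour figures below are invented; the study itself reports a 32.3% reduction):

```python
# Sketch of the repair-log analysis described: average repair hours per
# failure before and after the database was introduced, and the percent
# reduction. The numbers are invented, not the study's data.
before = [5.0, 3.5, 6.0, 4.5]   # hours per failure, first period
after = [3.0, 2.5, 4.0]         # hours per failure, second period

def mean(xs):
    return sum(xs) / len(xs)

reduction = 100.0 * (mean(before) - mean(after)) / mean(before)
print(f"{reduction:.1f}% less repair time on average")
```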

  8. Solvent Handbook Database System user's manual

    International Nuclear Information System (INIS)

    1993-03-01

    Industrial solvents and cleaners are used in maintenance facilities to remove wax, grease, oil, carbon, machining fluids, solder fluxes, mold release, and various other contaminants from parts, and to prepare the surface of various metals. However, because of growing environmental and worker-safety concerns, government regulations have already excluded the use of some chemicals and have restricted the use of halogenated hydrocarbons because they affect the ozone layer and may cause cancer. The Solvent Handbook Database System lets you view information on solvents and cleaners, including test results on cleaning performance, air emissions, recycling and recovery, corrosion, and non-metals compatibility. Company and product safety information is also available

  9. Design and implementation of typical target image database system

    International Nuclear Information System (INIS)

    Qin Kai; Zhao Yingjun

    2010-01-01

    It is necessary to provide essential background data and thematic data in a timely manner during image processing and application. In fact, application is a procedure of integrating and analyzing different kinds of data. In this paper, the authors describe an image database system that classifies, stores, manages, and analyzes databases of different types, such as image, vector, spatial, and spatial target characteristics databases, and present its design and structure. (authors)

  10. ASEAN Mineral Database and Information System (AMDIS)

    Science.gov (United States)

    Okubo, Y.; Ohno, T.; Bandibas, J. C.; Wakita, K.; Oki, Y.; Takahashi, Y.

    2014-12-01

    AMDIS has been operating officially since the Fourth ASEAN Ministerial Meeting on Minerals on 28 November 2013. In cooperation with the Geological Survey of Japan, the web-based GIS was developed using Free and Open Source Software (FOSS) and the Open Geospatial Consortium (OGC) standards. The system is composed of the local databases and the centralized GIS. The local databases created and updated using the centralized GIS are accessible from the portal site. The system introduces distinct advantages over traditional GIS: a global reach, a large number of users, better cross-platform capability, no charge for users or providers, ease of use, and unified updates. By raising the transparency of mineral information to mining companies and to the public, AMDIS shows that mineral resources are abundant throughout the ASEAN region; however, there are many data gaps. We understand that such problems occur because of insufficient governance of mineral resources. Mineral governance, as we use the term, is a concept that enforces and maximizes the capacity and systems of the government institutions that manage the minerals sector. The elements of mineral governance include a) strengthening of information infrastructure facilities, b) technological and legal capacities of state-owned mining companies to fully engage with mining sponsors, c) government-led management of mining projects by supporting the project implementation units, d) government capacity in mineral management such as the control and monitoring of mining operations, and e) facilitation of regional and local development plans and their implementation with the private sector.

  11. Dynamic graph system for a semantic database

    Science.gov (United States)

    Mizell, David

    2015-01-27

    A method and system in a computer system for dynamically providing a graphical representation of a data store of entries via a matrix interface is disclosed. A dynamic graph system provides a matrix interface that exposes to an application program a graphical representation of data stored in a data store such as a semantic database storing triples. To the application program, the matrix interface represents the graph as a sparse adjacency matrix that is stored in compressed form. Each entry of the data store is considered to represent a link between nodes of the graph. Each entry has a first field and a second field identifying the nodes connected by the link and a third field with a value for the link that connects the identified nodes. The first, second, and third fields represent the rows, columns, and elements of the adjacency matrix.
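
The mapping the abstract describes, triples exposed as a sparse adjacency matrix, can be sketched in a few lines. This is an illustrative dictionary-of-keys implementation with invented node names, not the patented system's compressed representation:

```python
# Toy sketch: expose (subject, object, value) triples as a sparse
# adjacency matrix, with rows = subjects, columns = objects, and the
# matrix element holding the link value. Names are invented.
class TripleMatrix:
    def __init__(self):
        self._ids = {}      # node name -> row/column index
        self._cells = {}    # (row, col) -> link value (dict-of-keys)

    def _index(self, node):
        return self._ids.setdefault(node, len(self._ids))

    def add_triple(self, subject, obj, value):
        self._cells[(self._index(subject), self._index(obj))] = value

    def get(self, subject, obj):
        return self._cells.get((self._ids[subject], self._ids[obj]))

m = TripleMatrix()
m.add_triple("alice", "bob", "knows")
m.add_triple("bob", "carol", "manages")
print(m.get("alice", "bob"))   # knows
```

Only occupied cells are stored, which is what makes the adjacency matrix practical for large, sparse semantic graphs.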

  12. Database system selection for marketing strategies support in information systems

    Directory of Open Access Journals (Sweden)

    František Dařena

    2007-01-01

    Full Text Available In today’s dynamically changing environment marketing has a significant role. Creating successful marketing strategies requires a large amount of high-quality information of various kinds and data types. A powerful database management system is a necessary condition for supporting the creation of marketing strategies. The paper briefly describes the field of marketing strategies and specifies the features that database systems should provide in connection with supporting these strategies. Major commercial (Oracle, DB2, MS SQL, Sybase) and open-source (PostgreSQL, MySQL, Firebird) databases are then examined from the point of view of accordance with these characteristics, and a comparison is made. The results are useful for making the decision before acquiring a database system during the specification of an information system’s hardware architecture.

  13. Database interfaces on NASA's heterogeneous distributed database system

    Science.gov (United States)

    Huang, Shou-Hsuan Stephen

    1989-01-01

    The syntax and semantics of all commands used in the template are described. Template builders should consult this document for proper commands in the template. Previous documents (semiannual reports) described other aspects of this project. Appendix 1 contains all substituting commands used in the system. Appendix 2 includes all repeating commands. Appendix 3 is a collection of DEFINE templates from eight different DBMSs.

  14. An inductive database system based on virtual mining views

    NARCIS (Netherlands)

    Blockeel, H.; Calders, T.G.K.; Fromont, É.; Goethals, B.; Prado, A.; Robardet, C.

    2012-01-01

    Inductive databases integrate database querying with database mining. In this article, we present an inductive database system that does not rely on a new data mining query language, but on plain SQL. We propose an intuitive and elegant framework based on virtual mining views, which are relational
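
The core idea, mining results exposed as ordinary relations queryable in plain SQL, can be illustrated with a toy view. The table, view name, and support threshold below are invented, not the authors' virtual mining view schema:

```python
import sqlite3

# Illustrative only: a mining result (frequent items with support >= 2)
# exposed as a view that plain SQL can query, in the spirit of virtual
# mining views. Table, view, and data are invented.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE transactions (tid INTEGER, item TEXT);
    INSERT INTO transactions VALUES
        (1, 'bread'), (1, 'milk'),
        (2, 'bread'), (2, 'beer'),
        (3, 'milk');
    CREATE VIEW frequent_items AS
        SELECT item, COUNT(*) AS support
        FROM transactions
        GROUP BY item
        HAVING COUNT(*) >= 2;
""")
rows = con.execute(
    "SELECT item, support FROM frequent_items ORDER BY item"
).fetchall()
print(rows)   # [('bread', 2), ('milk', 2)]
```

In the actual framework the views are virtual: the mining step runs only when such a view is queried, but to the SQL user it looks like any other relation.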

  15. An Introduction to the DB Relational Database Management System

    OpenAIRE

    Ward, J.R.

    1982-01-01

    This paper is an introductory guide to using the Db programs to maintain and query a relational database on the UNIX operating system. In the past decade, increasing interest has been shown in the development of relational database management systems. Db is an attempt to incorporate a flexible and powerful relational database system within the user environment presented by the UNIX operating system. The family of Db programs is useful for maintaining a database of information that i...

  16. Towards a Component Based Model for Database Systems

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2004-02-01

    Full Text Available Due to their effectiveness in the design and development of software applications and due to their recognized advantages in terms of reusability, Component-Based Software Engineering (CBSE) concepts have been arousing a great deal of interest in recent years. This paper presents and extends a component-based approach to object-oriented database systems (OODB) introduced by us in [1] and [2]. Components are proposed as a new abstraction level for database systems: logical partitions of the schema. In this context, the scope is introduced as an escalated property for transactions. Components are studied from the integrity, consistency, and concurrency control perspective. The main benefits of our proposed component model for OODB are the reusability of the database design, including the access statistics required for a proper query optimization, and a smooth information exchange. The integration of crosscutting concerns into the component database model using aspect-oriented techniques is also discussed. One of the main goals is to define a method for the assessment of component composition capabilities. These capabilities are restricted by the component’s interface and measured in terms of adaptability, degree of compose-ability and acceptability level. The above-mentioned metrics are extended from database components to generic software components. This paper extends and consolidates into one common view the ideas previously presented by us in [1, 2, 3]. [1] Octavian Paul Rotaru, Marian Dobre, Component Aspects in Object Oriented Databases, Proceedings of the International Conference on Software Engineering Research and Practice (SERP’04), Volume II, ISBN 1-932415-29-7, pages 719-725, Las Vegas, NV, USA, June 2004. [2] Octavian Paul Rotaru, Marian Dobre, Mircea Petrescu, Integrity and Consistency Aspects in Component-Oriented Databases, Proceedings of the International Symposium on Innovation in Information and Communication Technology (ISIICT

  17. Development of a personalized training system using the Lung Image Database Consortium and Image Database resource Initiative Database.

    Science.gov (United States)

    Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong

    2014-12-01

    The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability of dynamically selecting suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database. It provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool that enables trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful for enhancing the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
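
The case-selection idea can be sketched with a toy neighborhood predictor. This is not the paper's Content-Boosted Collaborative Filtering algorithm, only a minimal collaborative-filtering illustration with invented trainees, cases, and difficulty ratings:

```python
# Toy collaborative filtering: predict how difficult an unseen case will
# be for a trainee from trainees with similar past difficulty ratings.
# All names and ratings (1..5) are invented for illustration.
ratings = {
    "t1": {"c1": 4, "c2": 5, "c3": 3},
    "t2": {"c1": 4, "c2": 4, "c3": 3, "c4": 2},
    "t3": {"c1": 1, "c2": 2, "c4": 5},
}

def similarity(a, b):
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    # Inverse mean absolute difference on co-rated cases.
    return 1.0 / (1.0 + sum(abs(a[c] - b[c]) for c in shared) / len(shared))

def predict(trainee, case):
    num = den = 0.0
    for other, r in ratings.items():
        if other != trainee and case in r:
            w = similarity(ratings[trainee], r)
            num += w * r[case]
            den += w
    return num / den if den else None

print(predict("t1", "c4"))   # 2.75: t1 resembles t2, who found c4 easy
```

The content-boosted variant in the paper additionally uses case characteristics to fill in ratings where trainee overlap is sparse.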

  18. Implementing database system for LHCb publications page

    CERN Document Server

    Abdullayev, Fakhriddin

    2017-01-01

    The LHCb is one of the main detectors at the Large Hadron Collider, where physicists and scientists work together on high precision measurements of matter-antimatter asymmetries and searches for rare and forbidden decays, with the aim of discovering new and unexpected forces. The work consists not only of analyzing data collected from experiments but also of publishing the results of those analyses. The LHCb publications are gathered on the LHCb publications page to maximize their availability both to LHCb members and to the high energy physics community. In this project a new database system was implemented for the LHCb publications page. This will help to improve access to research papers for scientists and provide better integration with the current CERN library website and other services.

  19. A Support Database System for Integrated System Health Management (ISHM)

    Science.gov (United States)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between
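
A system hierarchy model like the one described is commonly stored as a self-referencing table. The schema and component names below are assumptions for illustration, not the actual HADS structure:

```python
import sqlite3

# Sketch of one way to store a system hierarchy model: a self-referencing
# component table walked with a recursive query. Table and component
# names are invented, not the real HADS schema.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE component (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        parent_id INTEGER REFERENCES component(id)
    );
    INSERT INTO component VALUES
        (1, 'test stand', NULL),
        (2, 'feed line', 1),
        (3, 'pressure sensor', 2);
""")
# Recursive CTE walks from a leaf sensor up to the root of the hierarchy.
path = [r[0] for r in con.execute("""
    WITH RECURSIVE chain(name, parent_id) AS (
        SELECT name, parent_id FROM component WHERE name = 'pressure sensor'
        UNION ALL
        SELECT c.name, c.parent_id
        FROM component c JOIN chain ON c.id = chain.parent_id
    )
    SELECT name FROM chain
""")]
print(path)   # ['pressure sensor', 'feed line', 'test stand']
```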

  20. The Network Configuration of an Object Relational Database Management System

    Science.gov (United States)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) require developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, work is split between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  1. Integrated Procurement Management System, Version II

    Science.gov (United States)

    Collier, L. J.

    1985-01-01

    Integrated Procurement Management System, Version II (IPMS II) is an online/batch system for collecting, developing, managing, and disseminating procurement-related data at NASA Johnson Space Center. Portions of IPMS II are adaptable to other procurement situations.

  2. Design of Database System of HIRFL-CSR Beam Line

    International Nuclear Information System (INIS)

    Li Peng; Li Ke; Yin Dayu; Yuan Youjin; Gou Shizhe

    2009-01-01

    This paper introduces the database design and optimization for the power supply system of the Lanzhou Heavy Ion Accelerator CSR (HIRFL-CSR) beam line. Based on the HIRFL-CSR main Oracle database system, an interface was designed to read parameters of the power supplies while achieving real-time monitoring. A new database system to store the history data of the power supplies was established at the same time, and it realized data exchange between the Oracle database system and an Access database system. Meanwhile, an interface was designed for conveniently printing and querying parameters. (authors)
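
A history store for power-supply parameters of the kind described can be sketched as a timestamped log table. The supply names and readings below are invented; the real system exchanges such data between Oracle and Access:

```python
import sqlite3

# Toy power-supply history store: timestamped readings per supply,
# queryable for later review. Names and values are invented.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE ps_history (
        supply TEXT, read_at TEXT, current_a REAL
    )
""")
readings = [
    ("dipole-1", "2009-05-01T10:00:00", 120.5),
    ("dipole-1", "2009-05-01T10:01:00", 121.0),
    ("quad-3",   "2009-05-01T10:00:30", 45.2),
]
con.executemany("INSERT INTO ps_history VALUES (?, ?, ?)", readings)
# Latest reading per supply; SQLite resolves the bare current_a column
# from the row that carries MAX(read_at).
latest = con.execute("""
    SELECT supply, MAX(read_at), current_a
    FROM ps_history GROUP BY supply ORDER BY supply
""").fetchall()
print(latest)
```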

  3. Characterization analysis database system (CADS). A system overview

    International Nuclear Information System (INIS)

    1997-12-01

    The CADS database is a standardized, quality-assured, and configuration-controlled data management system developed to assist in the task of characterizing the DOE surplus HEU material. Characterization of the surplus HEU inventory includes identifying the specific material; gathering existing data about the inventory; defining the processing steps that may be necessary to prepare the material for transfer to a blending site; and, ultimately, developing a range of preliminary cost estimates for those processing steps. Characterization focuses on producing commercial reactor fuel as the final step in material disposition. Based on the project analysis results, the final determination will be made as to the viability of the disposition path for each particular item of HEU. The purpose of this document is to provide an informational overview of the CADS database, its evolution, and its current capabilities. This document describes the purpose of CADS, the system requirements it fulfills, the database structure, and the operational guidelines of the system

  4. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  5. Audit Database and Information Tracking System

    Data.gov (United States)

    Social Security Administration — This database contains information about the Social Security Administration's audits regarding SSA agency performance and compliance. These audits can be requested...

  6. Minority Serving Institutions Reporting System Database

    Data.gov (United States)

    Social Security Administration — The database will be used to track SSA's contributions to Minority Serving Institutions such as Historically Black Colleges and Universities (HBCU), Tribal Colleges...

  7. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in an Oracle Database Management System environment. The application has two parts: one is the database schema and its content, and the other is a C# application. The schema stores and administers the tasks and the…

  8. The UCSD HIRES/Keck I Damped Lyα Abundance Database. II. The Implications

    Science.gov (United States)

    Prochaska, Jason X.; Wolfe, Arthur M.

    2002-02-01

    We present a comprehensive analysis of the damped Lyα (DLA) abundance database presented in the first paper of this series. This database provides a homogeneous set of abundance measurements for many elements including Si, Cr, Ni, Zn, Fe, Al, S, Co, O, and Ar from 38 DLA systems with zabs > 1.5. With little exception, these DLA systems exhibit very similar relative abundances. There is no significant correlation in X/Fe with [Fe/H] metallicity, and the dispersion in X/Fe is small at all metallicity. We search the database for trends indicative of dust depletion and in a few cases find strong evidence. Specifically, we identify a correlation between [Si/Ti] and [Zn/Fe] which is unambiguous evidence for depletion. Following Hou and colleagues, we present [X/Si] abundances against [Si/H]+logN(HI) and note trends of decreasing X/Si with increasing [Si/H]+logN(HI) which argue for dust depletion. Similarly, comparisons of [Si/Fe] and [Si/Cr] against [Si/H] indicate significant depletion at [Si/H] > -1 but suggest essentially dust-free damped systems at low [Si/H]. We find [Si/Fe] --> 0.25 dex as [Zn/Fe] --> 0 and that the [Si/Fe] values exhibit a plateau of ~0.3 dex at low [Si/H]. Our metallicity results are in good agreement with our previous work, but we emphasize two differences: (1) the unweighted and N(H I)-weighted [Fe/H] mean metallicities now have similar values at all epochs except z > 3.5, where small number statistics dominate the N(H I)-weighted mean; and (2) there is no evolution in the mean [Fe/H] metallicity from z = 1.7 to 3.5 but possibly a marked drop at higher redshift. We conclude with a general discussion on the physical nature of the DLA systems. We stress the uniformity of the DLA chemical abundances, which indicates that the protogalaxies identified with DLA systems have very similar enrichment histories, i.e., a nearly constant relative contribution from Type Ia and Type II supernovae. The DLA systems also show constant relative abundances within a given system, which places strict constraints on the mixing timescales

  9. A web-based data visualization tool for the MIMIC-II database.

    Science.gov (United States)

    Lee, Joon; Ribey, Evan; Wallace, James R

    2016-02-04

    Although MIMIC-II, a public intensive care database, has been recognized as an invaluable resource for many medical researchers worldwide, becoming a proficient MIMIC-II researcher requires knowledge of SQL programming and an understanding of the MIMIC-II database schema. These requirements are especially challenging for health researchers and clinicians who may have limited computer proficiency. In order to overcome this challenge, our objective was to create an interactive, web-based MIMIC-II data visualization tool that first-time MIMIC-II users can easily use to explore the database. The tool offers two main features: Explore and Compare. The Explore feature enables the user to select a patient cohort within MIMIC-II and visualize the distributions of various administrative, demographic, and clinical variables within the selected cohort. The Compare feature enables the user to select two patient cohorts and visually compare them with respect to a variety of variables. The tool is also helpful to experienced MIMIC-II researchers, who can use it to substantially accelerate the cumbersome and time-consuming steps of writing SQL queries and manually visualizing extracted data. Any interested researcher can use the MIMIC-II data visualization tool for free to quickly and conveniently conduct a preliminary investigation of MIMIC-II with a few mouse clicks. Researchers can also use the tool to learn the characteristics of the MIMIC-II patients. Since it is still impossible to conduct multivariable regression inside the tool, future work includes adding analytics capabilities. Also, the next version of the tool will aim to utilize MIMIC-III, which contains more data.
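
The Explore feature's cohort-then-distribution workflow can be sketched in a few lines. The patient records below are fabricated; real MIMIC-II access requires a data use agreement:

```python
from collections import Counter

# Toy version of the Explore idea: pick a cohort with a filter, then
# summarize a variable's distribution within it. Records are fabricated.
patients = [
    {"age": 71, "sex": "F", "unit": "MICU"},
    {"age": 64, "sex": "M", "unit": "MICU"},
    {"age": 58, "sex": "M", "unit": "CCU"},
    {"age": 80, "sex": "F", "unit": "MICU"},
]

def explore(cohort_filter, variable):
    cohort = [p for p in patients if cohort_filter(p)]
    return Counter(p[variable] for p in cohort)

# Distribution of sex within the MICU cohort.
print(explore(lambda p: p["unit"] == "MICU", "sex"))
```

The Compare feature amounts to running `explore` on two cohorts and plotting the two distributions side by side.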

  10. An Adaptive Database Intrusion Detection System

    Science.gov (United States)

    Barrios, Rita M.

    2011-01-01

    Intrusion detection is difficult to accomplish when attempting to employ current methodologies when considering the database and the authorized entity. It is a common understanding that current methodologies focus on the network architecture rather than the database, which is not an adequate solution when considering the insider threat. Recent…

  11. Open-access MIMIC-II database for intensive care research.

    Science.gov (United States)

    Lee, Joon; Scott, Daniel J; Villarroel, Mauricio; Clifford, Gari D; Saeed, Mohammed; Mark, Roger G

    2011-01-01

    The critical state of intensive care unit (ICU) patients demands close monitoring, and as a result a large volume of multi-parameter data is collected continuously. This represents a unique opportunity for researchers interested in clinical data mining. We sought to foster a more transparent and efficient intensive care research community by building a publicly available ICU database, namely Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II). The data harnessed in MIMIC-II were collected from the ICUs of Beth Israel Deaconess Medical Center from 2001 to 2008 and represent 26,870 adult hospital admissions (version 2.6). MIMIC-II consists of two major components: clinical data and physiological waveforms. The clinical data, which include patient demographics, intravenous medication drip rates, and laboratory test results, were organized into a relational database. The physiological waveforms, including 125 Hz signals recorded at bedside and corresponding vital signs, were stored in an open-source format. MIMIC-II data were also deidentified in order to remove protected health information. Any interested researcher can gain access to MIMIC-II free of charge after signing a data use agreement and completing human subjects training. MIMIC-II can support a wide variety of research studies, ranging from the development of clinical decision support algorithms to retrospective clinical studies. We anticipate that MIMIC-II will be an invaluable resource for intensive care research by stimulating fair comparisons among different studies.

  12. Selecting a Relational Database Management System for Library Automation Systems.

    Science.gov (United States)

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  13. Present status of the TJ-II remote participation system

    International Nuclear Information System (INIS)

    Vega, J.; Sanchez, E.; Lopez, A.; Portas, A.; Ochando, M.; Ascasibar, E.; Mollinedo, A.; Munoz, J.; Sanchez, A.; Ruiz, M.; Barrera, E.; Lopez, S.; Castro, R.; Lopez, D.

    2005-01-01

    The TJ-II remote participation system (RPS) was designed to extend to the Internet the working capabilities provided in the TJ-II local environment, i.e., tracking the TJ-II operation, monitoring/programming data acquisition and control systems, and accessing databases. The TJ-II RPS was based on web and Java technologies because of their open character, security properties and technological maturity. A web server acts as a communication front-end between remote participants and local TJ-II elements. On the server side, web services are provided by means of resources supplied by JSP pages. The client part makes use of web browsers and ad hoc Java applications. The operation requires the use of a distributed authentication and authorization system. This development employs the PAPI System. At present, approximately 1000 digitisation channels can be managed from the TJ-II RPS. Furthermore, processing software based on a 4GL language (LabView) can be downloaded to multiprocessor data acquisition systems. Also, 15 diagnostic control systems, databases and the operation logbook are available from the RPS. The system even allows the physicist in charge of operation to work from a remote location. Four Spanish universities make use of the TJ-II remote participation system capabilities for joint collaborations: these are the Universidad Politecnica de Madrid (UPM), Universidad Nacional de Educacion a Distancia (UNED), Universidad Complutense de Madrid (UCM) and Universidad Politecnica de Cataluna (UPC)

  14. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  15. Revisiting Reuse in Main Memory Database Systems

    OpenAIRE

    Dursun, Kayhan; Binnig, Carsten; Cetintemel, Ugur; Kraska, Tim

    2016-01-01

    Reusing intermediates in databases to speed-up analytical query processing has been studied in the past. Existing solutions typically require intermediate results of individual operators to be materialized into temporary tables to be considered for reuse in subsequent queries. However, these approaches are fundamentally ill-suited for use in modern main memory databases. The reason is that modern main memory DBMSs are typically limited by the bandwidth of the memory bus, thus query execution ...

  16. Analysis of Cloud-Based Database Systems

    Science.gov (United States)

    2015-06-01

    ... deploying the VM, we installed SQL Server 2014 relational database management software (RDBMS) and restored a copy of the PYTHON database onto the server ... management views within SQL Server, we retrieved lists of the most commonly executed queries, the percentage of reads versus writes, as well as ... Monitor. This gave us data regarding resource utilization and queueing. The second tool we used was the SQL Server Profiler provided by Microsoft ...

  17. An Integrated Enterprise Accelerator Database for the SLC Control System

    International Nuclear Information System (INIS)

    2002-01-01

    Since its inception in the early 1980s, the SLC Control System has been driven by a highly structured memory-resident real-time database. While efficient, its rigid structure and file-based sources make it difficult to maintain and to extract relevant information. The goal of transforming the sources for this database into relational form is to enable it to be part of a Control System Enterprise Database: an integrated central repository for SLC accelerator device and Control System data with links to other associated databases. We have taken the concepts developed for the NLC Enterprise Database and used them to create and load a relational model of the online SLC Control System database. This database contains the data and structure needed to query and report on beamline devices, their associations and parameters. In the future this will be extended to allow generation of EPICS and SLC database files, setup of applications, and links to other databases such as accelerator maintenance, archive data, financial and personnel records, cabling information, documentation, etc. The database is implemented using Oracle 8i. In the short term it will be updated daily in batch from the online SLC database. In the longer term, it will serve as the primary source for Control System static data, an R and D platform for the NLC, and contribute to SLC Control System operations.
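
    The kind of querying and reporting the record describes becomes straightforward once device data are in relational form. The sketch below is purely illustrative (the table and device names are invented, not the actual SLC schema): a device table plus an association table, joined to report devices per beamline.

    ```python
    import sqlite3

    # Hypothetical miniature of a relational device model: a "device" table
    # plus an association table linking devices to beamlines. All table,
    # column, and device names here are illustrative, not the SLC schema.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.executescript("""
    CREATE TABLE device (id INTEGER PRIMARY KEY, name TEXT, kind TEXT);
    CREATE TABLE beamline_device (beamline TEXT, device_id INTEGER REFERENCES device(id));
    """)
    cur.executemany("INSERT INTO device VALUES (?, ?, ?)",
                    [(1, "QUAD:LI02:101", "quadrupole"),
                     (2, "BPMS:LI02:201", "beam-position-monitor")])
    cur.executemany("INSERT INTO beamline_device VALUES (?, ?)",
                    [("LI02", 1), ("LI02", 2)])

    # Reporting on devices and their associations is a simple join.
    rows = cur.execute("""
        SELECT b.beamline, d.name, d.kind
        FROM beamline_device b JOIN device d ON d.id = b.device_id
        ORDER BY d.name
    """).fetchall()
    for beamline, name, kind in rows:
        print(beamline, name, kind)
    ```
    
    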

  18. Report of the SRC working party on databases and database management systems

    International Nuclear Information System (INIS)

    Crennell, K.M.

    1980-10-01

    An SRC working party, set up to consider the subject of support for databases within the SRC, was asked to identify interested individuals and user communities, establish which features of database management systems they felt were desirable, arrange demonstrations of possible systems, and then make recommendations for systems, funding and likely manpower requirements. This report describes the activities and lists the recommendations of the working party, and contains a list of databases maintained or proposed by those who replied to a questionnaire. (author)

  19. Exploration of a Vision for Actor Database Systems

    DEFF Research Database (Denmark)

    Shah, Vivek

    ... of these services. Existing popular approaches to building these services either use an in-memory database system or an actor runtime. We observe that these approaches have complementary strengths and weaknesses. In this dissertation, we propose the integration of actor programming models in database systems. In doing so, we lay down a vision for a new class of systems called actor database systems. To explore this vision, this dissertation crystallizes the notion of an actor database system by defining its feature set in light of current application and hardware trends. In order to explore the viability of the outlined vision, a new programming model named Reactors has been designed to enrich classic relational database programming models with logical actor programming constructs. To support the reactor programming model, a high-performance in-memory multi-core OLTP database system named REACTDB has been built ...

  20. Portuguese food composition database quality management system.

    Science.gov (United States)

    Oliveira, L M; Castanheira, I P; Dantas, M A; Porto, A A; Calhau, M A

    2010-11-01

    The harmonisation of food composition databases (FCDB) has been a recognised need among users, producers and stakeholders of food composition data (FCD). To reach harmonisation of FCDBs among the national compiler partners, the European Food Information Resource (EuroFIR) Network of Excellence set up a series of guidelines and quality requirements, together with recommendations to implement quality management systems (QMS) in FCDBs. The Portuguese National Institute of Health (INSA) is the national FCDB compiler in Portugal and is also a EuroFIR partner. INSA's QMS complies with ISO/IEC (International Organization for Standardisation/International Electrotechnical Commission) 17025 requirements. The purpose of this work is to report on the strategy used and progress made for extending INSA's QMS to the Portuguese FCDB in alignment with EuroFIR guidelines. A stepwise approach was used to extend INSA's QMS to the Portuguese FCDB. The approach included selection of reference standards and guides and the collection of relevant quality documents directly or indirectly related to the compilation process; selection of the adequate quality requirements; assessment of adequacy and level of requirement implementation in the current INSA's QMS; implementation of the selected requirements; and EuroFIR's preassessment 'pilot' auditing. The strategy used to design and implement the extension of INSA's QMS to the Portuguese FCDB is reported in this paper. The QMS elements have been established by consensus. ISO/IEC 17025 management requirements (except 4.5) and 5.2 technical requirements, as well as all EuroFIR requirements (including technical guidelines, FCD compilation flowchart and standard operating procedures), have been selected for implementation. The results indicate that the quality management requirements of ISO/IEC 17025 in place in INSA fit the needs for document control, audits, contract review, non-conformity work and corrective actions, and users' (customers

  1. Performance Assessment of Dynaspeak Speech Recognition System on Inflight Databases

    National Research Council Canada - National Science Library

    Barry, Timothy

    2004-01-01

    .... To aid in the assessment of various commercially available speech recognition systems, several aircraft speech databases have been developed at the Air Force Research Laboratory's Human Effectiveness Directorate...

  2. Extended functions of the database machine FREND for interactive systems

    International Nuclear Information System (INIS)

    Hikita, S.; Kawakami, S.; Sano, K.

    1984-01-01

    Well-designed visual interfaces encourage non-expert users to use relational database systems. In such systems, for example office automation systems or engineering database systems, non-expert users interactively access the database from visual terminals. Depending on the situation, some users may need exclusive use of the database while others share it. Because these jobs take a long time to complete, concurrency control must be well designed to enhance concurrency. The extended concurrency control method of FREND is presented in this paper. The authors assume that systems are composed of workstations, a local area network and the database machine FREND. This paper also stresses that the workstations and FREND must cooperate to complete concurrency control for interactive applications.

  3. MPS II drift chamber system

    International Nuclear Information System (INIS)

    Platner, E.D.

    1982-01-01

    The MPS II detectors are narrow drift space chambers designed for high position resolution in a magnetic field and in a very high particle flux environment. Central to this implementation was the development of three multi-channel custom ICs and one multi-channel hybrid. The system is deadtimeless and requires no corrections on an anode-to-anode basis. Operational experience and relevance to ISABELLE detectors are discussed.

  4. A relational database for physical data from TJ-II discharges

    International Nuclear Information System (INIS)

    Sanchez, E.; Portas, A.B.; Vega, J.

    2002-01-01

    A relational database (RDB) has been developed for classifying TJ-II experimental data according to physical criteria. Two objectives have been achieved: the design and implementation of the database, and software tools for data access that depend on a single software driver. TJ-II data were arranged in several tables with a flexible design, speedy performance, efficient search capacity, and adaptability to meet present and future requirements. The software has been developed to allow access to the TJ-II RDB from a variety of computer platforms (ALPHA AXP/Tru64 UNIX, CRAY/UNICOS, Intel Linux, Sparc/Solaris and Intel/Windows 95/98/NT) and programming languages (FORTRAN and C/C++). The database resides in a Windows NT Server computer and is managed by Microsoft SQL Server. The access software is based on open network computing remote procedure call and follows the client/server model. A server program running on the Windows NT computer controls data access. Operations on the database (through a local ODBC connection) are performed according to predefined permission protocols. A client library providing a set of basic functions for data integration and retrieval has been built in both static and dynamic link versions. The dynamic version is essential for accessing RDB data from 4GL environments (IDL and PV-WAVE among others).
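
    The "predefined permission protocols" idea in the abstract can be sketched as a thin gate in front of every database operation. This is a minimal illustration under assumed names (the roles, the operation set, and the `execute` function are all hypothetical, not the TJ-II library API):

    ```python
    # Hypothetical permission protocol: each role is mapped to the set of
    # database operations it may perform, and every request is checked
    # before the server would issue the real (e.g. ODBC) call.
    PERMISSIONS = {
        "reader":   {"select"},
        "compiler": {"select", "insert", "update"},
    }

    def execute(role, operation, payload):
        """Run `operation` only if the role's permission protocol allows it."""
        allowed = PERMISSIONS.get(role, set())
        if operation not in allowed:
            raise PermissionError(f"role {role!r} may not {operation}")
        # ... here the real server would perform the database operation ...
        return f"{operation} ok: {payload}"

    print(execute("reader", "select", "shot 12345"))
    ```
    
    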

  5. Foundations of database systems : an introductory tutorial

    NARCIS (Netherlands)

    Paredaens, J.; Paredaens, J.; Tenenbaum, L. A.

    1994-01-01

    A very short overview is given of the principles of databases. The entity relationship model is used to define the conceptual base. Furthermore, file management, the hierarchical model, the network model, the relational model and the object oriented model are discussed. During the Second World War, ...

  6. Nuclear data processing using a database management system

    International Nuclear Information System (INIS)

    Castilla, V.; Gonzalez, L.

    1991-01-01

    A database management system that permits the design of relational models was used to create an integrated database with experimental and evaluated nuclear data. A system that reduces the time and cost of processing was created for computers of type EC or compatibles. A set of programs was developed for the conversion from nuclear calculated data output format to EXFOR format. A dictionary to perform retrospective searches in the ENDF database was created as well.

  7. Developing of corrosion and creep property test database system

    International Nuclear Information System (INIS)

    Park, S. J.; Jun, I.; Kim, J. S.; Ryu, W. S.

    2004-01-01

    The corrosion and creep characteristics database systems were constructed using data produced from corrosion and creep tests. They are designed to share data and programs with the tensile, impact and fatigue characteristics databases constructed since 2001, and with other characteristics databases to be constructed in the future. We can easily retrieve basic data from the corrosion and creep characteristics database systems when preparing a new experiment, and can produce higher-quality results by comparing against previous test results. To construct the database, the development phase must be analysed and designed carefully; only then can the various requirements of customers be met with the best quality. In this report, we describe the analysis, design and development of the corrosion and creep characteristics database systems, developed as a web application using JSP (Java Server Pages).

  8. Developing of impact and fatigue property test database system

    International Nuclear Information System (INIS)

    Park, S. J.; Jun, I.; Kim, D. H.; Ryu, W. S.

    2003-01-01

    The impact and fatigue characteristics database systems were constructed using data produced from impact and fatigue tests. They are designed to share data and programs with the tensile characteristics database constructed in 2001, and with other characteristics databases to be constructed in the future. We can easily retrieve basic data from the impact and fatigue characteristics database systems when preparing a new experiment, and can produce higher-quality results by comparing against previous data. To construct the database, the development phase must be analysed and designed carefully; only then can the various requirements of customers be met with the best quality. In this report, we describe the analysis, design and development of the impact and fatigue characteristics database systems, developed as a web application using JSP (Java Server Pages).

  9. Conceptual design of nuclear power plants database system

    International Nuclear Information System (INIS)

    Ishikawa, Masaaki; Izumi, Fumio; Sudoh, Takashi.

    1984-03-01

    This report is the result of a joint study on the development of a nuclear power plant database system. The present conceptual design of the database system, which includes Japanese character processing and image processing, has been based on the safety design parameters found mainly in the application documents for reactor construction permits made available to the public. (author)

  10. Distributed Database Control and Allocation. Volume 3. Distributed Database System Designer’s Handbook.

    Science.gov (United States)

    1983-10-01

    Multiversion Data 2-18; 2.7.1 Multiversion Timestamping 2-20; 2.7.2 Multiversion Locking 2-20; 2.8 Combining the Techniques 2-22; 3. Database Recovery Algorithms ... See [THOM79, GIFF79] for details. 2.7 Multiversion Data. Let us return to a database system model where each logical data item is stored at one DM ... In a multiversion database, each Write wi[x] produces a new copy (or version) of x, denoted xi. Thus, the value of x is a set of versions. For each ...
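
    The multiversion model the fragment describes (each write appends a new version; a read sees the latest version not newer than the reader) can be sketched in a few lines. This is a toy illustration of timestamp-ordered versioning, not the handbook's algorithms:

    ```python
    import bisect

    class MVStore:
        """Toy multiversion store: write wi[x] appends a version of x tagged
        with the writer's timestamp; read(ts, x) returns the latest version
        with timestamp <= ts."""
        def __init__(self):
            self.versions = {}  # x -> sorted list of (timestamp, value)

        def write(self, ts, x, value):
            bisect.insort(self.versions.setdefault(x, []), (ts, value))

        def read(self, ts, x):
            vs = self.versions.get(x, [])
            # Find the rightmost version whose timestamp is <= ts.
            i = bisect.bisect_right(vs, (ts, chr(0x10FFFF)))
            return vs[i - 1][1] if i else None

    s = MVStore()
    s.write(1, "x", "a")
    s.write(5, "x", "b")
    print(s.read(3, "x"))  # a reader at ts=3 sees the version written at ts=1
    ```
    
    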

  11. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Non-relational "NoSQL" databases such as Cassandra and CouchDB are best known for their ability to scale to large numbers of clients spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects, is based on traditional SQL databases but also has the same high scalability and wide-area distributability for an important subset of applications. This paper compares the architectures, behavior, performance, and maintainability of the two different approaches and identifies the criteria for choosing which approach to prefer over the other.
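
    The read-through caching idea at the heart of such a system can be sketched as follows. This is illustrative only (Frontier itself caches HTTP responses to SQL queries in proxy servers): a cache layer answers repeated identical queries without touching the backend database, which is what makes wide-area scaling to many readers possible.

    ```python
    # Minimal read-through cache sketch: the first occurrence of a query
    # goes to the backend; identical repeats are served from the cache.
    class ReadThroughCache:
        def __init__(self, backend):
            self.backend = backend          # callable: query -> result
            self.cache = {}
            self.hits = self.misses = 0

        def query(self, q):
            if q in self.cache:
                self.hits += 1
            else:
                self.misses += 1
                self.cache[q] = self.backend(q)
            return self.cache[q]

    db_calls = []
    def backend(q):
        db_calls.append(q)                  # record actual database traffic
        return f"result of {q}"

    c = ReadThroughCache(backend)
    c.query("SELECT 1"); c.query("SELECT 1"); c.query("SELECT 2")
    print(c.hits, c.misses, len(db_calls))  # 1 2 2
    ```

    Three client queries produce only two backend calls; with many clients behind shared caches, the backend load grows with the number of distinct queries, not the number of readers.
    
    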

  12. A user's manual for managing database system of tensile property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kim, D. H.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the tensile database system, which manages tensile property test data. The database, built from data produced in tensile property tests, increases the usefulness of test results. We can easily retrieve basic data from the database when preparing a new experiment, and can produce better results by comparing against previous data. To develop the database we must analyse and design the application carefully; only then can the various requirements of customers be met with the best quality. The tensile database system was developed as a web application using Java, PL/SQL and JSP (Java Server Pages).

  13. Formalization of Database Systems -- and a Formal Definition of {IMS}

    DEFF Research Database (Denmark)

    Bjørner, Dines; Løvengreen, Hans Henrik

    1982-01-01

    Drawing upon an analogy between programming language systems and database systems, we outline the requirements that architectural specifications of database systems must fulfill, and argue that only formal, mathematical definitions may satisfy these. Then we illustrate some aspects and touch upon some uses of formal definitions of data models and database management systems. A formal model of IMS will carry this discussion. Finally we survey some of the existing literature on formal definitions of database systems. The emphasis will be on constructive definitions in the denotational semantics style of the VDM: Vienna Development Method. The role of formal definitions in international standardisation efforts is briefly mentioned.

  14. [The future of clinical laboratory database management system].

    Science.gov (United States)

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and the Clinical Laboratory System is explained in this study. Although three kinds of database management systems (DBMS) were considered, including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database, based on our experience and the development of several clinical laboratory expert systems. As a future clinical laboratory database management system, an IC card system connected to an automatic chemical analyzer is proposed for personal health data management, and a microscope/video system is proposed for dynamic data management of leukocytes and bacteria.

  15. TRENDS: The aeronautical post-test database management system

    Science.gov (United States)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed, and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations that led to the system's configuration are discussed, and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  16. The TJ-II Relational Database Access Library: A User's Guide; Libreria de Acceso a la Base de Datos Relacional de TJ-II: Guia del Usuario

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E.; Portas, A. B.; Vega, J.

    2003-07-01

    A relational database has been developed to store data representing physical values from TJ-II discharges. This new database complements the existing TJ-II raw data database. The database resides in a host computer running the Windows 2000 Server operating system and is managed by SQL Server. A function library has been developed that permits remote access to these data, via remote procedure call, from user programs running on computers connected to the TJ-II local area networks. This document provides a general description of the database and its organization, a detailed description of the functions included in the library, and examples of how to use these functions in computer programs written in the FORTRAN and C languages. (Author) 8 refs.

  17. Optimistic protocol for partitioned distributed database systems

    International Nuclear Information System (INIS)

    Davidson, S.B.

    1982-01-01

    A protocol for transaction processing during partition failures is presented which guarantees mutual consistency between copies of data-items after repair is completed. The protocol is optimistic in that transactions are processed without restrictions during the failure; conflicts are detected at repair time using a precedence graph and are resolved by backing out transactions according to some backout strategy. The protocol is then evaluated using simulation and probabilistic modeling. In the simulation, several parameters are varied such as the number of transactions processed in a group, the type of transactions processed, the number of data-items present in the database, and the distribution of references to data-items. The simulation also uses different backout strategies. From these results we note conditions under which the protocol performs well, i.e., conditions under which the protocol backs out a small percentage of the transaction run. A probabilistic model is developed to estimate the expected number of transactions backed out using most of the above database and transaction parameters, and is shown to agree with simulation results. Suggestions are then made on how to improve the performance of the protocol. Insights gained from the simulation and probabilistic modeling are used to develop a backout strategy which takes into account individual transaction costs and attempts to minimize total backout cost. Although the problem of choosing transactions to minimize total backout cost is, in general, NP-complete, the backout strategy is efficient and produces very good results
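
    The repair-time conflict detection the abstract describes can be illustrated with a small precedence-graph check: conflicts between transactions run in different partitions form directed edges, and a cycle means mutual consistency was violated, so some transaction on the cycle must be backed out. This sketch only finds a cycle; a real backout strategy, as the abstract notes, would weigh per-transaction backout costs.

    ```python
    # Illustrative precedence-graph cycle detection (DFS with three colors).
    # Edges (a, b) mean transaction a must precede b due to a conflict.
    def find_cycle(edges):
        graph = {}
        for a, b in edges:
            graph.setdefault(a, []).append(b)
        WHITE, GREY, BLACK = 0, 1, 2
        color, stack = {}, []

        def dfs(u):
            color[u] = GREY
            stack.append(u)
            for v in graph.get(u, []):
                if color.get(v, WHITE) == GREY:
                    return stack[stack.index(v):]      # nodes on the cycle
                if color.get(v, WHITE) == WHITE:
                    c = dfs(v)
                    if c:
                        return c
            color[u] = BLACK
            stack.pop()
            return None

        for node in list(graph):
            if color.get(node, WHITE) == WHITE:
                c = dfs(node)
                if c:
                    return c
        return None

    # T1 (partition A) and T2 (partition B) accessed conflicting data items
    # in both directions, so one of them must be backed out at repair time:
    print(find_cycle([("T1", "T2"), ("T2", "T1"), ("T2", "T3")]))
    ```
    
    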

  18. Advanced approaches to intelligent information and database systems

    CERN Document Server

    Boonjing, Veera; Chittayasothorn, Suphamit

    2014-01-01

    This book consists of 35 chapters presenting different theoretical and practical aspects of intelligent information and database systems. Nowadays both intelligent systems and database systems are applied in most areas of human activity, which necessitates further research in these areas. In this book, various interesting issues related to intelligent information models and methods, their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications are presented and discussed from both practical and theoretical points of view. The book is organized in four parts devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be interesting for both practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced ...

  19. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    Science.gov (United States)

    Dykstra, Dave

    2012-12-01

    One of the main attractions of non-relational “NoSQL” databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  20. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    International Nuclear Information System (INIS)

    Dykstra, Dave

    2012-01-01

    One of the main attractions of non-relational “NoSQL” databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  1. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Document Server

    Dykstra, David

    2012-01-01

    One of the main attractions of non-relational "NoSQL" databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also has high scalability and wide-area distributability for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  2. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    Energy Technology Data Exchange (ETDEWEB)

    Dykstra, Dave [Fermilab

    2012-07-20

    One of the main attractions of non-relational NoSQL databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  3. Switching the Fermilab Accelerator Control System to a relational database

    International Nuclear Information System (INIS)

    Shtirbu, S.

    1993-01-01

    The accelerator control system ("ACNET") at Fermilab uses an in-house database written in assembly language. The database holds device information, which is mostly used for finding out how to read/set devices and how to interpret alarms. This is a very efficient implementation, but it lacks the needed flexibility and forces applications to store data in private/shared files. This database is being replaced by an off-the-shelf relational database (Sybase). The major constraints on switching are the necessity to maintain or improve response time and to minimize changes to existing applications. Innovative methods are used to help achieve the required performance, and a layer-seven gateway simulates the old database for existing programs. The new database runs on a DEC ALPHA/VMS platform and provides better performance. The switch is also exposing problems with the data currently stored in the database, and is helping to clean up erroneous data. The flexibility of the new relational database will facilitate many new applications in the future (e.g. a 3D presentation of device locations). The new database is expected to fully replace the old database during this summer's shutdown.

  4. System factors influencing utilisation of Research4Life databases by ...

    African Journals Online (AJOL)

    This is a comprehensive investigation of the influence of system factors on utilisation of Research4Life databases. It is part of a doctoral dissertation. Research4Life databases are new innovative technologies being investigated in a new context – utilisation by NARIs scientists for research. The study adopted the descriptive ...

  5. Online-Expert: An Expert System for Online Database Selection.

    Science.gov (United States)

    Zahir, Sajjad; Chang, Chew Lik

    1992-01-01

    Describes the design and development of a prototype expert system called ONLINE-EXPERT that helps users select online databases and vendors that meet users' needs. Search strategies are discussed; knowledge acquisition and knowledge bases are described; and the Analytic Hierarchy Process (AHP), a decision analysis technique that ranks databases,…

  6. A59 Drum Activity database (DRUMAC): system documentation

    International Nuclear Information System (INIS)

    Keel, Alan.

    1993-01-01

    This paper sets out the requirements, database design, software module designs and test plans for DRUMAC (the Active handling Building Drum Activity Database) - a computer-based system to record the radiological inventory for LLW/ILW drums dispatched from the Active Handling Building. (author)

  7. Plant operation data collection and database management using NIC system

    International Nuclear Information System (INIS)

    Inase, S.

    1990-01-01

    The Nuclear Information Center (NIC), a division of the Central Research Institute of Electric Power Industry, collects nuclear power plant operation and maintenance information both in Japan and abroad, and transmits the information to all domestic utilities so that it can be effectively utilized for safe plant operation and reliability enhancement. The collected information is entered into the database system after being keyworded by NIC. The database system, the Nuclear Information database/Communication System (NICS), has been developed by NIC for storage and management of the collected information. The keywords serve two objectives: retrieval, and classification by keyword category.

  8. Design of SMART alarm system using main memory database

    International Nuclear Information System (INIS)

    Jang, Kue Sook; Seo, Yong Seok; Park, Keun Oak; Lee, Jong Bok; Kim, Dong Hoon

    2001-01-01

    To achieve the design goals of the SMART alarm system, we first have to decide how to handle and manage alarm information and how to use the database. This paper therefore analyses the concepts and deficiencies of main memory databases applied in real-time systems. It sets out the structure and processing principles of a main memory database using nonvolatile memory such as flash memory, and develops a recovery strategy and processing board structures based on these. The paper shows that the resulting design of the SMART alarm system suits its functions and requirements.

  9. Performance analysis of different database in new internet mapping system

    Science.gov (United States)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. To deal with large volumes of mapping-entry update and query requests, the Mapping System must use a high-performance database. In this paper, we focus on the performance of three typical databases: Redis, SQLite, and MySQL. The results show that Mapping Systems based on different databases can be matched to different needs according to the actual situation.
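The AID-to-RID mapping store that this record benchmarks can be sketched with an embedded database. The following is a minimal illustration using Python's built-in sqlite3; the table and function names are assumptions for illustration, not taken from the paper:

```python
import sqlite3

def make_mapping_store():
    """In-memory SQLite store for AID -> RID mapping entries."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE mapping (aid TEXT PRIMARY KEY, rid TEXT NOT NULL)")
    return db

def upsert(db, aid, rid):
    """Mapping update request: add a new entry or replace an existing one."""
    db.execute("INSERT OR REPLACE INTO mapping (aid, rid) VALUES (?, ?)",
               (aid, rid))

def resolve(db, aid):
    """Query request: resolve an AID to its current RID (None if unmapped)."""
    row = db.execute("SELECT rid FROM mapping WHERE aid = ?", (aid,)).fetchone()
    return row[0] if row else None
```

The same store interface could be backed by Redis or MySQL, which is what makes the paper's head-to-head comparison possible.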

  10. A Grid Architecture for Manufacturing Database System

    Directory of Open Access Journals (Sweden)

    Laurentiu CIOVICĂ

    2011-06-01

Before the Enterprise Resource Planning concept, business functions within enterprises were supported by small, isolated applications, most of them developed internally. Yet today ERP platforms are not by themselves the answer to all organizational needs, especially in times of differentiated and diversified demands among end customers. ERP platforms have therefore been integrated with specialized systems for the management of clients (Customer Relationship Management) and of vendors (Supplier Relationship Management), and with Manufacturing Execution Systems for better planning and control of production lines. In order to offer efficient, real-time answers at the management level, ERP systems have also been integrated with Business Intelligence systems. This paper analyses the advantages of grid computing at this level of integration, communication and interoperability between complex specialized informatics systems, with a focus on system architecture and database systems.

  11. A Data Analysis Expert System For Large Established Distributed Databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large, isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments which allow cross-vendor compatibility. The goal of the DANMDS system is commensurate with the central dilemma confronting most large companies and institutions in America: the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  12. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  13. Improving Timeliness in Real-Time Secure Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H; David, Rasikan; Thuraisingham, Bhavani

    2006-01-01

    .... In addition to real-time requirements, security is usually required in many applications. Multilevel security requirements introduce a new dimension to transaction processing in real-time database systems...

  14. Centralized database for interconnection system design. [for spacecraft

    Science.gov (United States)

    Billitti, Joseph W.

    1989-01-01

    A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.

  15. Development of a Relational Database for Learning Management Systems

    Science.gov (United States)

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

In today's world, Web-based Distance Education Systems are of great importance; they are usually known as Learning Management Systems (LMS). In this article, a database design developed to serve an educational institution as a Learning Management System is described. In this sense, the developed Learning…

  16. PFTijah: text search in an XML database system

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Rode, H.; van Os, R.; Flokstra, Jan

    2006-01-01

    This paper introduces the PFTijah system, a text search system that is integrated with an XML/XQuery database management system. We present examples of its use, we explain some of the system internals, and discuss plans for future work. PFTijah is part of the open source release of MonetDB/XQuery.

  17. Integration of functions in logic database systems

    NARCIS (Netherlands)

    Lambrichts, E.; Nees, P.; Paredaens, J.; Peelman, P.; Tanca, L.

    1990-01-01

We extend Datalog, a logic programming language for rule-based systems, by integrating types, negation and functions, respectively. This extension of Datalog is called MilAnt. Furthermore, MilAnt consistency is defined as a stronger form of consistency for functions. It is known that consistency for…

  18. Expert system for quality control in the INIS database

    International Nuclear Information System (INIS)

    Todeschini, C.; Tolstenkov, A.

    1990-05-01

An expert system developed to identify input items to the INIS database with a high probability of containing errors is described. The system employs a Knowledge Base constructed by interpreting a large number of intellectual choices, or expert decisions, made by human indexers and incorporated in the INIS database. On the basis of the descriptor indexing, the system checks the correctness of the categorization. A notable feature of the system is its capability of self-improvement through continuous updating of the Knowledge Base. The expert system has also been found to be extremely useful in identifying documents with poor indexing. 3 refs, 9 figs
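The self-improving check this record describes can be illustrated with a toy version: learn descriptor-to-category statistics from already-indexed records, then flag items whose assigned category is rarely seen with their descriptors. All names, the category codes, and the threshold below are assumptions for illustration, not details from the paper:

```python
from collections import Counter, defaultdict

def build_knowledge_base(records):
    """Learn descriptor -> category counts from human-indexed records.
    records: iterable of (descriptors, category) pairs."""
    kb = defaultdict(Counter)
    for descriptors, category in records:
        for d in descriptors:
            kb[d][category] += 1
    return kb

def is_suspect(kb, descriptors, category, threshold=0.2):
    """Flag an item if its category receives a low share of the votes
    cast by its descriptors' historical category counts."""
    votes = Counter()
    for d in descriptors:
        votes.update(kb.get(d, Counter()))
    total = sum(votes.values())
    if total == 0:
        return True  # unseen descriptors: refer to a human reviewer
    return votes[category] / total < threshold
```

Re-running `build_knowledge_base` as newly indexed records accumulate mirrors the paper's "continuous updating of the Knowledge Base".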

  19. Expert system for quality control in the INIS database

    Energy Technology Data Exchange (ETDEWEB)

    Todeschini, C; Tolstenkov, A [International Atomic Energy Agency, Vienna (Austria)

    1990-05-01

An expert system developed to identify input items to the INIS database with a high probability of containing errors is described. The system employs a Knowledge Base constructed by interpreting a large number of intellectual choices, or expert decisions, made by human indexers and incorporated in the INIS database. On the basis of the descriptor indexing, the system checks the correctness of the categorization. A notable feature of the system is its capability of self-improvement through continuous updating of the Knowledge Base. The expert system has also been found to be extremely useful in identifying documents with poor indexing. 3 refs, 9 figs.

  20. Optimization of Extended Relational Database Systems

    Science.gov (United States)

    1986-07-23

control functions are integrated into a single system in a homogeneous way. As a first example, consider previous work in supporting various semantic...sizes are reduced and, consequently, the number of materializations that will be needed is also lower. For example, in the above query tuple...retrieve (EMP.name) where EMP.hobbies.instrument = 'violin'. When the various entries in the hobbies field are materialized, only those queries that

  1. The Database Driven ATLAS Trigger Configuration System

    CERN Document Server

    Martyniuk, Alex; The ATLAS collaboration

    2015-01-01

This contribution describes the trigger selection configuration system of the ATLAS low- and high-level trigger (HLT) and the upgrades it received in preparation for LHC Run 2. The ATLAS trigger configuration system is responsible for applying the physics selection parameters for the online data taking at both trigger levels and for the proper connection of the trigger lines across those levels. Here the low-level trigger consists of the already existing central trigger (CT) and the new Level-1 Topological trigger (L1Topo), which has been added for Run 2. In detail, the tasks of the configuration system during online data taking are: application of the selection criteria (e.g. energy cuts, minimum multiplicities, trigger object correlation) at the three trigger components L1Topo, CT, and HLT; and on-the-fly (e.g. rate-dependent) generation and application of prescale factors to the CT and HLT to adjust the trigger rates to the data taking conditions, such as falling luminosity or rate spikes in the detector readout ...

  2. Database design for Physical Access Control System for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sathishkumar, T., E-mail: satishkumart@igcar.gov.in; Rao, G. Prabhakara, E-mail: prg@igcar.gov.in; Arumugam, P., E-mail: aarmu@igcar.gov.in

    2016-08-15

Highlights: • Database design needs to be optimized and highly efficient for real time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: An RFID (Radio Frequency IDentification) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). This design also illustrates the mapping between the Employee Groups (EG) and the AZ. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.
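The grouping idea in this abstract, mapping Employee Groups to Access Zones instead of individual employees to individual doors, can be sketched in SQL. This is a minimal sketch via Python's sqlite3; every table, column and sample value is invented for illustration and is not the paper's actual schema:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE employee_group (eg_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE access_zone   (az_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, name TEXT,
                       eg_id INTEGER REFERENCES employee_group);
CREATE TABLE door     (door_id INTEGER PRIMARY KEY, label TEXT,
                       az_id INTEGER REFERENCES access_zone);
-- One row per (group, zone) pair replaces thousands of (employee, door) rows.
CREATE TABLE eg_az (eg_id INTEGER, az_id INTEGER, PRIMARY KEY (eg_id, az_id));

INSERT INTO employee_group VALUES (1, 'Operations'), (2, 'Visitors');
INSERT INTO access_zone   VALUES (1, 'Control Room'), (2, 'Lobby');
INSERT INTO employee VALUES (10, 'Asha', 1), (11, 'Ravi', 2);
INSERT INTO door     VALUES (100, 'CR-Main', 1), (101, 'Lobby-East', 2);
INSERT INTO eg_az VALUES (1, 1), (1, 2), (2, 2);
""")

def has_access(emp_id, door_id):
    """An employee may open a door iff their group is mapped to its zone."""
    row = db.execute("""
        SELECT 1
        FROM employee e
        JOIN door d  ON d.door_id = ?
        JOIN eg_az m ON m.eg_id = e.eg_id AND m.az_id = d.az_id
        WHERE e.emp_id = ?""", (door_id, emp_id)).fetchone()
    return row is not None
```

Granting a whole group access to a new zone then touches one `eg_az` row rather than one row per employee-door pair, which is the redundancy reduction the highlights describe.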

  3. Database design for Physical Access Control System for nuclear facilities

    International Nuclear Information System (INIS)

    Sathishkumar, T.; Rao, G. Prabhakara; Arumugam, P.

    2016-01-01

Highlights: • Database design needs to be optimized and highly efficient for real time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: An RFID (Radio Frequency IDentification) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). This design also illustrates the mapping between the Employee Groups (EG) and the AZ. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.

  4. Kingfisher: a system for remote sensing image database management

    Science.gov (United States)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases, with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many systems exist for photographic images, for example QBIC and IKONA, but they are not able to extract and properly describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without losses independently of image resolution. The latter property allows the DBMS (Database Management System) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of dimensions (width and height). Local and global content descriptors are compared with the query image during the retrieval phase, and the results seem very encouraging.

  5. A data analysis expert system for large established distributed databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  6. Databases in Cloud - Solutions for Developing Renewable Energy Informatics Systems

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2017-08-01

The paper presents the data model of a decision support prototype developed for generation monitoring, forecasting and advanced analysis in the renewable energy field. The solutions considered for developing this system include databases in the cloud, XML integration, spatial data representation and multidimensional modeling. This material shows the advantages of cloud databases and spatial data representation and their implementation in Oracle Database 12c. It also contains a data integration part and a multidimensional analysis. The output data are presented using dashboards.

  7. Insertion algorithms for network model database management systems

    Science.gov (United States)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for managing algorithms is to minimize the number of query comparisons. We consider the updating operation for network model database management systems and develop a new sequential algorithm for it. We also suggest a distributed version of the algorithm.
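The partial-order constraint on the schema graph mentioned in this record can be illustrated with a small sketch: adding a relationship arc between object types is allowed only if it keeps the graph acyclic. The names are illustrative and this is not the paper's algorithm:

```python
def reachable(schema, src, dst):
    """True if dst can be reached from src along relationship arcs.
    schema: dict mapping an object type to the set of its member types."""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(schema.get(node, ()))
    return False

def add_relationship(schema, owner, member):
    """Insert the arc owner -> member, rejecting it if a cycle would
    arise, so that the schema keeps forming a partial order."""
    if owner == member or reachable(schema, member, owner):
        raise ValueError("arc would break the partial order")
    schema.setdefault(owner, set()).add(member)
```

Each rejected arc costs one reachability check; minimizing such comparisons on large schemas is the efficiency concern the abstract raises.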

  8. Development of subsurface drainage database system for use in environmental management issues

    International Nuclear Information System (INIS)

    Azhar, A.H.; Rafiq, M.; Alam, M.M.

    2007-01-01

A simple, user-friendly, menu-driven database management system for the Impact of Subsurface Drainage Systems on Land and Water Conditions (ISLaW) has been developed for use in environment-management issues of the drainage areas. This database has been developed by integrating four software packages: Microsoft Excel, MS Word, Acrobat and MS Access. The information, in the form of tables and figures, with respect to various drainage projects has been presented in MS Word files. The major data-sets of the various subsurface drainage projects included in the ISLaW database are: i) technical aspects, ii) groundwater and soil-salinity aspects, iii) socio-technical aspects, iv) agro-economic aspects, and v) operation and maintenance aspects. The various ISLaW files can be accessed just by clicking the menu buttons of the database system. This database not only gives feedback on the functioning of different subsurface drainage projects with respect to the above-mentioned aspects, but also serves as a resource document for these data for future studies on other drainage projects. The developed database system is useful for planners, designers and farmers' organisations for improved operation of existing drainage projects as well as development of future ones. (author)

  9. A Transactional Asynchronous Replication Scheme for Mobile Database Systems

    Institute of Scientific and Technical Information of China (English)

    丁治明; 孟小峰; 王珊

    2002-01-01

In mobile database systems, the mobility of users has a significant impact on data replication. As a result, the various replica control protocols that exist today in traditional distributed and multidatabase environments are no longer suitable. To solve this problem, a new mobile database replication scheme, the Transaction-Level Result-Set Propagation (TLRSP) model, is put forward in this paper. The conflict detection and resolution strategy based on TLRSP is discussed in detail, and the implementation algorithm is proposed. In order to compare the performance of the TLRSP model with that of other mobile replication schemes, we have developed a detailed simulation model. Experimental results show that the TLRSP model provides efficient support for replicated mobile database systems by reducing reprocessing overhead and maintaining database consistency.
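A toy version of the conflict-detection idea behind result-set propagation (not the actual TLRSP algorithm, whose details are in the paper) can be sketched with per-item version numbers: a mobile transaction's result set is applied at the server only if nothing the transaction read has changed in the meantime; otherwise the mobile host must reprocess:

```python
class ReplicaServer:
    """Toy fixed server applying result sets propagated by mobile hosts."""

    def __init__(self):
        self.data = {}     # key -> value
        self.version = {}  # key -> monotonically increasing version

    def read(self, key):
        """Return (value, version) so a mobile host can record what it read."""
        return self.data.get(key), self.version.get(key, 0)

    def propagate(self, result_set, read_versions):
        """Apply a transaction-level result set, or reject it on conflict."""
        for key, seen in read_versions.items():
            if self.version.get(key, 0) != seen:
                return False  # conflict detected: reprocess on the mobile host
        for key, value in result_set.items():
            self.data[key] = value
            self.version[key] = self.version.get(key, 0) + 1
        return True
```

Propagating only the final result set, rather than every intermediate operation performed while disconnected, is what keeps the reprocessing overhead low.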

  10. Operational experience running the HERA-B database system

    International Nuclear Information System (INIS)

    Amaral, V.; Amorim, A.; Batista, J.

    2001-01-01

The HERA-B database system has been used in the commissioning period of the experiment. The authors present the expertise gathered during this period, covering the improvements introduced and describing the different classes of problems faced in giving persistency to all non-event information. They aim to give a global overview of the database group's activities, the techniques developed, and results from the running experiment, dealing with large data volumes during and after the production phase

  11. A comparison of database systems for XML-type data.

    Science.gov (United States)

    Risse, Judith E; Leunissen, Jack A M

    2010-01-01

In the field of bioinformatics interchangeable data formats based on XML are widely used. XML-type data is also at the core of most web services. With the increasing amount of data stored in XML comes the need for storing and accessing the data. In this paper we analyse the suitability of different database systems for storing and querying large datasets in general and Medline in particular. All reviewed database systems perform well when tested with small to medium sized datasets, however when the full Medline dataset is queried a large variation in query times is observed. There is not one system that is vastly superior to the others in this comparison and, depending on the database size and the query requirements, different systems are most suitable. The best all-round solution is the Oracle 11g database system using the new binary storage option. Alias-i's Lingpipe is a more lightweight, customizable and sufficiently fast solution. It does however require more initial configuration steps. For data with a changing XML structure Sedna and BaseX as native XML database systems or MySQL with an XML-type column are suitable.
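For data whose XML structure may change, even a schema-free scan is workable at small scale. Here is a minimal sketch with Python's standard xml.etree.ElementTree over an invented, Medline-like snippet; the real Medline schema is different and much richer:

```python
import xml.etree.ElementTree as ET

# Invented sample only; real Medline records use a different element layout.
SNIPPET = """\
<articles>
  <article pmid="1"><title>XML storage engines</title><year>2009</year></article>
  <article pmid="2"><title>Native XML databases</title><year>2010</year></article>
</articles>
"""

def titles_since(xml_text, year):
    """Return titles of articles published in `year` or later."""
    root = ET.fromstring(xml_text)
    return [a.findtext("title")
            for a in root.iter("article")
            if int(a.findtext("year", default="0")) >= year]
```

A native XML database such as Sedna or BaseX evaluates the equivalent XPath/XQuery over indexed storage, which is what makes the full-Medline comparison in the paper meaningful.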

  12. Database for fusion devices and associated fuel systems

    International Nuclear Information System (INIS)

    Woolgar, P.W.

    1983-03-01

    A computerized database storage and retrieval system has been set up for fusion devices and the associated fusion fuel systems which should be a useful tool for the CFFTP program and other users. The features of the Wang 'Alliance' system are discussed for this application, as well as some of the limitations of the system. Recommendations are made on the operation, upkeep and further development that should take place to implement and maintain the system

  13. The Eruption Forecasting Information System (EFIS) database project

    Science.gov (United States)

    Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather

    2016-04-01

The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) move away from relying on collective memory toward probability estimation using databases; (2) create databases useful for pattern recognition and for answering common VDAP questions, e.g., how commonly does unrest lead to eruption, how commonly do phreatic eruptions portend magmatic eruptions, and what is the range of antecedence times; (3) create generic probabilistic event trees using global data for different volcano 'types'; (4) create background, volcano-specific, probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis; and (5) quantify and communicate uncertainty in probabilities. A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc., and will initially be populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.

  14. ADVICE--Educational System for Teaching Database Courses

    Science.gov (United States)

    Cvetanovic, M.; Radivojevic, Z.; Blagojevic, V.; Bojovic, M.

    2011-01-01

    This paper presents a Web-based educational system, ADVICE, that helps students to bridge the gap between database management system (DBMS) theory and practice. The usage of ADVICE is presented through a set of laboratory exercises developed to teach students conceptual and logical modeling, SQL, formal query languages, and normalization. While…

  15. An Expert System Helps Students Learn Database Design

    Science.gov (United States)

    Post, Gerald V.; Whisenand, Thomas G.

    2005-01-01

    Teaching and learning database design is difficult for both instructors and students. Students need to solve many problems with feedback and corrections. A Web-based specialized expert system was created to enable students to create designs online and receive immediate feedback. An experiment testing the system shows that it significantly enhances…

  16. Data-based control tuning in master-slave systems

    NARCIS (Netherlands)

    Heertjes, M.F.; Temizer, B.

    2012-01-01

    For improved output synchronization in master-slave systems, a data-based control tuning is presented. Herein the coefficients of two finite-duration impulse response (FIR) filters are found through machine-in-the-loop optimization. One filter is used to shape the input to the slave system while the

  17. Quality assurance database for the CBM silicon tracking system

    Energy Technology Data Exchange (ETDEWEB)

    Lymanets, Anton [Physikalisches Institut, Universitaet Tuebingen (Germany); Collaboration: CBM-Collaboration

    2015-07-01

The Silicon Tracking System is the main tracking device of the CBM experiment at FAIR. Its construction includes the production, quality assurance and assembly of a large number of components, e.g., 106 carbon fiber support structures, 1300 silicon microstrip sensors, 16.6k readout chips, analog microcables, etc. Detector construction is distributed over several production and assembly sites and calls for a database that is extensible and allows tracing the components, integrating the test data, and monitoring the component statuses and data flow. A possible implementation of the above-mentioned requirements is being developed at GSI (Darmstadt) based on the FAIR DB Virtual Database Library, which provides connectivity to common SQL database engines (PostgreSQL, Oracle, etc.). The data structure, the database architecture and the status of the implementation are discussed.

  18. A database system for enhancing fuel records management capabilities

    International Nuclear Information System (INIS)

    Rieke, Phil; Razvi, Junaid

    1994-01-01

The need to modernize the system for managing a large variety of fuel-related data at the TRIGA Reactors Facility at General Atomics, as well as the need to improve NRC nuclear material reporting, prompted the development of a database covering all aspects of fuel records management. The TRIGA Fuel Database replaces (a) an index card system used for recording fuel movements, (b) hand calculations for uranium burnup, and (c) a somewhat aged and cumbersome system for recording fuel inspection results. It was developed using Microsoft Access, a relational database system for Windows. Instead of relying on various sources for element information, users may now review individual element statistics, record inspection results, calculate element burnup and more, all from within a single application. Taking full advantage of the ease-of-use features designed into Windows and Access, the user can enter and extract information easily through a number of customized on-screen forms, with a wide variety of reporting options available. All forms are accessed through a main 'Options' screen, with the options broken down by categories, including 'Elements', 'Special Elements/Devices', 'Control Rods' and 'Areas'. Relational integrity and data validation rules are enforced to help ensure that accurate and meaningful data are entered. Among other items, the database lets the user define: element types (such as FLIP or standard) and subtypes (such as fuel follower, instrumented, etc.), various inspection codes for standardizing inspection results, areas within the facility where elements are located, and the power factors associated with element positions within a reactor. Using fuel moves, power history, power factors and element types, the database tracks uranium burnup and plutonium buildup on a quarterly basis.
The Fuel Database was designed with end-users in mind and does not force an operations-oriented user to learn any programming or relational database theory in
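The quarterly burnup bookkeeping described in this record amounts to integrating each element's share of the core energy. The following is a minimal sketch in which the consumption rate, the position power factor, and all names are illustrative assumptions, not values or formulas from the facility:

```python
# Illustrative U-235 consumption rate per megawatt-hour of element energy;
# the real database would derive this from element type and enrichment.
U235_GRAMS_PER_MWH = 0.052

def quarterly_burnup(power_history, position_factor):
    """Grams of U-235 consumed by one element over a quarter.

    power_history: list of (core_power_MW, hours_at_power) entries
    position_factor: fraction of core power produced at this element's
                     position (the power factor kept in the database)
    """
    core_mwh = sum(power_mw * hours for power_mw, hours in power_history)
    return core_mwh * position_factor * U235_GRAMS_PER_MWH
```

Fuel moves enter the calculation by switching the `position_factor` whenever an element changes core position, so each quarter's history is summed piecewise.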

  19. Overview of the SECOM II communications system

    International Nuclear Information System (INIS)

    Olson, W.D.

    1977-04-01

    The conversion of the SECOM system to the SECOM II configuration has been completed and all communications between the central control station and vehicles carrying nuclear weapons and special nuclear materials are being handled by the SECOM II system. A summary of the system characteristics and the improvements achieved over the all-voice system is shown. Through the combined efforts of ALO and Sandia personnel, the transition from SECOM to SECOM II was achieved with a minimum disruption to the operation of the ERDA transportation system

  20. Database management in the new GANIL control system

    International Nuclear Information System (INIS)

    Lecorche, E.; Lermine, P.

    1993-01-01

At the start of the new control system design, the decision was made to manage the huge amount of data by means of a database management system. The first implementations, built on the INGRES relational database, are described. Real-time and data management domains are shown, and problems induced by Ada/SQL interfacing are briefly discussed. Database management concerns the whole hardware and software configuration for the GANIL pieces of equipment and the alarm system, both for the alarm configuration and for the alarm logs. Another field of application encompasses the archiving of beam parameters as a function of the various kinds of beams accelerated at GANIL (ion species, energies, charge states). (author) 3 refs., 4 figs

  1. Developing of database on nuclear power engineering and purchase of ORACLE system

    International Nuclear Information System (INIS)

    Liu Renkang

    1996-01-01

This paper presents a point of view on the development of a database for nuclear power engineering and on the performance of the ORACLE database management system, concluding that ORACLE is a practical database system to purchase

  2. NSLS-II Radio Frequency Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rose J.; Gao F.; Goel, A.; Holub, B.; Kulpin, J.; Marques, C.; Yeddulla, M.

    2015-05-03

    The National Synchrotron Light Source II is a 3 GeV X-ray user facility commissioned in 2014. The NSLS-II RF system consists of the master oscillator, digital low level RF controllers, linac, booster and storage ring RF sub-systems, as well as a supporting cryogenic system. Here we will report on RF commissioning and early operation experience of the system.

  3. Thermodynamic database for the Co-Pr system

    Directory of Open Access Journals (Sweden)

    S.H. Zhou

    2016-03-01

    Full Text Available In this article, we describe data on (1) compositions for both as-cast and heat-treated specimens, summarized in Table 1; (2) the determined enthalpy of mixing of the liquid phase, listed in Table 2; and (3) a thermodynamic database of the Co-Pr system in TDB format for the research article entitled "Chemical partitioning for the Co-Pr system: First-principles, experiments and energetic calculations to investigate the hard magnetic phase W". Keywords: Thermodynamic database of Co-Pr, Solution calorimeter measurement, Phase diagram Co-Pr

  4. JAERI Material Performance Database (JMPD); outline of the system

    International Nuclear Information System (INIS)

    Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime.

    1991-01-01

    JAERI Material Performance Database (JMPD) has been developed since 1986 in JAERI with a view to utilizing the various kinds of characteristic data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. JMPD currently serves the following data: (1) data yielded from the research activities of JAERI, including fatigue crack growth data of LWR pressure vessel materials as well as creep and fatigue data of Hastelloy XR, the alloy developed for the High Temperature Gas-cooled Reactor (HTGR); (2) data on environmentally assisted cracking of LWR materials arranged by the Electric Power Research Institute (EPRI), including fatigue crack growth data (3000 tests), stress corrosion data (500 tests) and Slow Strain Rate Technique (SSRT) data (1000 tests). In order to improve the user-friendliness of the retrieval system, menu-selection procedures have been developed so that knowledge of the system and data structures is not required of end-users. In addition, retrieval via database commands in Structured Query Language (SQL) is supported by the relational database management system. In JMPD the retrieved data can be processed readily through supporting systems for graphical and statistical analyses. The present report outlines JMPD and describes procedures for data retrieval and analyses utilizing JMPD. (author)

  5. Virus Database and Online Inquiry System Based on Natural Vectors.

    Science.gov (United States)

    Dong, Rui; Zheng, Hui; Tian, Kun; Yau, Shek-Chung; Mao, Weiguang; Yu, Wenping; Yin, Changchuan; Yu, Chenglong; He, Rong Lucy; Yang, Jie; Yau, Stephen St

    2017-01-01

    We construct a virus database called VirusDB (http://yaulab.math.tsinghua.edu.cn/VirusDB/) and an online inquiry system to serve people who are interested in viral classification and prediction. The database stores all viral genomes, their corresponding natural vectors, and the classification information of the single/multiple-segmented viral reference sequences downloaded from the National Center for Biotechnology Information. The online inquiry system computes natural vectors and their distances for submitted genomes, provides an online interface for accessing and using the database for viral classification and prediction, and runs back-end processes for automatic and manual updating of database content to synchronize with GenBank. Genomes submitted in FASTA format are processed, and the prediction results, with the 5 closest neighbors and their classifications, are returned by email. Considering the one-to-one correspondence between sequence and natural vector, its time efficiency, and its high accuracy, the natural vector method is a significant advance over alignment methods, which makes VirusDB a useful database for further research.
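    The natural vector construction itself is simple enough to sketch. The following illustrative Python (not the authors' code; VirusDB's exact normalization and use of higher-order moments may differ) builds a 12-dimensional vector from a DNA sequence, one (count, mean position, normalized second central moment) triple per nucleotide, and compares genomes by Euclidean distance:

```python
def natural_vector(seq):
    """12-dim natural vector of a DNA sequence: for each nucleotide,
    its count, mean position, and normalized 2nd central moment."""
    seq = seq.upper()
    L = len(seq)
    vec = []
    for nt in "ACGT":
        pos = [i + 1 for i, c in enumerate(seq) if c == nt]
        n = len(pos)
        if n == 0:
            vec += [0.0, 0.0, 0.0]
            continue
        mu = sum(pos) / n
        d2 = sum((p - mu) ** 2 for p in pos) / (n * L)  # normalized spread
        vec += [float(n), mu, d2]
    return vec

def distance(u, v):
    """Euclidean distance between two natural vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
```

    Classification then reduces to nearest-neighbor search over the stored vectors, which is what makes the approach fast compared with alignment.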

  6. Design of database management system for 60Co container inspection system

    International Nuclear Information System (INIS)

    Liu Jinhui; Wu Zhifang

    2007-01-01

    The functions of the database management system have been designed according to the features of the cobalt-60 container inspection system, and the corresponding software has been constructed. Database querying and searching are included in the software. The database operation program is built on Microsoft SQL Server and Visual C++ under Windows 2000. The software realizes database querying, image and graph display, statistics, report generation and printing, interface design, etc. The software is powerful and flexible in operation and information querying, and it has been successfully used in the real database management system of the cobalt-60 container inspection system. (authors)

  7. YUCSA: A CLIPS expert database system to monitor academic performance

    Science.gov (United States)

    Toptsis, Anestis A.; Ho, Frankie; Leindekar, Milton; Foon, Debra Low; Carbonaro, Mike

    1991-01-01

    The York University CLIPS Student Administrator (YUCSA), an expert database system implemented in C Language Integrated Processing System (CLIPS), for monitoring the academic performance of undergraduate students at York University, is discussed. The expert system component in the system has already been implemented for two major departments, and it is under testing and enhancement for more departments. Also, more elaborate user interfaces are under development. We describe the design and implementation of the system, problems encountered, and immediate future plans. The system has excellent maintainability and it is very efficient, taking less than one minute to complete an assessment of one student.

  8. Answering biological questions: Querying a systems biology database for nutrigenomics

    NARCIS (Netherlands)

    Evelo, C.T.; Bochove, K. van; Saito, J.T.

    2011-01-01

    The requirement of systems biology for connecting different levels of biological research leads directly to a need for integrating vast amounts of diverse information in general and of omics data in particular. The nutritional phenotype database addresses this challenge for nutrigenomics. A

  9. Research and Implementation of Distributed Database HBase Monitoring System

    Directory of Open Access Journals (Sweden)

    Guo Lisi

    2017-01-01

    Full Text Available With the arrival of the big data age, the distributed database HBase has become an important tool for storing data. The normal operation of the HBase database is an important guarantee of the security of data storage, so designing a reasonable HBase monitoring system is of great practical significance. In this article, we introduce the solution, which contains performance monitoring and fault alarm function modules, to meet an operator's demand for HBase database monitoring in their actual production projects. We designed a monitoring system that consists of a flexible and extensible monitoring agent, a monitoring server based on the SSM architecture, and a concise monitoring display layer. Moreover, to deal with the problem of pages rendering too slowly in actual operation, we present a solution: reducing the number of SQL queries. It has been proved that reducing SQL queries can effectively improve system performance and user experience. The system works well in monitoring the status of the HBase database, flexibly extending the monitoring indices, and issuing a warning when a fault occurs, so that it improves the working efficiency of the administrator and ensures the smooth operation of the project.
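    The "reduce SQL queries" remedy mentioned above is the classic fix for the N+1 query pattern: fetch a page's worth of rows in one batched query instead of one query per item. A minimal sketch with Python's stdlib sqlite3 (table and column names are invented for illustration; the paper's system targets HBase and its own schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE region_status (region TEXT, status TEXT)")
conn.executemany("INSERT INTO region_status VALUES (?, ?)",
                 [(f"r{i}", "healthy") for i in range(100)])

regions = ["r1", "r2", "r3"]

# N+1 pattern: one round trip per item -- this is what makes pages render slowly
slow = [conn.execute("SELECT status FROM region_status WHERE region = ?",
                     (r,)).fetchone()[0] for r in regions]

# batched: a single query covering every item on the page
marks = ",".join("?" * len(regions))
fast = dict(conn.execute(
    f"SELECT region, status FROM region_status WHERE region IN ({marks})",
    regions).fetchall())

assert slow == [fast[r] for r in regions]
```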

  10. JOYO coolant sodium and cover gas purity control database (MK-II core)

    International Nuclear Information System (INIS)

    Ito, Kazuhiro; Nemoto, Masaaki

    2000-03-01

    The experimental fast reactor 'JOYO' served as the MK-II irradiation bed core for testing fuel and material for FBR development for 15 years, from 1982 to 1997. During MK-II operation, impurity concentrations in the sodium and the argon gas were determined from 67 samples of primary sodium, 81 samples of secondary sodium, 75 samples of primary argon gas, 89 samples of secondary argon gas (the overflow tank) and 89 samples of secondary argon gas (the dump tank). The sodium and argon gas purity control data were accumulated over thirty-one duty operations, thirteen special test operations and eight annual inspections. These purity control results and related plant data were compiled into a database recorded on CD-ROM for user convenience. The purity control data include the concentrations of oxygen, carbon, hydrogen, nitrogen, chlorine, iron, nickel and chromium in sodium, and of oxygen, hydrogen, nitrogen, carbon dioxide, methane and helium in argon gas, together with the reactor condition. (author)

  11. 8th Asian Conference on Intelligent Information and Database Systems

    CERN Document Server

    Madeyski, Lech; Nguyen, Ngoc

    2016-01-01

    The objective of this book is to contribute to the development of the intelligent information and database systems with the essentials of current knowledge, experience and know-how. The book contains a selection of 40 chapters based on original research presented as posters during the 8th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2016) held on 14–16 March 2016 in Da Nang, Vietnam. The papers to some extent reflect the achievements of scientific teams from 17 countries in five continents. The volume is divided into six parts: (a) Computational Intelligence in Data Mining and Machine Learning, (b) Ontologies, Social Networks and Recommendation Systems, (c) Web Services, Cloud Computing, Security and Intelligent Internet Systems, (d) Knowledge Management and Language Processing, (e) Image, Video, Motion Analysis and Recognition, and (f) Advanced Computing Applications and Technologies. The book is an excellent resource for researchers, those working in artificial intelligence, mu...

  12. Thermodynamic database for the Co-Pr system.

    Science.gov (United States)

    Zhou, S H; Kramer, M J; Meng, F Q; McCallum, R W; Ott, R T

    2016-03-01

    In this article, we describe data on (1) compositions for both as-cast and heat-treated specimens, summarized in Table 1; (2) the determined enthalpy of mixing of the liquid phase, listed in Table 2; and (3) a thermodynamic database of the Co-Pr system in TDB format for the research article entitled "Chemical partitioning for the Co-Pr system: First-principles, experiments and energetic calculations to investigate the hard magnetic phase W".

  13. A Tactical Database for the Low Cost Combat Direction System

    Science.gov (United States)

    1990-12-01

    A Tactical Database for the Low Cost Combat Direction System by Everton G. de Paula, Captain, Brazilian Air Force, B.S., Instituto Tecnologico de...objects as a unit. The AVANCE object management system [Ref. 29] uses the timestamp model (pessimistic approach) for concurrency control. The Vbase...are no longer used). In AVANCE [Ref. 29], garbage collection is performed on user request. In GemStone [Ref. 25], garbage collection is executed in

  14. A Bayesian model for anomaly detection in SQL databases for security systems

    NARCIS (Netherlands)

    Drugan, M.M.

    2017-01-01

    We focus on automatic anomaly detection in SQL databases for security systems. Many logs of database systems, here the Townhall database, contain detailed information about users, like the SQL queries and the response of the database. A database is a list of log instances, where each log instance is
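    The abstract is cut off, but the general shape of such a model can be sketched: treat each log instance as a tuple of categorical features, estimate per-feature frequencies from normal traffic with Laplace smoothing, and flag instances whose (naively independence-assuming) log-likelihood is unusually low. Everything below, features included, is an illustrative guess, not the paper's model:

```python
from collections import Counter
import math

def train(logs):
    """Fit per-feature frequency tables from a list of log tuples."""
    return {"n": len(logs),
            "feats": [Counter(row[i] for row in logs)
                      for i in range(len(logs[0]))]}

def log_likelihood(model, row, alpha=1.0):
    """Naive (feature-independent) log-likelihood with Laplace smoothing."""
    n = model["n"]
    ll = 0.0
    for counts, value in zip(model["feats"], row):
        ll += math.log((counts[value] + alpha)
                       / (n + alpha * (len(counts) + 1)))
    return ll

# toy training log: (user, SQL command, table)
logs = ([("alice", "SELECT", "citizens")] * 50
        + [("bob", "SELECT", "permits")] * 50)
model = train(logs)

# a never-seen command scores far lower than routine traffic
assert log_likelihood(model, ("alice", "DROP", "citizens")) < \
       log_likelihood(model, ("alice", "SELECT", "citizens"))
```

    In practice, instances scoring below a threshold calibrated on held-out normal logs would be raised as anomalies.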

  15. LHC II system sensitivity to magnetic fluids

    CERN Document Server

    Cotae, Vlad

    2005-01-01

    Experiments have been designed to reveal the influences of ferrofluid treatment and static magnetic field exposure on photosynthetic system II, where the light harvesting complex (LHC II) controls the chlorophyll a/chlorophyll b ratio (revealing, indirectly, the photosynthesis rate). Spectrophotometric measurement of chlorophyll content revealed different influences at relatively low ferrofluid concentrations (10-30 µl/l) compared to higher concentrations (70-100 µl/l). The superimposed effect of the static magnetic field further shaped the stimulatory action of the ferrofluid on the LHC II system in young poppy plantlets.

  16. Coordinate Systems Integration for Craniofacial Database from Multimodal Devices

    Directory of Open Access Journals (Sweden)

    Deni Suwardhi

    2005-05-01

    Full Text Available This study presents a data registration method for craniofacial spatial data of different modalities. The data consist of three-dimensional (3D) vector and raster data models, stored in an object-relational database. The data capture devices are a laser scanner, CT (Computed Tomography) scan and CR (Close Range Photogrammetry). The objective of the registration is to transform the data from the various coordinate systems into a single 3D Cartesian coordinate system. The standard error of the registration obtained from the multimodal imaging devices using a 3D affine transformation is in the range of 1-2 mm. This study is a step towards storing the craniofacial spatial data in one reference system in the database.
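    A 3D affine registration of this kind can be estimated by least squares from corresponding points measured in the two coordinate systems. The sketch below is illustrative only (the study's actual estimation procedure is not given in the abstract); the RMS residual it computes is the quantity comparable to the 1-2 mm standard error quoted:

```python
import numpy as np

def fit_affine_3d(src, dst):
    """Least-squares affine transform mapping src points onto dst.
    src, dst: (N, 3) arrays of corresponding points, N >= 4."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])   # homogeneous coordinates
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)    # (4, 3) parameter matrix
    return X

def apply_affine(X, pts):
    """Apply the fitted (4, 3) affine parameters to (M, 3) points."""
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ X

def registration_error(X, src, dst):
    """RMS residual of the registration."""
    r = apply_affine(X, src) - np.asarray(dst, float)
    return float(np.sqrt((r ** 2).sum(axis=1).mean()))
```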

  17. LINGUISTIC DATABASE FOR AUTOMATIC GENERATION SYSTEM OF ENGLISH ADVERTISING TEXTS

    Directory of Open Access Journals (Sweden)

    N. A. Metlitskaya

    2017-01-01

    Full Text Available The article deals with the linguistic database for a system of automatic generation of English advertising texts on cosmetics and perfumery. The database for such a system includes two main blocks: an automatic dictionary (containing semantic and morphological information for each word) and semantic-syntactical formulas of the texts in a special formal language, SEMSINT. The database is built on the results of the analysis of 30 English advertising texts on cosmetics and perfumery. First, each word was given a unique code. For example, N stands for nouns, A for adjectives, V for verbs, etc. Then all the lexicon of the analyzed texts was distributed into different semantic categories. According to this semantic classification, each word was given a special semantic code. For example, the record N01 attributed to the word «lip» in the dictionary means that this word refers to nouns of the semantic category «part of a human’s body». The second block of the database includes the semantic-syntactical formulas of the analyzed advertising texts written in the special formal language SEMSINT. The author gives a brief description of this language, presenting its essence and structure. Also, an example of one formalized advertising text in SEMSINT is provided.

  18. Development of a Multidisciplinary and Telemedicine Focused System Database.

    Science.gov (United States)

    Paštěka, Richard; Forjan, Mathias; Sauermann, Stefan

    2017-01-01

    Tele-rehabilitation at home is one of the promising approaches to increasing rehabilitative success while decreasing the financial burden on the healthcare system. Novel and mostly mobile devices are already in use, and shall in future be used to a greater extent to allow high-quality at-home rehabilitation. The combination of exercises, assessments and available equipment is the basic objective of the presented database. The database has been structured to allow easy and fast access for the three main user groups: therapists, looking for exercise and equipment combinations; patients, rechecking their tasks for home exercises; and manufacturers, entering their equipment for specific use cases. The database has been evaluated in a proof-of-concept study and shows a high degree of applicability to the field of rehabilitative medicine. It currently contains 110 exercises/assessments and 111 equipment/systems. The foundations of the presented database are established in the rehabilitative field of application, but its functionality can and will be enhanced to be usable for a greater variety of medical fields and specializations.

  19. Establishment of database system for management of KAERI wastes

    International Nuclear Information System (INIS)

    Shon, J. S.; Kim, K. J.; Ahn, S. J.

    2004-07-01

    Radioactive wastes generated by KAERI have various types, nuclides and characteristics. Managing and controlling these kinds of radioactive wastes requires systematic record management, efficient searching and quick statistics. Information about the radioactive waste generated and stored by KAERI is the basic input for constructing a rapid information system for the national cooperative management of radioactive waste. In this study, the Radioactive Waste Management Integration System (RAWMIS) was developed. It is aimed at managing records of radioactive wastes, improving the efficiency of management, and supporting WACID (Waste Comprehensive Integration Database System), the national radioactive waste integrated safety management system of Korea. The major information of RAWMIS, derived from user requirements, covers generation, gathering, transfer, treatment and storage for solid waste, liquid waste, gas waste and waste related to spent fuel. RAWMIS is composed of a database, software (the interface between user and database), and software for a manager, and it was designed with a client/server structure. RAWMIS will be a useful tool for analyzing radioactive waste management and radiation safety management. The system is also developed to share information with associated companies and can be expected to support research and development on radioactive waste treatment technology

  20. The use of intelligent database systems in acute pancreatitis--a systematic review.

    Science.gov (United States)

    van den Heever, Marc; Mittal, Anubhav; Haydock, Matthew; Windsor, John

    2014-01-01

    Acute pancreatitis (AP) is a complex disease with multiple aetiological factors, wide-ranging severity, and multiple challenges to effective triage and management. Databases, data mining and machine learning algorithms (MLAs), including artificial neural networks (ANNs), may assist by storing and interpreting data from multiple sources, potentially improving clinical decision-making. The aims were to 1) identify database technologies used to store AP data, 2) collate and categorise variables stored in AP databases, 3) identify the MLA technologies, including ANNs, used to analyse AP data, and 4) identify clinical and non-clinical benefits of and obstacles to establishing a national or international AP database. A comprehensive systematic search of online reference databases was performed. The predetermined inclusion criteria were all papers discussing 1) databases, 2) data mining or 3) MLAs pertaining to AP, independently assessed by two reviewers with conflicts resolved by a third author. Forty-three papers were included. Three data mining technologies and five ANN methodologies were reported in the literature, and 187 collected variables were identified. ANNs increase the accuracy of severity prediction: one study showed ANNs had a sensitivity of 0.89 and specificity of 0.96 six hours after admission, compared with 0.80 and 0.85 respectively for APACHE II (cutoff score ≥8). Problems with databases were incomplete data, lack of clinical data, diagnostic reliability and missing clinical data. This is the first systematic review examining the use of databases, MLAs and ANNs in the management of AP. The clinical benefits these technologies have over current systems, and other advantages of adopting them, are identified. Copyright © 2013 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  1. State analysis requirements database for engineering complex embedded systems

    Science.gov (United States)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must translate the requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility of misinterpreting the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which captures system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.

  2. A portable database-driven control system for SPEAR

    International Nuclear Information System (INIS)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-01-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information are entered into the database as it is developed. Since application processes refer only to the database, and do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touch panels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system

  3. Development of knowledge base system linked to material database

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Tsuji, Hirokazu; Mashiko, Shinichi; Miyakawa, Shunichi; Fujita, Mitsutane; Kinugawa, Junichi; Iwata, Shuichi

    2002-01-01

    The distributed material database system named 'Data-Free-Way' has been developed by four organizations (the National Institute for Materials Science, the Japan Atomic Energy Research Institute, the Japan Nuclear Cycle Development Institute, and the Japan Science and Technology Corporation) under a cooperative agreement in order to share fresh and stimulating information as well as accumulated information for the development of advanced nuclear materials, for the design of structural components, etc. In order to create additional value in the system, a knowledge base system, expressing knowledge extracted from the material database, is planned to be developed for more effective utilization of Data-Free-Way. XML (eXtensible Markup Language) has been adopted as the description method for the retrieved results and their meaning. One knowledge note described with XML is stored as one unit of knowledge composing the knowledge base. Since the knowledge note is described with XML, the user can easily convert the display form of tables and graphs into the data format the user usually uses. This paper describes the current status of Data-Free-Way, the method of describing knowledge extracted from the material database with XML, and the distributed material knowledge base system. (author)

  4. Session II-D. Systems

    International Nuclear Information System (INIS)

    Hall, R.J.

    1981-01-01

    The objectives of the Systems Task in the NWTS Program include: development of program requirements, allocation of the requirements to subsystems or tasks, integration of the task activities towards meeting the overall requirements, and assessment of progress towards achievement of the program mission. The Systems Task also includes a number of ancillary activities which are necessary to the program but which do not logically fall into other work-breakdown structure elements. Activities in the Systems Task, which in the NWTS Program are conducted at both the program and project levels, are generally grouped under the heading systems engineering and include identification of requirements, development of a baseline, integration of the system, baseline control, functional analyses, trade-off studies, and system analyses. The following papers in this session address some of the activities and progress that was achieved in the Systems Task in FY 1981: (1) waste isolation system alternatives: a cost comparison; (2) BWIP technical integration and control; (3) BWIP performance evaluation process: a criteria based method; (4) impacts of waste age; (5) systems studies of subseabed disposal; and (6) systems studies of waste transportation

  5. The use of database management systems in particle physics

    CERN Document Server

    Stevens, P H; Read, B J; Rittenberg, Alan

    1979-01-01

    Examines data-handling needs and problems in particle physics and looks at three very different efforts at resolving these problems, by the Particle Data Group (PDG), the CERN-HERA Group in Geneva, and groups cooperating with ZAED in Germany. The ZAED effort does not use a database management system (DBMS), the CERN-HERA Group uses an existing, limited-capability DBMS, and PDG uses the Berkeley Database Management System (BDMS), which PDG itself designed and implemented with scientific data-handling needs in mind. The range of problems each group tried to resolve was influenced by whether or not a DBMS was available and by what capabilities it had. Only PDG has been able to systematically address all the problems. The authors discuss the BDMS-centered system PDG is now building in some detail. (12 refs).

  6. System of end-to-end symmetric database encryption

    Science.gov (United States)

    Galushka, V. V.; Aydinyan, A. R.; Tsvetkova, O. L.; Fathi, V. A.; Fathi, D. V.

    2018-05-01

    The article is devoted to the pressing problem of protecting databases from information leakage performed while bypassing access-control mechanisms. To solve this problem, it is proposed to use end-to-end data encryption, implemented at the end nodes of the interacting information-system components using a symmetric cryptographic algorithm. For this purpose, a key management method designed for use in a multi-user system has been developed and described, based on a distributed key representation model in which part of the key is stored in the database and the other part is obtained by transforming the user's password. The key is calculated immediately before the cryptographic transformations and is not stored in memory after these transformations are complete. Algorithms for registering and authorizing a user, as well as for changing his password, are described, and methods for calculating the parts of the key during these operations are provided.
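    The split-key idea described above can be sketched as follows: one share of the key material lives in the database, the other is derived from the user's password (here with PBKDF2, as one plausible choice), and the full key exists only transiently at the point of use. Names and parameters are illustrative, and the cipher step itself (e.g. AES over the column data) is omitted:

```python
import hashlib, os, secrets

def password_part(password, salt):
    """Key share derived from the user's password (never stored)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def assemble_key(stored_part, password, salt):
    """Combine the database-stored share with the password-derived share.
    The resulting key is used immediately and then discarded."""
    return bytes(a ^ b
                 for a, b in zip(stored_part, password_part(password, salt)))

salt = os.urandom(16)                  # stored alongside the user record
stored_part = secrets.token_bytes(32)  # the share kept in the database

key = assemble_key(stored_part, "correct horse", salt)
assert key == assemble_key(stored_part, "correct horse", salt)  # reproducible
assert key != assemble_key(stored_part, "wrong password", salt) # password-bound
```

    Changing the password then only requires re-deriving the password share and storing a new database share, without re-encrypting with a different data key.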

  7. Development of the plasma movie database system in JT-60

    International Nuclear Information System (INIS)

    Sueoka, Michiharu; Kawamata, Yoichi; Kurihara, Kenichi; Seki, Akiyuki

    2008-03-01

    A plasma movie is one of the most efficient ways to review what plasma discharge was conducted in an experiment. The JT-60 plasma movie is composed of a video camera picture of the plasma, a computer graphics (CG) picture, and a magnetic probe signal as the sound channel. In order to use the movie efficiently, we have developed a new system with the following functions: (a) to store plasma movies in the movie database system automatically, combined with the plasma shape CG and the sound, according to the discharge sequence; (b) to make plasma movies available (downloadable) for experiment data analyses on the Web site. This system aimed especially at minimizing development cost, and the real-time plasma shape visualization system (RVS) was developed without any operating system (OS) customized for real-time use. As a result, the system succeeded in working under Windows XP. This report deals with the technical details of the plasma movie database system and the real-time plasma shape visualization system. (author)

  8. Efficient Incremental Garbage Collection for Workstation/Server Database Systems

    OpenAIRE

    Amsaleg , Laurent; Gruber , Olivier; Franklin , Michael

    1994-01-01

    Projet RODIN; We describe an efficient server-based algorithm for garbage collecting object-oriented databases in a workstation/server environment. The algorithm is incremental and runs concurrently with client transactions; however, it does not hold any locks on data and does not require callbacks to clients. It is fault-tolerant, but performs very little logging. The algorithm has been designed to be integrated into existing OODB systems, and therefore it works with standard implementation ...

  9. Computerized database management system for breast cancer patients.

    Science.gov (United States)

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL (My Structured Query Language) was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in the system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that controls the MySQL database was developed. Case studies show the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian. The peak age for breast cancer incidence is 50 to 59 years. The results suggest that the chance of developing breast cancer is increased in older women and reduced by breastfeeding practice, and that weight status may affect breast cancer risk differently. Additional studies are needed to confirm these findings.
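    The grouped counts behind such case studies are straightforward aggregation queries. The sketch below uses Python's stdlib sqlite3 in place of MySQL, with an invented minimal schema (the study's actual tables and fields are not given in the abstract):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (age INTEGER, ethnicity TEXT)")
conn.executemany("INSERT INTO patients VALUES (?, ?)", [
    (52, "Malay"), (55, "Malay"), (53, "Malay"),
    (58, "Chinese"), (61, "Chinese"), (45, "Indian"),
])

# incidence counts per ethnic group
by_group = dict(conn.execute(
    "SELECT ethnicity, COUNT(*) FROM patients GROUP BY ethnicity"))

# counts per 10-year age bracket, to locate the peak decade (50-59 here)
by_decade = dict(conn.execute(
    "SELECT (age / 10) * 10, COUNT(*) FROM patients GROUP BY age / 10"))

assert by_group["Malay"] == 3
assert max(by_decade, key=by_decade.get) == 50
```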

  10. Thioaptamer Diagnostic System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — AM Biotechnologies (AM) in partnership with Sandia National Laboratories will develop a Thioaptamer Diagnostic System (TDS) in response to Topic X10.01 Reusable...

  11. Particle Systems and PDEs II

    CERN Document Server

    Soares, Ana

    2015-01-01

    This book focuses on mathematical problems concerning different applications in physics, engineering, chemistry and biology. It covers topics ranging from interacting particle systems to partial differential equations (PDEs), statistical mechanics and dynamical systems. The purpose of the second meeting on Particle Systems and PDEs was to bring together renowned researchers working actively in the respective fields, to discuss their topics of expertise and to present recent scientific results in both areas. Further, the meeting was intended to present the subject of interacting particle systems, its roots in and impacts on the field of physics, and its relation with PDEs to a vast and varied public, including young researchers. The book also includes the notes from two mini-courses presented at the conference, allowing readers who are less familiar with these areas of mathematics to more easily approach them. The contributions will be of interest to mathematicians, theoretical physicists and other researchers...

  12. SOUL System Maturation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Busek Co. Inc. proposes to advance the maturity of an innovative Spacecraft on Umbilical Line (SOUL) System suitable for a wide variety of applications of interest...

  13. Development of web database system for JAERI ERL-FEL

    International Nuclear Information System (INIS)

    Kikuzawa, Nobuhiro

    2005-01-01

    The accelerator control system for the JAERI ERL-FEL is a PC-based distributed control system. The accelerator status record is stored automatically through the control system to analyze its influence on the electron beam. To handle the large amount of stored data effectively, the required data must be easy to search and to visualize. For this reason, a web database (DB) system which can search for the required data and display it visually in a web browser was developed using open source software. With the introduction of this system, accelerator operators can monitor real-time information anytime, anywhere through a web browser. Development of the web DB system is described in this paper. (author)

  14. Development of web database system for JAERI ERL-FEL

    Energy Technology Data Exchange (ETDEWEB)

    Kikuzawa, Nobuhiro [Japan Atomic Energy Research Inst., Kansai Research Establishment, Advanced Photon Research Center, Tokai, Ibaraki (Japan)

    2005-06-01

    The accelerator control system for the JAERI ERL-FEL is a PC-based distributed control system. The accelerator status record is stored automatically through the control system to analyze its influence on the electron beam. To handle the large amount of stored data effectively, the required data must be easy to search and to visualize. For this reason, a web database (DB) system which can search for the required data and display it visually in a web browser was developed using open source software. With the introduction of this system, accelerator operators can monitor real-time information anytime, anywhere through a web browser. Development of the web DB system is described in this paper. (author)

  15. Structure health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be based on internet and database technology in order to manage large structures efficiently. The system is operated via the internet, connected to the sites of the structures. The monitoring system has several functions: self monitoring, self diagnosis, and self control. Self monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self diagnosis function repairs the abnormal condition of sensors, and self control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need to be replaced. As further work, a real application test will be performed to check for remaining inconveniences.
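    The self-monitoring (sensor fault detection) function described above might be sketched as follows; the sensor names, plausibility thresholds, and stuck-sensor rule are illustrative assumptions, not details from the paper:

```python
def detect_faulty_sensors(readings, lo=-50.0, hi=150.0, stuck_len=5):
    """Flag sensors whose values leave a plausible range or freeze at one value.

    `readings` maps sensor id -> list of recent samples. The thresholds are
    invented for the sketch.
    """
    faults = {}
    for sid, vals in readings.items():
        if any(v < lo or v > hi for v in vals):
            faults[sid] = "out of range"
        elif len(vals) >= stuck_len and len(set(vals[-stuck_len:])) == 1:
            faults[sid] = "stuck"
    return faults

faults = detect_faulty_sensors({
    "strain-01": [10.2, 10.4, 10.3, 10.5, 10.4],
    "strain-02": [9.8, 999.0, 10.1, 10.0, 10.2],   # spike beyond range
    "accel-07":  [3.3, 3.3, 3.3, 3.3, 3.3],        # frozen output
})
```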

  16. Structural health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok [Nondestructive Measurement Lab., KRISS, Daejeon (Korea, Republic of)

    2003-07-01

    A structural health monitoring system should be based on internet and database technology in order to manage large structures efficiently. The system is operated via the internet, connected to the sites of the structures. The monitoring system has several functions: self monitoring, self diagnosis, and self control. Self monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self diagnosis function repairs the abnormal condition of sensors, and self control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need to be replaced. As further work, a real application test will be performed to check for remaining inconveniences.

  17. Structure health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok [Smart Measurement Group, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2003-05-15

    A structural health monitoring system should be based on internet and database technology in order to manage large structures efficiently. The system is operated via the internet, connected to the sites of the structures. The monitoring system has several functions: self monitoring, self diagnosis, and self control. Self monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self diagnosis function repairs the abnormal condition of sensors, and self control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need to be replaced. As further work, a real application test will be performed to check for remaining inconveniences.

  18. Structural health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be based on internet and database technology in order to manage large structures efficiently. The system is operated via the internet, connected to the sites of the structures. The monitoring system has several functions: self monitoring, self diagnosis, and self control. Self monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self diagnosis function repairs the abnormal condition of sensors, and self control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need to be replaced. As further work, a real application test will be performed to check for remaining inconveniences.

  19. Genetic and bibliographic information - GenLibi | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Data acquisition method: articles related to genes were obtained from the bibliographic database (JDream II)... provided from the JST bibliographic information system (JDream II)

  20. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. The organization's departments can be located at different sites connected by an intranet environment. In such an environment, maintenance of database records becomes a complex task that needs to be resolved. In this paper an intranet application is designed an...

  1. How the choice of Operating System can affect databases on a Virtual Machine

    OpenAIRE

    Karlsson, Jan; Eriksson, Patrik

    2014-01-01

    As databases grow in size, the need to optimize databases is becoming a necessity. Choosing the right operating system to support your database becomes paramount to ensure that the database is fully utilized. Furthermore, with the virtualization of operating systems becoming more commonplace, we find ourselves with more choices than we have ever faced before. This paper demonstrates why the choice of operating system plays an integral part in deciding the right database for your system in a virt...

  2. A dedicated database system for handling multi-level data in systems biology.

    Science.gov (United States)

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management, thereby facilitating data integration, modeling and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment operated by the database system. Two applications were implemented to demonstrate the extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific to two sample cases: 1) detecting the pheromone pathway in protein interaction networks; and 2) finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a single database environment for systems biology research.

  3. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing dedicated data structures with a mature, standardized database system is a future development direction for accelerator control. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the feasibility of applying a database system to accelerators and lays the foundation for wide-scale application of a database system in the SSRF accelerator control system. (authors)

  4. A distributed database view of network tracking systems

    Science.gov (United States)

    Yosinski, Jason; Paffenroth, Randy

    2008-04-01

    In distributed tracking systems, multiple non-collocated trackers cooperate to fuse local sensor data into a global track picture. Generating this global track picture at a central location is fairly straightforward, but the single point of failure and excessive bandwidth requirements introduced by centralized processing motivate the development of decentralized methods. In many decentralized tracking systems, trackers communicate with their peers via a lossy, bandwidth-limited network in which dropped, delayed, and out of order packets are typical. Oftentimes the decentralized tracking problem is viewed as a local tracking problem with a networking twist; we believe this view can underestimate the network complexities to be overcome. Indeed, a subsequent 'oversight' layer is often introduced to detect and handle track inconsistencies arising from a lack of robustness to network conditions. We instead pose the decentralized tracking problem as a distributed database problem, enabling us to draw inspiration from the vast extant literature on distributed databases. Using the two-phase commit algorithm, a well known technique for resolving transactions across a lossy network, we describe several ways in which one may build a distributed multiple hypothesis tracking system from the ground up to be robust to typical network intricacies. We pay particular attention to the dissimilar challenges presented by network track initiation vs. maintenance and suggest a hybrid system that balances speed and robustness by utilizing two-phase commit for only track initiation transactions. Finally, we present simulation results contrasting the performance of such a system with that of more traditional decentralized tracking implementations.
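    The track-initiation transaction via two-phase commit can be sketched as below. The `Tracker` class and its voting flag are hypothetical stand-ins for real peers; a deployment over a lossy network would also need timeouts, retries, and durable logging:

```python
class Tracker:
    """A peer that votes on whether to commit a new track."""
    def __init__(self, name, will_vote_yes=True):
        self.name = name
        self.will_vote_yes = will_vote_yes
        self.tracks = set()
        self._pending = None

    def prepare(self, track_id):
        """Phase 1: stage the track and return this peer's vote."""
        self._pending = track_id
        return self.will_vote_yes

    def commit(self):
        """Phase 2: make the staged track durable."""
        if self._pending is not None:
            self.tracks.add(self._pending)
            self._pending = None

    def abort(self):
        """Phase 2 (failure path): discard the staged track."""
        self._pending = None

def initiate_track(peers, track_id):
    """Two-phase commit: the track appears on all peers or on none."""
    votes = [p.prepare(track_id) for p in peers]
    if all(votes):
        for p in peers:
            p.commit()
        return True
    for p in peers:
        p.abort()
    return False
```

    A unanimous vote commits the track everywhere; a single "no" (e.g. a peer that already holds a conflicting track) aborts it everywhere, which is what keeps the global track picture consistent.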

  5. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real time access to both temporally accurate and multimedia data...

  6. Expert system for quality control in bibliographic databases

    International Nuclear Information System (INIS)

    Todeschini, C.; Farrell, M.P.

    1989-01-01

    An Expert System is presented that can identify errors in the intellectual decisions made by indexers when categorizing documents into an a priori category scheme. The system requires the compilation of a Knowledge Base that incorporates, in statistical form, the decisions on the linking of indexing and categorization derived from a preceding period of the bibliographic database. New input entering the database is checked against the Knowledge Base, using the descriptor indexing assigned to each record, and the system computes a value for the match of each record with the particular category chosen by the indexer. This category match value is used as a criterion for identifying those documents that have been erroneously categorized. The system was tested on a large sample of almost 26,000 documents, representing all the literature falling into ten of the subject categories of the Energy Data Base during the five-year period 1980-1984. For valid comparisons among categories, the Knowledge Base must be constructed with an approximately equal number of unique descriptors for each subject category. The system identified those items with a high probability of having been erroneously categorized. These items, constituting up to 5% of the sample, were evaluated manually by subject specialists for correct categorization and then compared with the results of the Expert System. Of those pieces of literature deemed by the system to be erroneously categorized, about 75% did indeed belong to a different category. This percentage, however, depends on the level at which the threshold on the category match value is set. With a lower threshold value, the percentage can be raised to 90%, but this is accompanied by a lowering of the absolute number of wrongly categorized records caught by the system. The Expert System can be considered a first step toward a complete semiautomatic categorization system.
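    A minimal sketch of the category match value is given below. The scoring rule (average fraction of each descriptor's historical use falling within the chosen category) is one plausible reading of the abstract, not the paper's exact statistic, and the descriptors and categories are invented:

```python
from collections import defaultdict

def build_knowledge_base(training_records):
    """Tally descriptor/category co-occurrence from a past period of the database."""
    kb = defaultdict(lambda: defaultdict(int))
    for descriptors, category in training_records:
        for d in descriptors:
            kb[d][category] += 1
    return kb

def category_match(kb, descriptors, category):
    """Average fraction of each descriptor's past use falling in `category`."""
    scores = []
    for d in descriptors:
        total = sum(kb[d].values())
        if total:
            scores.append(kb[d][category] / total)
    return sum(scores) / len(scores) if scores else 0.0

kb = build_knowledge_base([
    (["reactor", "neutron flux"], "FISSION"),
    (["reactor", "coolant"], "FISSION"),
    (["plasma", "tokamak"], "FUSION"),
])
score = category_match(kb, ["plasma", "tokamak"], "FUSION")     # high match
suspect = category_match(kb, ["plasma", "tokamak"], "FISSION")  # low: flag for review
```

    Records whose score falls below a chosen threshold would be flagged as possibly miscategorized; as the abstract notes, lowering the threshold trades precision for recall.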

  7. Positrons in biomolecular systems. II

    International Nuclear Information System (INIS)

    Glass, J.C.; Graf, G.; Costabal, H.; Ewert, D.H.; English, L.

    1982-01-01

    Pickoff-annihilation parameters, as related to the free volume model, are shown to be indicators of structural fluctuations in membranes and membrane bound proteins. Nitrous oxide anesthetic induces lateral rigidity in a membrane, and an anesthetic mechanism is suggested. Conformational changes of (Na + ,K + )ATPase in natural membrane are observed with ATP and Mg-ion binding. New positron applications to active transport and photosynthetic systems are suggested. (Auth.)

  8. From the LHC Reference Database to the Powering Interlock System

    CERN Document Server

    Dehavay, C; Schmidt, R; Veyrunes, E; Zerlauth, M

    2003-01-01

    The protection of the magnet powering system for the Large Hadron Collider (LHC) currently being built at CERN is a major challenge due to the unprecedented complexity of the accelerator. The Powering Interlock System of the LHC will have to manage more than 1600 DC circuits for magnet powering, different in their structure, complexity and importance to the accelerator. For the coherent description of such complex system, a Reference Database as unique source of the parameters of the electrical circuits has been developed. The information, introduced via a generic circuit description language, is first used for installing the accelerator and making all electrical connections. The data is then used for tests and commissioning. During operation, the Powering Interlock System manages all critical functions. It consists of 36 PLC based controllers dis tributed around the machine and requires a flexible and transparent way of configuration, since each controller manages different numbers and types of electrical ci...

  9. Fossil-Fuel CO2 Emissions Database and Exploration System

    Science.gov (United States)

    Krassovski, M.; Boden, T.

    2012-04-01

    Fossil-Fuel CO2 Emissions Database and Exploration System Misha Krassovski and Tom Boden Carbon Dioxide Information Analysis Center Oak Ridge National Laboratory The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production each year at global, regional, and national spatial scales. These estimates are vital to climate change research given the strong evidence suggesting fossil-fuel emissions are responsible for unprecedented levels of carbon dioxide (CO2) in the atmosphere. The CDIAC fossil-fuel emissions time series are based largely on annual energy statistics published for all nations by the United Nations (UN). Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751, before the Industrial Revolution. From these core fossil-fuel CO2 emission time series, CDIAC has developed a number of additional data products to satisfy modeling needs and to address other questions aimed at improving our understanding of the global carbon cycle budget. For example, CDIAC also produces a time series of gridded fossil-fuel CO2 emission estimates and isotopic (e.g., C13) emissions estimates. The gridded data are generated using the methodology described in Andres et al. (2011) and provide monthly and annual estimates for 1751-2008 at 1° latitude by 1° longitude resolution. These gridded emission estimates are being used in the latest IPCC Scientific Assessment (AR4). Isotopic estimates are possible thanks to detailed information for individual nations regarding the carbon content of select fuels (e.g., the carbon signature of natural gas from Russia). CDIAC has recently developed a relational database to house these baseline emissions estimates and associated derived products and a web-based interface to help users worldwide query these data holdings. Users can identify, explore and download desired CDIAC

  10. METODE RESET PASSWORD LEVEL ROOT PADA RELATIONAL DATABASE MANAGEMENT SYSTEM (RDBMS MySQL)

    Directory of Open Access Journals (Sweden)

    Taqwa Hariguna

    2011-08-01

    Full Text Available A database is essential for storing data; with a database, an organization gains advantages in several respects, such as access speed and reduced paper use. However, once a database is in place, it is not uncommon for the database administrator to forget the password in use, which complicates database maintenance. This study aims to explore how to reset the root-level password in the MySQL relational database management system.

  11. The TJ-II data acquisition system: an overview

    International Nuclear Information System (INIS)

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.

    1999-01-01

    The data acquisition system for the TJ-II fusion machine has been developed to coordinate actions among the several experimental systems devoted to data capture and storage: instrumentation mainframes (VXI, VME, CAMAC), diagnostic control systems and a host-centralized database. Connectivity between these elements is achieved through local area networks, which ensure both good connections and system growth capability. Three hundred VXI based digitizer channels have been developed for TJ-II diagnostics. They are completely software programmable and provide analog signal conditioning. In addition, some of them supply a programmable DSP for real time signal processing. Data will be stored in a central server using a special compression technique that allows compaction rates of over 80%. Specific application software has been developed to provide a user interface for digitizer programming, signal visualization and data processing during TJ-II discharges. The software is an event based application that can be launched remotely from any X terminal. An authentication mechanism restricts access to authorised users only. (orig.)
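    Compaction rates over 80%, as quoted above, are easy to reproduce in spirit for smooth digitizer signals with delta encoding followed by a general-purpose deflater. This is not the TJ-II technique itself, and the waveform below is synthetic:

```python
import struct
import zlib

def compress_signal(samples):
    """Delta-encode a digitizer signal (32-bit ints), then deflate it."""
    deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
    raw = struct.pack(f"<{len(deltas)}i", *deltas)
    return zlib.compress(raw, 9)

def compaction_rate(samples):
    """Fraction of space saved relative to the packed raw samples."""
    packed = struct.pack(f"<{len(samples)}i", *samples)
    return 1.0 - len(compress_signal(samples)) / len(packed)

# A slowly varying waveform: deltas are mostly 0s and 1s, which deflate well.
signal = [1000 + i // 4 for i in range(4096)]
rate = compaction_rate(signal)
```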

  12. Upgrading of TARN-II vacuum system

    International Nuclear Information System (INIS)

    Chida, K.; Arakaki, Y.; Yoshizawa, M.; Tomizawa, M.; Tanabe, T.; Katayama, I.

    1994-01-01

    The numbers of ion pumps and titanium getter pumps in TARN-II have been nearly doubled. The pumping speed per unit length is now improved up to 2/3 times that of TARN-I. An average vacuum pressure of the order of 10^-11 Torr has been achieved at beam time. The performance of the system after the upgrading is reported. (author)

  13. Formal system of communication and understanding. II

    Energy Technology Data Exchange (ETDEWEB)

    Zsuzsanna, M

    1982-01-01

    For pt. I see ibid., no. 5, p. 252-8 (1982). In this article G. Pask's (1975) formal theory of dialogues and talk is summarized. Part II describes the talk-environment and modelling. Conscious systems and machine intelligence are mainly dealt with. Finally, a few cases in which Pask's theory has been implemented are examined. 7 references.

  14. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T W; Sutton, M

    2011-09-19

    , meaning that they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).
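    The saturation-index calculation mentioned above reduces to a one-line formula, SI = log10(IAP/Ksp). In the sketch below the ion activities are hypothetical, while the gypsum log K of about -4.58 is a commonly tabulated 25 °C value; real values would come from the thermodynamic database itself:

```python
import math

def saturation_index(ion_activity_product, ksp):
    """SI = log10(IAP/Ksp): 0 at equilibrium, >0 supersaturated, <0 undersaturated."""
    return math.log10(ion_activity_product / ksp)

# Illustrative activities only; a real code speciates the solution first.
ksp_gypsum = 10 ** -4.58            # CaSO4·2H2O, a commonly tabulated 25 C value
iap = (10 ** -2.5) * (10 ** -2.3)   # a(Ca2+) * a(SO4^2-), hypothetical water sample
si = saturation_index(iap, ksp_gypsum)   # negative: undersaturated with gypsum
```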

  15. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    International Nuclear Information System (INIS)

    Wolery, T.W.; Sutton, M.

    2011-01-01

    they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).

  16. AtlasT4SS: a curated database for type IV secretion systems.

    Science.gov (United States)

    Souza, Rangel C; del Rosario Quispe Saji, Guadalupe; Costa, Maiana O C; Netto, Diogo S; Lima, Nicholas C B; Klein, Cecília C; Vasconcelos, Ana Tereza R; Nicolás, Marisa F

    2012-08-09

    The type IV secretion system (T4SS) can be classified as a large family of macromolecule transporter systems, divided into three recognized sub-families, according to the well-known functions. The major sub-family is the conjugation system, which allows transfer of genetic material, such as a nucleoprotein, via cell contact among bacteria. Also, the conjugation system can transfer genetic material from bacteria to eukaryotic cells; such is the case with the T-DNA transfer of Agrobacterium tumefaciens to host plant cells. The system of effector protein transport constitutes the second sub-family, and the third one corresponds to the DNA uptake/release system. Genome analyses have revealed numerous T4SS in Bacteria and Archaea. The purpose of this work was to organize, classify, and integrate the T4SS data into a single database, called AtlasT4SS - the first public database devoted exclusively to this prokaryotic secretion system. The AtlasT4SS is a manual curated database that describes a large number of proteins related to the type IV secretion system reported so far in Gram-negative and Gram-positive bacteria, as well as in Archaea. The database was created using the RDBMS MySQL and the Catalyst Framework based in the Perl programming language and using the Model-View-Controller (MVC) design pattern for Web. The current version holds a comprehensive collection of 1,617 T4SS proteins from 58 Bacteria (49 Gram-negative and 9 Gram-Positive), one Archaea and 11 plasmids. By applying the bi-directional best hit (BBH) relationship in pairwise genome comparison, it was possible to obtain a core set of 134 clusters of orthologous genes encoding T4SS proteins. In our database we present one way of classifying orthologous groups of T4SSs in a hierarchical classification scheme with three levels. 
The first level comprises four classes that are based on the organization of genetic determinants, shared homologies, and evolutionary relationships: (i) F-T4SS, (ii) P-T4SS, (iii
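    The bi-directional best hit (BBH) relationship used above to build the orthologous clusters can be sketched as follows; the gene names and similarity scores are invented for illustration:

```python
def best_hits(scores):
    """For each query gene, keep the subject with the highest alignment score."""
    best = {}
    for (query, subject), s in scores.items():
        if query not in best or s > best[query][1]:
            best[query] = (subject, s)
    return {q: subj for q, (subj, _) in best.items()}

def bidirectional_best_hits(a_vs_b, b_vs_a):
    """Orthologue pairs in which each gene is the other's best hit (BBH)."""
    ab, ba = best_hits(a_vs_b), best_hits(b_vs_a)
    return {(a, b) for a, b in ab.items() if ba.get(b) == a}

# Toy pairwise similarity scores between genes of two genomes.
a_vs_b = {("virB4_A", "virB4_B"): 95.0, ("virB4_A", "virD4_B"): 30.0,
          ("virD4_A", "virD4_B"): 88.0}
b_vs_a = {("virB4_B", "virB4_A"): 94.0, ("virD4_B", "virD4_A"): 87.0,
          ("virD4_B", "virB4_A"): 29.0}
orthologs = bidirectional_best_hits(a_vs_b, b_vs_a)
```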

  17. The plasma movie database system for JT-60

    International Nuclear Information System (INIS)

    Sueoka, Michiharu; Kawamata, Yoichi; Kurihara, Kenichi; Seki, Akiyuki

    2007-01-01

    A real-time plasma movie, combined with computer graphics (CG) of the plasma shape, is one of the most effective ways to understand what kind of discharge was made in an experiment. To make the movie easy to use in data analysis, we have developed the plasma movie database system (PMDS), which automatically records plasma movies according to the JT-60 discharge sequence and transfers the movie files on request from the web site. Each file is compressed to about 8 MB/shot, small enough to be transferred within a few seconds over the local area network (LAN). In this report, we describe the developed system from the technical point of view, and discuss a future plan based on advancing video technology.

  18. Application of modern reliability database techniques to military system data

    International Nuclear Information System (INIS)

    Bunea, Cornel; Mazzuchi, Thomas A.; Sarkani, Shahram; Chang, H.-C.

    2008-01-01

    This paper focuses on analysis techniques for modern reliability databases, with an application to military system data. The analysis of the military system database consists of the following steps: clean the data and perform operations on it in order to obtain good estimators; present simple plots of the data; and analyze the data with statistical and probabilistic methods. Each step is dealt with separately and the main results are presented. Competing risks theory is advocated as the mathematical support for the analysis. The general framework of competing risks theory is presented together with simple independent and dependent competing risks models available in the literature. These models are used to identify the reliability and maintenance indicators required by the operating personnel. Model selection is based on graphical interpretation of the plotted data.
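    As a toy illustration of the competing-risks indicators mentioned above, the sketch below estimates a cumulative incidence function from (time, cause) records. It ignores censoring, which a real analysis of a military maintenance database would have to handle, and the records are invented:

```python
def cumulative_incidence(failures, cause, t):
    """Fraction of units removed for `cause` by time t (no censoring assumed).

    `failures` is one (time, cause) pair per unit, e.g. corrective 'failure'
    vs 'preventive' removal treated as competing risks.
    """
    n = len(failures)
    return sum(1 for time, c in failures if c == cause and time <= t) / n

records = [(120, "failure"), (200, "preventive"), (340, "failure"),
           (400, "preventive"), (510, "failure")]
cif_failure = cumulative_incidence(records, "failure", 350)   # 2 of 5 units by t=350
```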

  19. Terverticillate penicillia studied by direct electrospray mass spectrometric profiling of crude extracts II. Database and identification

    DEFF Research Database (Denmark)

    Smedsgaard, Jørn

    1997-01-01

    A mass spectral database was built using standard instrument software from 678 electrospray mass spectra (mass profiles) from crude fungal extracts of terverticillate taxa within the genus Penicillium. The match factors calculated from searching all the mass profiles stored in the database were...
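    A match factor between mass profiles, as used in the database search above, is commonly computed as a scaled cosine similarity. The m/z values and intensities below are invented, and the 0-1000 scaling is an assumed library-search convention rather than a detail from the paper:

```python
import math

def match_factor(profile_a, profile_b):
    """Cosine similarity between two mass profiles (m/z -> intensity),
    scaled to 0-1000 in the style of common library-search match factors."""
    masses = set(profile_a) | set(profile_b)
    dot = sum(profile_a.get(m, 0.0) * profile_b.get(m, 0.0) for m in masses)
    na = math.sqrt(sum(v * v for v in profile_a.values()))
    nb = math.sqrt(sum(v * v for v in profile_b.values()))
    return 1000.0 * dot / (na * nb) if na and nb else 0.0

# Hypothetical intensities for two crude-extract profiles.
unknown   = {255.1: 100.0, 337.2: 80.0, 411.3: 15.0}
reference = {255.1: 95.0, 337.2: 85.0, 411.3: 10.0}
mf = match_factor(unknown, reference)   # close to 1000 for near-identical spectra
```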

  20. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    Science.gov (United States)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm, which consists of 4 steps. In conclusion, we present an analysis of password encryption in the Oracle Database.

  1. The metacompiler system META-II/X

    International Nuclear Information System (INIS)

    Kneis, W.

    1975-03-01

    The objective of this work is to demonstrate, through the properties of the META-II/X system and a concrete compiler implementation for IML, that a simple and universally applicable symbol processor makes it very easy to develop precompilers for problem-oriented languages. The main feature is that no manually coded auxiliary routines had to be added for this particular implementation. The translation of IML is defined exclusively by the compiler description written in the META language. As a whole, META-II/X proves to be a system which is relatively convenient to handle in automating the translation of explicit languages. The decisive point is the choice of an assembler language as the target language, which allows references that are not completely resolved to be passed down to the assembler level. The implementation allows the whole system, including the internal compiler representations, to be ported without complications. (orig.) [de

  2. Computerized nuclear material database management system for power reactors

    International Nuclear Information System (INIS)

    Cheng Binghao; Zhu Rongbao; Liu Daming; Cao Bin; Liu Ling; Tan Yajun; Jiang Jincai

    1994-01-01

    The software packages for nuclear material database management for power reactors are described. The database structure, data flow and management model of the database are analysed. Also described are the main functions and characteristics of the software packages, which have been successfully installed and are used at both the Daya Bay Nuclear Power Plant and the Qinshan Nuclear Power Plant for the purpose of handling the nuclear material database automatically

  3. SCORPION II persistent surveillance system update

    Science.gov (United States)

    Coster, Michael; Chambers, Jon

    2010-04-01

    This paper updates the improvements and benefits demonstrated in the next generation Northrop Grumman SCORPION II family of persistent surveillance and target recognition systems produced by the Xetron Campus in Cincinnati, Ohio. SCORPION II reduces the size, weight, and cost of all SCORPION components in a flexible, field programmable system that is easier to conceal and enables integration of over fifty different Unattended Ground Sensor (UGS) and camera types from a variety of manufacturers, with a modular approach to supporting multiple Line of Sight (LOS) and Beyond Line of Sight (BLOS) communications interfaces. Since 1998 Northrop Grumman has been integrating best in class sensors with its proven universal modular Gateway to provide encrypted data exfiltration to Common Operational Picture (COP) systems and remote sensor command and control. In addition to feeding COP systems, SCORPION and SCORPION II data can be directly processed using a common sensor status graphical user interface (GUI) that allows for viewing and analysis of images and sensor data from up to seven hundred SCORPION system gateways on single or multiple displays. This GUI enables a large amount of sensor data and imagery to be used for actionable intelligence as well as remote sensor command and control by a minimum number of analysts.

  4. The D0 run II trigger system

    International Nuclear Information System (INIS)

    Schwienhorst, Reinhard; Michigan State U.

    2004-01-01

    The D0 detector at the Fermilab Tevatron was upgraded for Run II. This upgrade included improvements to the trigger system in order to handle the increased Tevatron luminosity and higher bunch crossing rates compared to Run I. The D0 Run II trigger is a highly flexible system that selects events to be written to tape from an initial interaction rate of about 2.5 MHz. This is done in a three-tier pipelined, buffered system. The first tier (level 1) processes fast detector pick-off signals in a hardware/firmware-based system to reduce the event rate to about 1.5 kHz. The second tier (level 2) uses information from level 1 and forms simple physics objects to reduce the rate to about 850 Hz. The third tier (level 3) uses full detector readout and event reconstruction on a filter farm to reduce the rate to 20-30 Hz. The D0 trigger menu contains a wide variety of triggers. While the emphasis is on triggering on generic lepton and jet final states, there are also trigger terms for specific final-state signatures. In this document we describe the D0 trigger system as it was implemented and is currently operating in Run II
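    The quoted tier rates imply the rejection factor each level must deliver. A quick arithmetic sketch, using the approximate rates from the abstract (the 20-30 Hz output is taken as 25 Hz for illustration):

    ```python
    # Per-tier rejection factors for a pipelined trigger: input interaction
    # rate, then the output rate of level 1, level 2, and level 3.
    rates_hz = [2.5e6, 1.5e3, 850.0, 25.0]

    # Factor by which each level reduces the rate handed to it.
    rejection = [rates_hz[i] / rates_hz[i + 1] for i in range(len(rates_hz) - 1)]
    total = rates_hz[0] / rates_hz[-1]

    print(rejection)  # per-tier reduction: L1 does almost all of the work
    print(total)      # overall reduction from interaction rate to tape
    ```

    The numbers show the usual trigger-design shape: the hardware tier absorbs a rejection of over a thousand, while the software tiers apply comparatively gentle, more selective cuts.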

  5. Information retrieval system of nuclear power plant database (PPD) user's guide

    International Nuclear Information System (INIS)

    Izumi, Fumio; Horikami, Kunihiko; Kobayashi, Kensuke.

    1990-12-01

    A nuclear power plant database (PPD) and its retrieval system have been developed. The database contains a large amount of safety design data for nuclear power plants operating and planned in Japan. The information stored in the database can be retrieved at high speed, whenever it is needed, by use of the retrieval system. This report is a user's manual for the system to access the database utilizing a display unit of the JAERI computer network system. (author)

  6. Development of the Plasma Movie Database System for JT-60

    International Nuclear Information System (INIS)

    Sueoka, M.; Kawamata, Y.; Kurihara, K.

    2006-01-01

    A plasma movie is generally expected to be one of the most efficient ways to see what kind of plasma discharge was conducted in an experiment. With this motivation we have developed and operated a real-time plasma shape visualization system for over ten years. The current plasma movie is composed of (1) a video camera picture looking at the plasma, (2) a computer graphic (CG) picture, and (3) a magnetic probe signal as the sound channel. (1) The plasma video is provided by a standard video camera mounted at a viewing port of the vacuum vessel, looking at a plasma poloidal cross section. (2) The plasma shape CG movie is provided by the plasma shape visualization system, which calculates the plasma shape in real time using the CCS method [Kurihara, K., Fusion Engineering and Design, 51-52, 1049 (2000)]. Thirty snapshot pictures per second are drawn by the graphics processor. (3) The sound channel is the raw signal of a magnetic pick-up coil. It reflects the plasma rotation frequency; a smooth, high-pitched tone generally indicates a good plasma. In order to use such movies efficiently, we have developed a new system with the following functions: (a) to store plasma movies in the movie database system automatically, combined with the plasma shape CG and the sound, according to the discharge sequence; and (b) to make plasma movies available (downloadable) for experimental data analysis at the Web site. The plasma movie capture system receives a timing signal according to the JT-60 discharge sequence and automatically starts recording a plasma movie. The movie is stored in MPEG2 format on a RAID disk. In addition, the capture system simultaneously transfers a movie file in MPEG4 format to the plasma movie Web server. In response to a user's request, the Web server transfers the stored movie data immediately. The movie data amount is about 50 Mbyte/shot (65 s discharge) for the MPEG2 format and about 7 Mbyte

  7. Developing Visualization Support System for Teaching/Learning Database Normalization

    Science.gov (United States)

    Folorunso, Olusegun; Akinwale, AdioTaofeek

    2010-01-01

    Purpose: In tertiary institution, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in database normalization process. Design/methodology/approach: The model-view-controller architecture…

  8. Database Systems and Oracle: Experiences and Lessons Learned

    Science.gov (United States)

    Dunn, Deborah

    2005-01-01

    In a tight job market, IT professionals with database experience are likely to be in great demand. Companies need database personnel who can help improve access to and security of data. The events of September 11 have increased business' awareness of the need for database security, backup, and recovery procedures. It is our responsibility to…

  9. Multiple brain atlas database and atlas-based neuroimaging system.

    Science.gov (United States)

    Nowinski, W L; Fang, A; Nguyen, B T; Raphel, J K; Jagannathan, L; Raghavan, R; Bryan, R N; Miller, G A

    1997-01-01

    For the purpose of developing multiple, complementary, fully labeled electronic brain atlases and an atlas-based neuroimaging system for analysis, quantification, and real-time manipulation of cerebral structures in two and three dimensions, we have digitized, enhanced, segmented, and labeled the following print brain atlases: Co-Planar Stereotaxic Atlas of the Human Brain by Talairach and Tournoux, Atlas for Stereotaxy of the Human Brain by Schaltenbrand and Wahren, Referentially Oriented Cerebral MRI Anatomy by Talairach and Tournoux, and Atlas of the Cerebral Sulci by Ono, Kubik, and Abernathey. Three-dimensional extensions of these atlases have been developed as well. All two- and three-dimensional atlases are mutually preregistered and may be interactively registered with an actual patient's data. An atlas-based neuroimaging system has been developed that provides support for reformatting, registration, visualization, navigation, image processing, and quantification of clinical data. The anatomical index contains about 1,000 structures and over 400 sulcal patterns. Several new applications of the brain atlas database also have been developed, supported by various technologies such as virtual reality, the Internet, and electronic publishing. Fusion of information from multiple atlases assists the user in comprehensively understanding brain structures and identifying and quantifying anatomical regions in clinical data. The multiple brain atlas database and atlas-based neuroimaging system have substantial potential impact in stereotactic neurosurgery and radiotherapy by assisting in visualization and real-time manipulation in three dimensions of anatomical structures, in quantitative neuroradiology by allowing interactive analysis of clinical data, in three-dimensional neuroeducation, and in brain function studies.

  10. NSLS-II booster timing system

    International Nuclear Information System (INIS)

    Cheblakov, P.; Karnaev, S.; De Long, J.

    2012-01-01

    The NSLS-II light source includes the main storage ring with beam lines and an injection part consisting of a 200 MeV linac, a full-energy 3 GeV booster synchrotron and two transport lines. The booster timing system is part of the NSLS-II timing system, which uses hardware from Micro-Research Finland: an Event Generator (EVG) and Event Receivers (EVRs). The booster timing is based on events coming from the NSLS-II EVG: 'Pre-Injection', 'Injection', 'Pre-Extraction', 'Extraction'. These events are referenced to the selected RF bucket of the storage ring and correspond to the first RF bucket of the booster. EVRs provide triggers for both the injection and the extraction pulse devices. EVRs also provide the timing of booster cycle operation, generation of events for cycle-to-cycle updates of pulsed and ramping parameters, and synchronization of the booster beam instrumentation devices. This paper describes the final design of the booster timing system. The timing system functional diagrams and block diagram are presented. (authors)

  11. LCLS-II CRYOMODULE TRANSPORT SYSTEM TESTING

    Energy Technology Data Exchange (ETDEWEB)

    Huque, Naeem [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Daly, Edward F. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); McGee, Michael W. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2018-04-01

    The Cryomodules (CM) for the Linear Coherent Light Source II (LCLS-II) will be shipped to SLAC (Menlo Park, California) from JLab (Newport News, Virginia) and FNAL (Batavia, Illinois). A transportation system has been designed and built to safely transport the CMs over the road. It uses an array of helical isolator springs to attenuate shocks on the CM to below 1.5 g in all directions. The system rides on trailers equipped with Air-Ride suspension, which attenuates vibration loads. The prototype LCLS-II CM (pCM) was driven 750 miles to test the transport system; shock loggers recorded the shock attenuation on the pCM and vacuum gauges were used to detect any compromises in beamline vacuum. Alignment measurements were taken before and after the trip to check whether cavity positions had shifted beyond the ±0.2 mm spec. Passband frequencies and cavity gradients were measured at 2 K at the Cryomodule Test Facility (CMTF) at JLab to identify any degradation of CM performance after transportation. The transport system was found to have safely carried the CM and is cleared to begin shipments from JLab and FNAL to SLAC.

  12. PEP-II RF feedback system simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tighe, R [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1996-08-01

    A model containing the fundamental impedance of the PEP-II cavity along with the longitudinal beam dynamics and RF feedback system components is in use. It is prepared in a format allowing time-domain as well as frequency-domain analysis and full graphics capability. Matlab and Simulink are control system design and analysis programs (widely available) with many built-in tools. The model allows the use of compiled C-code modules for compute intensive portions. We desire to represent as nearly as possible the components of the feedback system including all delays, sample rates and applicable nonlinearities. (author)

  13. Representing clinical communication knowledge through database management system integration.

    Science.gov (United States)

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

    Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels form a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough methodology literature review to identify strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study the clinical communication activities among Intensive Care Unit (ICU) clinical teams, with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument, and we introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository.

  14. SAPE Database Building for a Security System Test Bed

    International Nuclear Information System (INIS)

    Jo, Kwangho; Kim, Woojin

    2013-01-01

    Physical protection to prevent radiological sabotage and the unauthorized removal of nuclear material is one of the important activities. The physical protection system (PPS) of a nuclear facility needs an effectiveness analysis. The effectiveness of a PPS is evaluated as the probability of blocking an attack along the most vulnerable path. Systematic Analysis of Physical protection Effectiveness (SAPE) is a computer code developed for this vulnerable-path analysis. SAPE performs its analysis based on experimental data that can be obtained through a test bed. In order to utilize the SAPE code, we conducted field tests on several sensors and acquired data. This paper describes how the database (DB) was established

  15. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    Science.gov (United States)

    2015-09-15

    arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming ... a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  16. Wide-area-distributed storage system for a multimedia database

    Science.gov (United States)

    Ueno, Masahiro; Kinoshita, Shigechika; Kuriki, Makato; Murata, Setsuko; Iwatsu, Shigetaro

    1998-12-01

    We have developed a wide-area-distributed storage system for multimedia databases, which minimizes the possibility of simultaneous failure of multiple disks in the event of a major disaster. It features a RAID system whose member disks are spatially distributed over a wide area. Each node has a device which includes the controller of the RAID and the controller of the member disks controlled by other nodes. The devices at a node are connected to a computer using fiber-optic cables and communicate using fiber-channel technology. Any computer at a node can use the multiple devices connected by optical fibers as a single 'virtual disk.' The advantage of this system structure is that the devices and fiber-optic cables are shared by the computers. In this report, we first describe our proposed system and a prototype used for testing. We then discuss its performance, i.e., how read and write throughputs are affected by data-access delay, the RAID level, and queuing.
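    The distributed-RAID idea above can be illustrated with a toy XOR-parity sketch in pure Python. All names are hypothetical, and a real system stripes at the device level rather than in application code; the point is only that one lost "member disk" is recoverable from the survivors plus parity:

    ```python
    def xor_blocks(blocks):
        """Bytewise XOR of equal-length byte blocks."""
        out = bytearray(len(blocks[0]))
        for b in blocks:
            for i, byte in enumerate(b):
                out[i] ^= byte
        return bytes(out)

    def stripe(data, n_data_nodes, block=4):
        """Pad data, cut it into fixed-size blocks, deal the blocks
        round-robin over the data nodes, and keep one XOR parity block
        per stripe (RAID-4/5-like, with a dedicated parity list here
        for brevity)."""
        padded_len = -(-len(data) // block) * block
        data = data.ljust(padded_len, b'\x00')
        chunks = [data[i:i + block] for i in range(0, len(data), block)]
        nodes = [[] for _ in range(n_data_nodes)]
        parity = []
        for s in range(0, len(chunks), n_data_nodes):
            group = chunks[s:s + n_data_nodes]
            group += [b'\x00' * block] * (n_data_nodes - len(group))
            for i, c in enumerate(group):
                nodes[i].append(c)
            parity.append(xor_blocks(group))
        return nodes, parity

    def rebuild(nodes, parity, lost):
        """Recover every block of one lost data node by XOR-ing the
        surviving nodes' blocks with the stripe parity."""
        return [xor_blocks([nodes[i][s] for i in range(len(nodes)) if i != lost]
                           + [parity[s]])
                for s in range(len(parity))]

    nodes, parity = stripe(b'multimedia-object-data', 3)
    assert rebuild(nodes, parity, 1) == nodes[1]  # lost member disk recovered
    ```

    Spreading the member disks of one such array over distant sites is what lets the system survive the loss of an entire node in a disaster.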

  17. Manganese and the II system in photosynthesis

    International Nuclear Information System (INIS)

    Joyard, Jacques

    1971-01-01

    The evolution during greening of some components of system II of photosynthesis has been followed in plastids extracted from Zea mays grown in the dark. Manganese was studied by means of neutron activation; electron spin resonance (ESR) was also used in some experiments. Oxygen evolution of isolated plastids was followed by polarography (with a membrane electrode). The evolution of the manganese/carotenoids ratio can be divided into three phases. During the first hour of greening, the increase shows an uptake of Mn into the plastids; then, while the carotenoid content of the plastids shows no change, Mn is released into the medium; finally, carotenoid synthesis parallels Mn fixation in the plastids, the ratio being constant after 24 hours of greening. From various measurements on chloroplastic manganese, it is shown that the development of system II can be divided into two main phases: during the first one (the first day in the light) the components are not yet bound together, but their interactions become stronger and stronger. Then, during the last period of development, the organisation of system II is complete and the transformations of the plastids parallel the rise in their activity. (author) [fr

  18. Generic Database Cost Models for Hierarchical Memory Systems

    OpenAIRE

    Manegold, Stefan; Boncz, Peter; Kersten, Martin

    2002-01-01

    Accurate prediction of operator execution time is a prerequisite for database query optimization. Although extensively studied for conventional disk-based DBMSs, cost modeling in main-memory DBMSs is still an open issue. Recent database research has demonstrated that memory access is more and more becoming a significant---if not the major---cost component of database operations. If used properly, fast but small cache memories---usually organized in a cascading hierarchy between CPU ...
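    The hierarchical-memory cost idea can be sketched as a one-function model: each access pays the latency of every level it reaches, and misses fall through to the next level. The hit rates and latencies below are hypothetical; the paper's actual models are far more detailed, deriving miss counts from access patterns and cache parameters:

    ```python
    def access_cost(accesses, hit_rates, latencies_ns):
        """Total memory-access cost through a cache hierarchy.

        hit_rates[i]    -- fraction of the accesses reaching level i
                           that are satisfied there (last level: 1.0);
        latencies_ns[i] -- access latency of level i in nanoseconds.
        Misses at level i fall through to level i + 1.
        """
        cost, reaching = 0.0, float(accesses)
        for hit, lat in zip(hit_rates, latencies_ns):
            cost += reaching * lat       # every access reaching this level pays it
            reaching *= (1.0 - hit)      # only the misses continue downward
        return cost

    # Hypothetical 3-level hierarchy: L1 cache, L2 cache, DRAM.
    print(access_cost(1_000_000, [0.9, 0.8, 1.0], [1, 10, 100]))  # ~4e6 ns here
    ```

    Even this toy version shows why cache behavior dominates: halving the L2 hit rate in the example roughly doubles the DRAM term, which is already half of the total cost.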

  19. Embedded computer systems for control applications in EBR-II

    International Nuclear Information System (INIS)

    Carlson, R.B.; Start, S.E.

    1993-01-01

    The purpose of this paper is to describe the embedded computer systems approach taken at Experimental Breeder Reactor II (EBR-II) for non-safety-related systems. The hardware and software structures for typical embedded systems are presented. The embedded systems development process is described. Three examples are given which illustrate typical embedded computer applications in EBR-II.

  20. Real-time data exchange system in CSRe and RIBLL II

    International Nuclear Information System (INIS)

    Liu Wufeng; Xu Yang; Li Guohua; Guo Yuhui; Chinese Academy of Sciences, Beijing; Qiao Weimin; Jing Lan; Wang Yongping; Gou Shizhe

    2008-01-01

    The design of the real-time data exchange system for HIRFL-CSR's CSRe and RIBLL II is introduced, including its software and hardware design. The system also controls power devices. In the system, data flows from a Web browser to the central Oracle database. It then arrives at an SQLite database in the ARM module, by way of the front server's Oracle database, through the COM module. Finally, the ARM module transmits the data to the DSP module's memory to control the power devices when the corresponding event occurs. At the same time, the ADC can acquire each device's current or voltage value, which is saved in the central Oracle database. Practice shows that this system has high reliability and stability. (authors)
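    The central-to-embedded replication step described above can be sketched with the standard-library sqlite3 module. The central store is stood in for by a plain list, and the table, column, and device names are hypothetical:

    ```python
    import sqlite3

    # Rows as they might arrive from the central database:
    # (record id, power-device name, setpoint value).
    central_rows = [(1, 'PS01', 120.5), (2, 'PS02', 98.0)]

    # Local SQLite database on the embedded module (in-memory for the demo).
    local = sqlite3.connect(':memory:')
    local.execute('CREATE TABLE setpoints (id INTEGER PRIMARY KEY,'
                  ' device TEXT, value REAL)')
    # INSERT OR REPLACE makes repeated syncs idempotent per record id.
    local.executemany('INSERT OR REPLACE INTO setpoints VALUES (?, ?, ?)',
                      central_rows)
    local.commit()

    print(local.execute('SELECT device, value FROM setpoints'
                        ' ORDER BY id').fetchall())
    # [('PS01', 120.5), ('PS02', 98.0)]
    ```

    Keeping a local mirror like this is what lets the embedded module feed the DSP deterministically, without a network round trip on the real-time path.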

  1. System maintenance test plan for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    The TWRS [Tank Waste Remediation System] Controlled Baseline Database (TCBD), formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the maintenance testing approach for software testing of the TCBD system once SCRs/PRs are implemented.

  2. NED-IIS: An Intelligent Information System for Forest Ecosystem Management

    Science.gov (United States)

    W.D. Potter; S. Somasekar; R. Kommineni; H.M. Rauscher

    1999-01-01

    We view an Intelligent Information System (IIS) as composed of a unified knowledge base, database, and model base. The model base includes, for example, decision support models, forecasting models, and visualization models. In addition, we feel that the model base should include domain-specific problem-solving modules as well as decision support models. This, then,...

  3. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for integrated management of Liquid Metal Reactor design technology development using Web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results from phase II of the Liquid Metal Reactor design technology development within the mid- and long-term nuclear R and D program. IOC is a linkage control system between sub-projects, used to share and integrate the research results for KALIMER. The 3D CAD Database is a schematic design overview for KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents was developed to manage collected data and several documents since project accomplishment. This report describes the features of the hardware and software and the database design methodology for KALIMER

  4. Assessment of Integrated Information System (IIS) in organization ...

    African Journals Online (AJOL)

    Assessment of Integrated Information System (IIS) in organization. ... to enable the Information System (IS) managers, as well as top management to understand the ... since organisational and strategic aspects in IIS should also be considered.

  5. Generic Database Cost Models for Hierarchical Memory Systems

    NARCIS (Netherlands)

    S. Manegold (Stefan); P.A. Boncz (Peter); M.L. Kersten (Martin)

    2002-01-01

    textabstractAccurate prediction of operator execution time is a prerequisite for database query optimization. Although extensively studied for conventional disk-based DBMSs, cost modeling in main-memory DBMSs is still an open issue. Recent database research has demonstrated that memory access is

  6. Checkpointing and Recovery in Distributed and Database Systems

    Science.gov (United States)

    Wu, Jiang

    2011-01-01

    A transaction-consistent global checkpoint of a database records a state of the database which reflects the effect of only completed transactions and not the results of any partially executed transactions. This thesis establishes the necessary and sufficient conditions for a checkpoint of a data item (or the checkpoints of a set of data items) to…
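    The defining property quoted above can be written down as a small predicate. The representation of writes and transactions below is an illustrative assumption, not the thesis's formalism: a checkpoint is transaction-consistent iff every write it records belongs to a committed transaction, and each such transaction appears either completely or not at all.

    ```python
    def transaction_consistent(checkpointed_writes, committed):
        """Check transaction consistency of a checkpoint.

        checkpointed_writes: set of (txn_id, data_item) pairs recorded
                             in the checkpoint;
        committed: dict mapping txn_id -> set of data items that the
                   committed transaction wrote.
        """
        included = {}
        for txn, item in checkpointed_writes:
            if txn not in committed:
                return False              # an uncommitted transaction leaked in
            included.setdefault(txn, set()).add(item)
        # Every included transaction must be included in full.
        return all(items == committed[txn] for txn, items in included.items())

    committed = {'T1': {'x', 'y'}, 'T2': {'z'}}
    print(transaction_consistent({('T1', 'x'), ('T1', 'y')}, committed))  # True
    print(transaction_consistent({('T1', 'x')}, committed))               # False
    ```

    The second call fails precisely because it captures a partially executed transaction, the situation the thesis's conditions are designed to rule out.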

  7. Active in-database processing to support ambient assisted living systems.

    Science.gov (United States)

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
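    The active-database pattern described here (triggers reacting to events, with processing kept inside the DBMS) can be sketched with SQLite from the Python standard library. The schema, the pressure threshold, and the bed-exit rule are hypothetical simplifications of the paper's services, and the paper's DBMS is not specified:

    ```python
    import sqlite3

    db = sqlite3.connect(':memory:')
    db.executescript("""
    CREATE TABLE bed_sensor (ts INTEGER, pressure REAL);
    CREATE TABLE events (ts INTEGER, kind TEXT);
    -- The trigger is the 'active' part: a derived event is recorded
    -- inside the database, with no data leaving the DBMS.
    CREATE TRIGGER bed_exit AFTER INSERT ON bed_sensor
    WHEN NEW.pressure < 0.1
    BEGIN
        INSERT INTO events VALUES (NEW.ts, 'bed-exit');
    END;
    """)

    # Simulated sensor stream: load on the bed, then load removed.
    db.executemany('INSERT INTO bed_sensor VALUES (?, ?)',
                   [(100, 55.0), (160, 54.0), (230, 0.0)])

    print(db.execute('SELECT * FROM events').fetchall())  # [(230, 'bed-exit')]
    ```

    Because the rule fires on insert, sensitive raw readings never need to be shipped to an external process, which is exactly the performance and privacy argument the paper makes.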

  8. Active In-Database Processing to Support Ambient Assisted Living Systems

    Directory of Open Access Journals (Sweden)

    Wagner O. de Morais

    2014-08-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.

  9. The relational database system of KM3NeT

    Science.gov (United States)

    Albert, Arnauld; Bozza, Cristiano

    2016-04-01

    The KM3NeT Collaboration is building a new generation of neutrino telescopes in the Mediterranean Sea. For these telescopes, a relational database is designed and implemented for several purposes, such as the centralised management of accounts, the storage of all documentation about components and the status of the detector and information about slow control and calibration data. It also contains information useful during the construction and the data acquisition phases. Highlights in the database schema, storage and management are discussed along with design choices that have impact on performances. In most cases, the database is not accessed directly by applications, but via a custom designed Web application server.

  10. Analysis/design of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, I.; Lyu, W. S.

    2001-01-01

    Constructing a database from the data produced in tensile experiments can increase the utilization of the test results. We can also easily retrieve basic data from the database when preparing a new experiment, and produce higher-quality results by comparison with previous data. To construct the database, the analysis and design must be carried out in detail first; after that, we can meet customers' various requirements with the best quality. In this thesis, the analysis and design were performed to develop a database for tensile extension properties

  11. The Nuclear Science References (NSR) database and Web Retrieval System

    International Nuclear Information System (INIS)

    Pritychenko, B.; Betak, E.; Kellett, M.A.; Singh, B.; Totans, J.

    2011-01-01

    The Nuclear Science References (NSR) database together with its associated Web interface is the world's only comprehensive source of easily accessible low- and intermediate-energy nuclear physics bibliographic information for more than 200,000 articles since the beginning of nuclear science. The weekly updated NSR database provides essential support for nuclear data evaluation, compilation and research activities. The principles of the database and Web application development and maintenance are described. Examples of nuclear structure, reaction and decay applications are specifically included. The complete NSR database is freely available at the websites of the National Nuclear Data Center (http://www.nndc.bnl.gov/nsr) and the International Atomic Energy Agency (http://www-nds.iaea.org/nsr).

  12. Computer system for International Reactor Pressure Vessel Materials Database support

    International Nuclear Information System (INIS)

    Arutyunjan, R.; Kabalevsky, S.; Kiselev, V.; Serov, A.

    1997-01-01

    This report presents a description of the computer tools developed at the IAEA to support the International Reactor Pressure Vessel Materials Database. Work was focused on search, retrieval, analysis, presentation and export of raw, qualified and processed materials data. The developed software has the following main functions: it provides tools for querying and searching any type of data in the database; the capability to update existing information in the database; the capability to present and print selected data; the possibility to export, on a yearly basis, the run-time IRPVMDB with raw, qualified and processed materials data to Database members; and the capability to export any selected sets of raw, qualified and processed materials data

  13. 16th East-European Conference on Advances in Databases and Information Systems (ADBIS 2012)

    CERN Document Server

    Härder, Theo; Wrembel, Robert; Advances in Databases and Information Systems

    2013-01-01

    This volume is the second of the 16th East-European Conference on Advances in Databases and Information Systems (ADBIS 2012), held on September 18-21, 2012, in Poznań, Poland; the first was published in the LNCS series. This volume includes 27 research contributions, selected out of 90. The contributions cover a wide spectrum of topics in the database and information systems field, including: database foundations and theory, data modeling and database design, business process modeling, query optimization in relational and object databases, materialized view selection algorithms, index data structures, distributed systems, system and data integration, semi-structured data and databases, semantic data management, information retrieval, data mining techniques, data stream processing, trust and reputation in the Internet, and social networks. Thus, the content of this volume covers the research areas from fundamentals of databases, through still hot topic research problems (e.g., data mining, XML ...

  14. Database Capture of Natural Language Echocardiographic Reports: A Unified Medical Language System Approach

    OpenAIRE

    Canfield, K.; Bray, B.; Huff, S.; Warner, H.

    1989-01-01

    We describe a prototype system for semi-automatic database capture of free-text echocardiography reports. The system is very simple and uses a Unified Medical Language System compatible architecture. We use this system and a large body of texts to create a patient database and develop a comprehensive hierarchical dictionary for echocardiography.

  15. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    Science.gov (United States)

    Freeman, Carla; And Others

    In order to understand how the database software or online database functioned in the overall curricula, the use of database management systems (DBMS) was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  16. The SDH mutation database: an online resource for succinate dehydrogenase sequence variants involved in pheochromocytoma, paraganglioma and mitochondrial complex II deficiency

    Directory of Open Access Journals (Sweden)

    Devilee Peter

    2005-11-01

    Full Text Available Abstract Background The SDHA, SDHB, SDHC and SDHD genes encode the subunits of succinate dehydrogenase (succinate:ubiquinone oxidoreductase), a component of both the Krebs cycle and the mitochondrial respiratory chain. SDHA, a flavoprotein, and SDHB, an iron-sulfur protein, together constitute the catalytic domain, while SDHC and SDHD encode membrane anchors that allow the complex to participate in the respiratory chain as complex II. Germline mutations of SDHD and SDHB are a major cause of the hereditary forms of the tumors paraganglioma and pheochromocytoma. The largest subunit, SDHA, is mutated in patients with Leigh syndrome and late-onset optic atrophy, but has not as yet been identified as a factor in hereditary cancer. Description The SDH mutation database is based on the recently described Leiden Open (source) Variation Database (LOVD) system. The variants currently described in the database were extracted from the published literature and in some cases annotated to conform to current mutation nomenclature. Researchers can also directly submit new sequence variants online. Since the identification of SDHD, SDHC, and SDHB as classic tumor suppressor genes in 2000 and 2001, studies from research groups around the world have identified a total of 120 variants. Here we introduce all reported paraganglioma and pheochromocytoma related sequence variations in these genes, in addition to all reported mutations of SDHA. The database is now accessible online. Conclusion The SDH mutation database offers a valuable tool and resource for clinicians involved in the treatment of patients with paraganglioma-pheochromocytoma, clinical geneticists needing an overview of current knowledge, and geneticists and other researchers needing a solid foundation for further exploration of both these tumor syndromes and SDHA-related phenotypes.

  17. Development of the software for the component reliability database system of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Hoon; Kim, Seung Hwan; Choi, Sun Young [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    A study was performed to develop a system for the component reliability database, which consists of a database to store the reliability data and software to analyze it. This system is a part of KIND (Korea Information System for Nuclear Reliability Database). An MS-SQL database is used to store the component population data, component maintenance history, and the results of reliability analysis. Two software tools were developed for the component reliability system. One is KIND-InfoView, for data storing, retrieving and searching. The other is KIND-CompRel, for the statistical analysis of component reliability. 4 refs., 13 figs., 7 tabs. (Author)

  18. Design of multi-tiered database application based on CORBA component in SDUV-FEL system

    International Nuclear Information System (INIS)

    Sun Xiaoying; Shen Liren; Dai Zhimin

    2004-01-01

    The drawbacks of the usual two-tiered database architecture were analyzed, and the Shanghai Deep Ultraviolet Free Electron Laser database system under development was discussed. A project for realizing a multi-tiered database architecture based on a common object request broker architecture (CORBA) component and a middleware model constructed in C++ was presented. A magnet database is given to exhibit the design of the CORBA component. (authors)

  19. Functional integration of automated system databases by means of artificial intelligence

    Science.gov (United States)

    Dubovoi, Volodymyr M.; Nikitenko, Olena D.; Kalimoldayev, Maksat; Kotyra, Andrzej; Gromaszek, Konrad; Iskakova, Aigul

    2017-08-01

    The paper presents approaches for the functional integration of automated system databases by means of artificial intelligence. The peculiarities of using such databases in systems with a fuzzy implementation of functions were analyzed. Requirements for the normalization of such databases were defined. The question of data equivalence under conditions of uncertainty and collisions in the presence of functional integration of the databases is considered, and a model to reveal their possible occurrence is devised. The paper also presents an evaluation method for the normalization of integrated databases.

  20. Performance assessment of EMR systems based on post-relational database.

    Science.gov (United States)

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all system users to access data, with fast response times, anywhere and at any time. Performance tests of the databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between the post-relational database Caché and the relational database Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was performed on the EMR system Izanami, which is based on the same database, Caché, and operates efficiently at the Miyazaki University Hospital in Japan. The results showed that the post-relational database Caché works faster than the relational database Oracle and performed excellently in the real-time EMR system.
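    A response-time comparison of the kind described in record 20 can be sketched with a small timing harness. This is only an illustration of the measurement, not the actual test: SQLite stands in for Caché and Oracle (neither of which can be assumed here), and the table, query, and row counts are invented.

    ```python
    import sqlite3
    import time

    def time_query(conn, sql, params=(), runs=100):
        """Return the average wall-clock time (seconds) of a query over many runs."""
        start = time.perf_counter()
        for _ in range(runs):
            conn.execute(sql, params).fetchall()
        return (time.perf_counter() - start) / runs

    # Hypothetical EMR-style table; a real test would target the two DBMSs under study.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE record (patient_id INTEGER, note TEXT)")
    conn.executemany("INSERT INTO record VALUES (?, ?)",
                     [(i % 500, "note %d" % i) for i in range(10000)])
    conn.execute("CREATE INDEX idx_patient ON record(patient_id)")

    avg = time_query(conn, "SELECT note FROM record WHERE patient_id = ?", (42,))
    print("average response time: %.6f s" % avg)
    ```

    Running the same harness against both databases with identical schemas and data is what makes the averages comparable.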

  1. Soil Properties Database of Spanish Soils Volume II.- Asturias, Cantabria and Pais Vasco

    International Nuclear Information System (INIS)

    Trueba, C; Millan, R.; Schmid, T.; Roquero, C.; Magister, M.

    1998-01-01

    Soil vulnerability determines the sensitivity of a soil following accidental radioactive contamination with Cs-137 and Sr-90. The Departamento de Impacto Ambiental de la Energia of CIEMAT is carrying out an assessment of the radiological vulnerability of the different Spanish soils found on the Iberian Peninsula. This requires knowledge of the soil properties of the various existing soil types. To achieve this aim, a bibliographical compilation of soil profiles has been made to characterize the different soil types and create a database of their properties. Depending on the year of publication and the type of documentary source, the information compiled from the available bibliography is very heterogeneous. Therefore, an important effort has been made to normalize and process the information prior to its incorporation into the database. This volume presents the criteria applied to normalize and process the data, as well as the soil properties of the various soil types belonging to the Comunidades Autonomas de Asturias, Cantabria and Pais Vasco. (Author) 34 refs

  2. Diffusivity database (DDB) system for major rocks (Version of 2006/specification and CD-ROM)

    International Nuclear Information System (INIS)

    Tochigi, Yoshikatsu; Sasamoto, Hirosi; Shibata, Masahiro; Sato, Haruo; Yui, Mikazu

    2006-03-01

    Development of the database system was started to manage the generally used diffusivity data. The database system has been constructed from datasheets of the effective diffusion coefficients of nuclides in the rock matrix, for application to the 'H12: Project to Establish the Scientific and Technical Basis for HLW Disposal in Japan'. This document describes the examination and expansion of the datasheet structure, the construction of the database system, and the conversion of all data existing on the datasheets. As the first step of the development of the database, the database system and its data will continue to be updated and the interface will be revised to improve availability. The developed database system is provided on the attached CD-ROM in Microsoft Access file format. (author)

  3. Current status of system development to provide databases of nuclides migration

    International Nuclear Information System (INIS)

    Sasamoto, Hiroshi; Yoshida, Yasushi; Isogai, Takeshi; Suyama, Tadahiro; Shibata, Masahiro; Yui, Mikazu; Jintoku, Takashi

    2005-01-01

    JNC has developed databases of nuclides migration for the safety assessment of high-level radioactive waste (HLW) repositories, and they were used in the second progress report to present the technical reliability of the HLW geological disposal system in Japan. The technical level and applicability of the databases have been highly evaluated, even overseas. To provide the databases broadly over the world and to promote their use, we have performed the following: 1) development of tools to convert the database format from the geochemical code PHREEQE to PHREEQC, GWB and EQ3/6, and 2) set-up of a web site (http://migrationdb.jnc.go.jp) which enables the public to access the databases. As a result, the number of database users has significantly increased. Additionally, a number of useful comments from the users can be applied to the modification and/or update of the databases. (author)

  4. The relational clinical database: a possible solution to the star wars in registry systems.

    Science.gov (United States)

    Michels, D K; Zamieroski, M

    1990-12-01

    In summary, having data from other service areas available in a relational clinical database could resolve many of the problems existing in today's registry systems. Uniting sophisticated information systems into a centralized database system could definitely be a corporate asset in managing the bottom line.

  5. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Science.gov (United States)

    2013-01-11

    ... [Docket No. APHIS-2012-0041] Notification of Deletion of a System of Records; Automated Trust Funds Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security...

  6. The CDF-II silicon tracking system

    Energy Technology Data Exchange (ETDEWEB)

    F. Palmonari et al.

    2002-01-18

    The CDFII silicon tracking system, SVX, for Run II of the Fermilab Tevatron has up to 8 cylindrical layers with average radii spanning from ~(1.5 to 28.7) cm and lengths ranging from ~(90 to 200) cm, for a total active area of ~6 m^2 and ~7.2 x 10^5 readout channels. SVX will improve the CDFII acceptance and efficiency for both B and high-Pt physics dependent upon b-tagging. Along with the description of the SVX, we report some alignment survey data from the SVX assembly phase and the current status of the alignment as it results from the offline data analysis. The problems encountered are also reviewed.

  7. Armada: a reference model for an evolving database system

    NARCIS (Netherlands)

    F.E. Groffen (Fabian); M.L. Kersten (Martin); S. Manegold (Stefan)

    2006-01-01

    The current database deployment palette ranges from networked sensor-based devices to large data/compute Grids. Both extremes present common challenges for distributed DBMS technology. The local storage per device/node/site is severely limited compared to the total data volume being

  8. Generic database cost models for hierarchical memory systems

    NARCIS (Netherlands)

    S. Manegold (Stefan); P.A. Boncz (Peter); M.L. Kersten (Martin)

    2002-01-01

    Accurate prediction of operator execution time is a prerequisite for database query optimization. Although extensively studied for conventional disk-based DBMSs, cost modeling in main-memory DBMSs is still an open issue. Recent database research has demonstrated that memory access is more

  9. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    Science.gov (United States)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based system platform for a smart city system. The resulting database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes that can be shared by the various applications to be built. The method used in this study is to choose an appropriate database logical structure (pattern of data) and to build the relational database model (database design), then to test the resulting design with some prototype apps and analyze system performance with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. This Android-based app is built on a dynamic client-server model where data are extracted from an external MySQL database, so if the data in the database change, the data in the Android applications also change. The app assists users in searching for Yogyakarta (smart city) related information, especially on culture, government, hotels, and transportation.
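    The central property of an integration database — a change committed by one application becoming visible to all others — can be sketched with two connections sharing one database file. SQLite stands in for MySQL here, and the table and row are invented for illustration.

    ```python
    import os
    import sqlite3
    import tempfile

    # One shared database file; two connections stand in for two client apps.
    path = os.path.join(tempfile.mkdtemp(), "smartcity.db")
    app_a = sqlite3.connect(path)
    app_b = sqlite3.connect(path)

    # Application A writes to the shared schema and commits.
    app_a.execute("CREATE TABLE hotel (name TEXT, district TEXT)")
    app_a.execute("INSERT INTO hotel VALUES ('Hotel Malioboro', 'Gedongtengen')")
    app_a.commit()  # the change becomes visible to all applications at commit time

    # Application B sees the committed data with no extra integration layer.
    rows = app_b.execute("SELECT name FROM hotel").fetchall()
    print(rows)  # → [('Hotel Malioboro',)]
    ```

    The same pattern scales to a server-hosted DBMS: each mobile app opens its own connection to the shared schema, and commits are the synchronization points.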

  10. A searching and reporting system for relational databases using a graph-based metadata representation.

    Science.gov (United States)

    Hewitt, Robin; Gobbi, Alberto; Lee, Man-Ling

    2005-01-01

    Relational databases are the current standard for storing and retrieving data in the pharmaceutical and biotech industries. However, retrieving data from a relational database requires specialized knowledge of the database schema and of the SQL query language. At Anadys, we have developed an easy-to-use system for searching and reporting data in a relational database to support our drug discovery project teams. This system is fast and flexible and allows users to access all data without having to write SQL queries. This paper presents the hierarchical, graph-based metadata representation and SQL-construction methods that, together, are the basis of this system's capabilities.
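    The join-path idea behind such a system can be sketched as follows: represent tables as graph nodes and foreign-key links as edges, find a path between the tables a user's fields touch, and emit the corresponding SQL. All table and column names below are invented for illustration; the abstract does not describe the real Anadys schema.

    ```python
    from collections import deque

    # Hypothetical schema graph: table -> {neighbor: (fk_column, pk_column)}
    SCHEMA = {
        "compound":     {"assay_result": ("id", "compound_id")},
        "assay_result": {"compound": ("compound_id", "id"),
                         "assay": ("assay_id", "id")},
        "assay":        {"assay_result": ("id", "assay_id")},
    }

    def join_path(start, goal):
        """Breadth-first search for a chain of joinable tables."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in SCHEMA[path[-1]]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    def build_sql(select_cols, start, goal):
        """Construct a SELECT with JOINs along the discovered path."""
        path = join_path(start, goal)
        sql = "SELECT %s FROM %s" % (", ".join(select_cols), path[0])
        for a, b in zip(path, path[1:]):
            fk, pk = SCHEMA[a][b]
            sql += " JOIN %s ON %s.%s = %s.%s" % (b, a, fk, b, pk)
        return sql

    print(build_sql(["compound.name", "assay.name"], "compound", "assay"))
    ```

    Because the join path is derived from the metadata graph, users can pick fields from any tables without knowing the schema or writing SQL themselves.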

  11. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Science.gov (United States)

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable. PMID:22399959
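    The core of a population spatialization model of this kind — redistributing a census total onto grid cells in proportion to weights derived from land use — can be sketched as follows. The weight values are invented for illustration and are not those of the actual PSM.

    ```python
    def spatialize(total_population, cells, weights):
        """Distribute a census total over grid cells proportionally to
        land-use-derived weights (e.g. built-up > cropland > forest)."""
        cell_weights = [weights[lulc] for lulc in cells]
        total_w = sum(cell_weights)
        return [total_population * w / total_w for w in cell_weights]

    # Hypothetical relative weights per 1 km by 1 km LULC class.
    WEIGHTS = {"urban": 10.0, "cropland": 3.0, "forest": 0.5, "water": 0.0}

    cells = ["urban", "urban", "cropland", "forest", "water"]
    grid = spatialize(23_500, cells, WEIGHTS)
    print([round(p) for p in grid])  # → [10000, 10000, 3000, 500, 0]
    ```

    The redistribution is mass-preserving by construction: the gridded values always sum back to the census total, which is what makes validation against finer census units meaningful.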

  12. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-02-01

    Full Text Available The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.

  13. CRITICAL ASSESSMENT OF AUDITING CONTRIBUTIONS TO EFFECTIVE AND EFFICIENT SECURITY IN DATABASE SYSTEMS

    OpenAIRE

    Olumuyiwa O. Matthew; Carl Dudley

    2015-01-01

    Database auditing has become a very crucial aspect of security as organisations increase their adoption of database management systems (DBMS) as a major asset that keeps, maintains and monitors sensitive information. Database auditing is the group of activities involved in observing a set of stored data in order to be aware of the actions of users. The work presented here outlines the main auditing techniques and methods. Some architecture-based auditing systems were also consider...

  14. A dedicated database system for handling multi-level data in systems biology

    OpenAIRE

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Background Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging...

  15. CRAVE: a database, middleware and visualization system for phenotype ontologies.

    Science.gov (United States)

    Gkoutos, Georgios V; Green, Eain C J; Greenaway, Simon; Blake, Andrew; Mallon, Ann-Marie; Hancock, John M

    2005-04-01

    A major challenge in modern biology is to link genome sequence information to organismal function. In many organisms this is being done by characterizing phenotypes resulting from mutations. Efficiently expressing phenotypic information requires combinatorial use of ontologies. However, tools are not currently available to visualize combinations of ontologies. Here we describe CRAVE (Concept Relation Assay Value Explorer), a package allowing storage, active updating and visualization of multiple ontologies. CRAVE is a web-accessible JAVA application that accesses an underlying MySQL database of ontologies via a JAVA persistent middleware layer (Chameleon). This maps the database tables into discrete JAVA classes and creates memory-resident, interlinked objects corresponding to the ontology data. These JAVA objects are accessed via calls through the middleware's application programming interface. CRAVE allows simultaneous display and linking of multiple ontologies and searching using Boolean and advanced searches.
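    The middleware pattern described — mapping database rows into memory-resident, interlinked objects — can be sketched in miniature. This is a generic stand-in for what a layer like Chameleon does, not its actual API; the table name, columns, and sample terms are invented, and SQLite substitutes for MySQL.

    ```python
    import sqlite3

    class Term:
        """Memory-resident object corresponding to one ontology-term row."""
        def __init__(self, term_id, name, parent_id):
            self.term_id, self.name, self.parent_id = term_id, name, parent_id
            self.children = []

    def load_ontology(conn):
        """Map the 'term' table into interlinked Term objects."""
        terms = {}
        for term_id, name, parent_id in conn.execute(
                "SELECT term_id, name, parent_id FROM term"):
            terms[term_id] = Term(term_id, name, parent_id)
        for t in terms.values():  # link each term to its parent's children list
            if t.parent_id is not None:
                terms[t.parent_id].children.append(t)
        return terms

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE term (term_id INTEGER, name TEXT, parent_id INTEGER)")
    conn.executemany("INSERT INTO term VALUES (?, ?, ?)",
                     [(1, "phenotype", None), (2, "behavioral", 1), (3, "anxiety", 2)])
    terms = load_ontology(conn)
    print([c.name for c in terms[1].children])  # → ['behavioral']
    ```

    Once the objects are resident, clients traverse `children` links directly instead of issuing further SQL, which is what makes simultaneous display of several ontologies cheap.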

  16. 18th East European Conference on Advances in Databases and Information Systems and Associated Satellite Events

    CERN Document Server

    Ivanovic, Mirjana; Kon-Popovska, Margita; Manolopoulos, Yannis; Palpanas, Themis; Trajcevski, Goce; Vakali, Athena

    2015-01-01

    This volume contains the papers of 3 workshops and the doctoral consortium, which were organized in the framework of the 18th East-European Conference on Advances in Databases and Information Systems (ADBIS’2014). The 3rd International Workshop on GPUs in Databases (GID’2014) is devoted to subjects related to the utilization of Graphics Processing Units in database environments. The use of GPUs in databases has not yet received enough attention from the database community. The intention of the GID workshop is to popularize GPUs and to provide a forum for discussion of the GID research ideas and their potential to achieve high speedups in many database applications. The 3rd International Workshop on Ontologies Meet Advanced Information Systems (OAIS’2014) has a twofold objective: to present new and challenging issues in the contribution of ontologies to designing high-quality information systems, and new research and technological developments which use ontologie...

  17. An information integration system for structured documents, Web, and databases

    OpenAIRE

    Morishima, Atsuyuki

    1998-01-01

    Rapid advance in computer network technology has changed the style of computer utilization. Distributed computing resources over world-wide computer networks are available from our local computers. They include powerful computers and a variety of information sources. This change is raising more advanced requirements. Integration of distributed information sources is one of such requirements. In addition to conventional databases, structured documents have been widely used, and have increasing...

  18. EPAUS9R - An Energy Systems Database for use with the Market Allocation (MARKAL) Model

    Science.gov (United States)

    EPA’s MARKAL energy system databases estimate future-year technology dispersals and associated emissions. These databases are valuable tools for exploring a variety of future scenarios for the U.S. energy-production systems that can impact climate change c

  19. The PEP-II abort kicker system

    International Nuclear Information System (INIS)

    Lamare, J. de; Donaldson, A.; Kulikov, A.; Lipari, J.

    1997-07-01

    The PEP-II project has two storage rings. The HER (High Energy Ring) has up to 1.48 A of electron beam at 9 GeV, and the LER (Low Energy Ring) has up to 2.14 A of positron beam at 3.1 GeV. To protect the HER and LER beam lines in the event of a ring component failure, each ring has an abort kicker system which directs the beam into a dump when a failure is detected. Due to the high current of the beams, the beam kick is tapered from 100% to 80% in 7.33 µs (the beam transit time around the ring). This taper distributes the energy evenly across the window which separates the ring from the beam dump, such that the window is not damaged. The abort kicker trigger is synchronized with the ion-clearing gap of the beam, allowing the kicker field to rise from 0-80% in 370 ns. This report discusses the design of the system controls, interlocks, power supplies, and modulator

  20. The Belle II SVD data readout system

    Energy Technology Data Exchange (ETDEWEB)

    Thalmeier, R., E-mail: Richard.Thalmeier@oeaw.ac.at [Institute of High Energy Physics, Austrian Academy of Sciences, 1050 Vienna (Austria); Adamczyk, K. [H. Niewodniczanski Institute of Nuclear Physics, Krakow 31-342 (Poland); Aihara, H. [Department of Physics, University of Tokyo, Tokyo 113-0033 (Japan); Angelini, C. [Dipartimento di Fisica, Universita’ di Pisa, I-56127 Pisa (Italy); INFN Sezione di Pisa, I-56127 Pisa (Italy); Aziz, T.; Babu, V. [Tata Institute of Fundamental Research, Mumbai 400005 (India); Bacher, S. [H. Niewodniczanski Institute of Nuclear Physics, Krakow 31-342 (Poland); Bahinipati, S. [Indian Institute of Technology Bhubaneswar, Satya Nagar (India); Barberio, E.; Baroncelli, Ti.; Baroncelli, To. [School of Physics, University of Melbourne, Melbourne, Victoria 3010 (Australia); Basith, A.K. [Indian Institute of Technology Madras, Chennai 600036 (India); Batignani, G. [Dipartimento di Fisica, Universita’ di Pisa, I-56127 Pisa (Italy); INFN Sezione di Pisa, I-56127 Pisa (Italy); Bauer, A. [Institute of High Energy Physics, Austrian Academy of Sciences, 1050 Vienna (Austria); Behera, P.K. [Indian Institute of Technology Madras, Chennai 600036 (India); Bergauer, T. [Institute of High Energy Physics, Austrian Academy of Sciences, 1050 Vienna (Austria); Bettarini, S. [Dipartimento di Fisica, Universita’ di Pisa, I-56127 Pisa (Italy); INFN Sezione di Pisa, I-56127 Pisa (Italy); Bhuyan, B. [Indian Institute of Technolog y Guwahati, Assam 781039 (India); Bilka, T. [Faculty of Mathematics and Physics, Charles University, 12116 Prague (Czech Republic); Bosi, F. [INFN Sezione di Pisa, I-56127 Pisa (Italy); and others

    2017-02-11

    The Belle II Experiment at the High Energy Accelerator Research Organization (KEK) in Tsukuba, Japan, will explore the asymmetry between matter and antimatter and search for new physics beyond the standard model. 172 double-sided silicon strip detectors are arranged cylindrically in four layers around the collision point to be part of a system which measures the tracks of the collision products of electrons and positrons. A total of 1748 radiation-hard APV25 chips read out 128 silicon strips each and send the analog signals by time-division multiplexing out of the radiation zone to 48 Flash Analog Digital Converter Modules (FADC). Each of them applies processing to the data; for example, it uses a digital finite impulse response filter to compensate line signal distortions, and it extracts the peak timing and amplitude from a set of several data points for each hit, using a neural network. We present an overview of the SVD data readout system, along with front-end electronics, cabling, power supplies and data processing.
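    The line-compensation step described above — a digital finite impulse response filter applied to the digitized samples — can be illustrated with a generic direct-form FIR filter. The tap values and the "cable tail" distortion below are invented for illustration; they are not the coefficients used in the actual FADC firmware.

    ```python
    def fir_filter(samples, taps):
        """Direct-form FIR: y[n] = sum_k taps[k] * x[n-k].
        A generic stand-in for the FADC's line-compensation filter."""
        out = []
        for n in range(len(samples)):
            acc = 0.0
            for k, t in enumerate(taps):
                if n - k >= 0:
                    acc += t * samples[n - k]
            out.append(acc)
        return out

    # Hypothetical taps that undo a simple exponential tail (each sample
    # carries 0.5x the previous one, as if smeared by a long cable).
    taps = [1.0, -0.5]
    distorted = [1.0, 0.5, 0.25, 0.125]  # an impulse smeared by the line
    print(fir_filter(distorted, taps))   # → [1.0, 0.0, 0.0, 0.0]
    ```

    Restoring a sharp pulse shape like this is what makes the subsequent peak-time and amplitude extraction from a few data points well posed.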

  1. Experience using a distributed object oriented database for a DAQ system

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    To configure the RD13 data acquisition system, we need many parameters which describe the various hardware and software components. Such information has been defined using an entity-relation model and stored in a commercial memory-resident database. During the last year, Itasca, an object-oriented database management system (OODB), was chosen as a replacement database system. We have ported the existing databases (hw and sw configurations, run parameters, etc.) to Itasca and integrated it with the run control system. We believe that it is possible to use an OODB in real-time environments such as DAQ systems. In this paper, we present our experience and impressions: why we wanted to change from an entity-relational approach, some useful features of Itasca, and the issues we met during this project, including integration of the database into an existing distributed environment and factors which influence performance. (author)

  2. A user's manual for the database management system of impact property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kong, W. S.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the impact database system for managing impact property test data. The database, constructed from the data produced by impact property tests, can increase the application of test results. Basic data can easily be retrieved from the database when preparing a new experiment, and better results can be produced by comparison with previous data. To develop the database, the application must be analyzed and designed carefully; after that, the best quality can be offered for customers' various requirements. The impact database system was developed as an internet application using JSP (Java Server Pages).

  3. Development of database systems for safety of repositories for disposal of radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yeong Hun; Han, Jeong Sang; Shin, Hyeon Jun; Ham, Sang Won; Kim, Hye Seong [Yonsei Univ., Seoul (Korea, Republic of)

    1999-03-15

    In this study, a GSIS is developed to maximize the effectiveness of the database system. For this purpose, spatial relations are established among the data from the various fields that are held in the database, which was developed for the site selection and management of a repository for radioactive waste disposal. By constructing an integrated system that links attribute and spatial data, the safety of a repository can be evaluated effectively and economically. The suitability of integrating the database with GSIS is examined by constructing the database for a test district whose site characteristics are similar to those of a repository for radioactive waste disposal.

  4. An Implementation of a Database System for Book Loan in an ...

    African Journals Online (AJOL)

    A Case Study of the Polytechnic, Ibadan Library) ... the deletion, updating and query operations. Reports can be generated using report generator incorporated into the system. Key Words: Database, Book, loan, Academic, Library System, File ...

  5. Diagnosing the PEP-II Injection System

    Energy Technology Data Exchange (ETDEWEB)

    Decker, F.-J.; Donald, M.H.; Iverson, R.H.; Kulikov, A.; Pappas, G.C.; Weaver, M.; /SLAC

    2005-05-09

    The injection of beam into the PEP-II B-Factory, especially into the High Energy Ring (HER), has some challenges. A high background level in the BaBar detector has for a while inhibited us from trickling charge into the HER similar to the Low Energy Ring (LER). Analyzing the injection system has revealed many issues which could be improved. The injection bump between two kickers was not closed, mainly because the phase advance wasn't exactly 180° and the two kicker strengths were not balanced. Additionally we found reflections which kick the stored beam after the main kick and cause the average luminosity to drop about 3% for a 10 Hz injection rate. The strength of the overall kick is nearly twice as high as the design, indicating a much bigger effective septum thickness. Compared with single beam the background is worse when the HER beam is colliding with the LER beam. This hints that the beam-beam force and the observed vertical blow-up in the HER push the beam, and especially the injected beam, further out to the edge of the dynamic aperture or beyond.

  6. Diagnosing the PEP-II Injection System

    International Nuclear Information System (INIS)

    Decker, F.-J.; Donald, M.H.; Iverson, R.H.; Kulikov, A.; Pappas, G.C.; Weaver, M.; SLAC

    2005-01-01

    The injection of beam into the PEP-II B-Factory, especially into the High Energy Ring (HER), has some challenges. A high background level in the BaBar detector has for a while inhibited us from trickling charge into the HER similar to the Low Energy Ring (LER). Analyzing the injection system has revealed many issues which could be improved. The injection bump between two kickers was not closed, mainly because the phase advance wasn't exactly 180° and the two kicker strengths were not balanced. Additionally we found reflections which kick the stored beam after the main kick and cause the average luminosity to drop about 3% for a 10 Hz injection rate. The strength of the overall kick is nearly twice as high as the design, indicating a much bigger effective septum thickness. Compared with single beam the background is worse when the HER beam is colliding with the LER beam. This hints that the beam-beam force and the observed vertical blow-up in the HER push the beam, and especially the injected beam, further out to the edge of the dynamic aperture or beyond.

  7. A survey of the use of database management systems in accelerator projects

    OpenAIRE

    Poole, John; Strubin, Pierre M

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accele...

  8. Data-base system for northern Midwest regional aquifer-system analysis

    Science.gov (United States)

    Kontis, A.L.; Mandle, Richard J.

    1980-01-01

    The U.S. Geological Survey is conducting a study of the Cambrian and Ordovician aquifer system of the northern Midwest as part of a national series of Regional Aquifer-Systems Analysis (RASA). An integral part of this study will be a simulation of the ground-water flow regime using the Geological Survey's three-dimensional finite-difference model. The first step in the modeling effort is the design and development of a systematic set of processes to facilitate the collection, evaluation, manipulation, and use of large quantities of information. A computerized data-base system to accomplish these goals has been completed for the northern Midwest RASA.

  9. Security in the CernVM File System and the Frontier Distributed Database Caching System

    International Nuclear Information System (INIS)

    Dykstra, D; Blomer, J

    2014-01-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
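
    The core pattern the abstract describes — data readable by anyone, but authenticity and integrity guaranteed by secure hashes plus a signature over the catalog — can be sketched as follows. This is an illustrative toy, not the CVMFS or Frontier implementation: HMAC with a shared key stands in for the X.509/RSA signatures the real systems use, and all names are invented.

    ```python
    import hashlib
    import hmac

    SIGNING_KEY = b"repository-signing-key"  # hypothetical key material

    def publish(files: dict) -> tuple[dict, bytes]:
        """Publisher side: hash every file into a catalog, then sign the catalog."""
        catalog = {path: hashlib.sha256(data).hexdigest() for path, data in files.items()}
        digest = hashlib.sha256(repr(sorted(catalog.items())).encode()).digest()
        signature = hmac.new(SIGNING_KEY, digest, hashlib.sha256).digest()
        return catalog, signature

    def verify(files: dict, catalog: dict, signature: bytes) -> bool:
        """Client side: check the catalog signature, then each file's hash.

        Data may arrive via untrusted http proxy caches; only data matching
        the signed catalog is accepted."""
        digest = hashlib.sha256(repr(sorted(catalog.items())).encode()).digest()
        expected = hmac.new(SIGNING_KEY, digest, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, signature):
            return False
        return all(hashlib.sha256(data).hexdigest() == catalog[path]
                   for path, data in files.items())

    files = {"/sw/app.conf": b"threads=4\n"}
    catalog, sig = publish(files)
    assert verify(files, catalog, sig)                           # authentic data passes
    assert not verify({"/sw/app.conf": b"threads=999\n"}, catalog, sig)  # tampering fails
    ```

    Note the asymmetry the paper highlights: nothing here is encrypted (no read privacy), yet a cache or proxy cannot alter content without detection.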

  10. Security in the CernVM File System and the Frontier Distributed Database Caching System

    Science.gov (United States)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.

  11. Design of special purpose database for credit cooperation bank business processing network system

    Science.gov (United States)

    Yu, Yongling; Zong, Sisheng; Shi, Jinfa

    2011-12-01

    With the popularization of e-finance in cities, its construction is transferring to the vast rural market and quickly developing in depth. Developing a business processing network system suited to rural credit cooperative banks makes business processing convenient and has good application prospects. In this paper, we analyse the necessity of adopting a special purpose distributed database in a credit cooperative bank system, give the corresponding distributed database system structure, and design the special purpose database and interface technology. The application in Tongbai Rural Credit Cooperatives has shown that the system has better performance and higher efficiency.

  12. Spent fuel composition database system on WWW. SFCOMPO on WWW Ver.2

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Nomura, Yasushi; Okuno, Hiroshi

    2001-08-01

    'SFCOMPO on WWW Ver.2' is an advanced version of 'SFCOMPO on WWW (Spent Fuel Composition Database System on WWW)' released in 1997. This new version adds a database management function based on the relational database software PostgreSQL and offers various searching methods. All of the data required for the calculation of isotopic composition are available from the web site of this system. This report describes the outline of this system and the searching methods available over the Internet. In addition, the isotopic composition data and the reactor data of the 14 LWRs (7 PWR and 7 BWR) registered in this system are described. (author)

  13. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories of those related to (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for the category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open type combined database systems, which have been in progress generally as planned. Those for the category (2) include the techniques for color document inputting and information retrieval, meaning compiling, understanding highly overlapping data, and controlling data centered by drawings, which have been in progress generally as planned. Those for the category (3) include the techniques for improving resistance of the networks to obstruction, and security of the data in the networks, which have been in progress generally as planned. Those for the category (4) include the techniques for rule processing for development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives are finally achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  15. The Database and Data Analysis Software of Radiation Monitoring System

    International Nuclear Information System (INIS)

    Wang Weizhen; Li Jianmin; Wang Xiaobing; Hua Zhengdong; Xu Xunjiang

    2009-01-01

    Shanghai Synchrotron Radiation Facility (SSRF for short) is a third-generation light source under construction in China, comprising a 150 MeV injector, a 3.5 GeV booster, a 3.5 GeV storage ring and a number of beamline stations. The data are fetched by the monitoring computer from collecting modules in the front end and saved in the MySQL database on the managing computer. The data analysis software is coded in Python, a scripting language, to query, summarize and plot the data of a given monitoring channel during a given period and export them to an external file. In addition, warning events can be queried separately. The website for historical and real-time data query and plotting is coded in PHP. (authors)
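
    The "query one channel over a period and export to a file" workflow described above can be sketched in a few lines. This is an illustrative stand-in, not the SSRF code: SQLite replaces MySQL, and the table, column, and channel names are invented.

    ```python
    import csv
    import io
    import sqlite3

    # Illustrative schema: one row per (channel, timestamp) dose-rate reading.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (channel TEXT, ts TEXT, dose_rate REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", [
        ("hall-A", "2009-01-01 10:00", 0.12),
        ("hall-A", "2009-01-01 11:00", 0.15),
        ("hall-B", "2009-01-01 10:30", 0.09),
    ])

    def export_channel(conn, channel, start, end):
        """Query one monitoring channel over a period and return CSV text."""
        rows = conn.execute(
            "SELECT ts, dose_rate FROM readings "
            "WHERE channel = ? AND ts BETWEEN ? AND ? ORDER BY ts",
            (channel, start, end)).fetchall()
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["timestamp", "dose_rate"])
        writer.writerows(rows)
        return buf.getvalue()

    print(export_channel(conn, "hall-A", "2009-01-01 00:00", "2009-01-02 00:00"))
    ```

    In the real system the same parameterized query would simply run against the MySQL server, and the CSV text would be written to the external file instead of printed.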

  16. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  17. Development of database system on MOX fuel for water reactors (I)

    International Nuclear Information System (INIS)

    Kikuchi, Keiichi; Nakazawa, Hiroaki; Abe, Tomoyuki; Shirai, Takao

    2000-04-01

    JNC has conducted a great number of irradiation tests to develop MOX fuels for the Advanced Thermal Reactor and Light Water Reactors. In order to manage irradiation data consistently and to utilize effectively the valuable data obtained from the irradiation tests, we commenced construction of a database system on MOX fuel for water reactors in JFY 1998. Collection and selection of irradiation data and relevant fuel fabrication data, design of the database system and preparation of assisting programs have been finished, and data registration onto the system is at present under way in order of priority. The database system can be operated through the menu screen on a PC. About 94,000 records of data on 11 fuel assemblies in total have been registered onto the database up to the present. By registering the remaining data and making some modifications to the system, if necessary, the database system is expected to be completed in JFY 2000. The completed database system is to be distributed to the relevant sections in JNC on CD-R media. This is an interim report covering JFY 1998 and 1999, which gives an explanation of the structure of, and a user's manual for, the database prepared up to the present. (author)

  18. Studies on preparation of the database system for clinical records of atomic bomb survivors

    International Nuclear Information System (INIS)

    Nakamura, Tsuyoshi

    1981-01-01

    Construction of the database system aimed at multipurpose application of data on clinical medicine was studied through the preparation of database system for clinical records of atomic bomb survivors. The present database includes the data about 110,000 atomic bomb survivors in Nagasaki City. This study detailed: (1) Analysis of errors occurring in a period from generation of data in the clinical field to input into the database, and discovery of a highly precise, effective method of input. (2) Development of a multipurpose program for uniform processing of data on physical examinations from many organizations. (3) Development of a record linkage method for voluminous files which are essential in the construction of a large-scale medical information system. (4) A database model suitable for clinical research and a method for designing a segment suitable for physical examination data. (Chiba, N.)
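
    Point (3) above, record linkage across voluminous files, is the step that joins each survivor's registry entry to examination records arriving from many organizations. A minimal sketch of key-based linkage follows; the records, field names, and normalization rule are invented for illustration, not taken from the Nagasaki system.

    ```python
    # Link two files on a normalized (name, birth date) key: a toy version of
    # record linkage between a survivor registry and an examination file.
    def link_key(record):
        """Normalize name spelling/spacing so formatting differences still match."""
        return (record["name"].strip().lower().replace(" ", ""),
                record["born"])

    registry = [{"name": "Yamada Taro", "born": "1921-05-04", "id": 1001}]
    exam_file = [{"name": "YAMADA  TARO", "born": "1921-05-04", "bp": 128},
                 {"name": "Suzuki Hanako", "born": "1930-11-02", "bp": 117}]

    # Index the registry once, then link each examination record in O(1).
    index = {link_key(r): r["id"] for r in registry}
    linked = [(index.get(link_key(e)), e["bp"]) for e in exam_file]
    print(linked)  # [(1001, 128), (None, 117)]
    ```

    Unmatched records (here the `None` entry) are exactly the cases a production system would route to the error-analysis step described in point (1).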

  19. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  20. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    Science.gov (United States)

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  1. The establishment of the Blacknest seismological database on the Rutherford Laboratory system 360/195 computer

    International Nuclear Information System (INIS)

    Blamey, C.

    1977-01-01

    In order to assess the problems which might arise from monitoring a comprehensive test ban treaty by seismological methods, an experimental monitoring operation is being conducted. This work has involved the establishment of a database on the Rutherford Laboratory 360/195 system computer. The database can be accessed in the UK over the public telephone network and in the USA via ARPANET. (author)

  2. PACSY, a relational database management system for protein structure and chemical shift analysis.

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
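
    The design described, relational tables linked by key identification numbers and queried through an RDBMS, can be illustrated with a two-table join. The schema below is a made-up miniature in the spirit of PACSY, not its actual tables; SQLite stands in for the MySQL/PostgreSQL server.

    ```python
    import sqlite3

    # Toy linked tables: 3D coordinates and chemical shifts share an atom_id key.
    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE coordinates (atom_id INTEGER PRIMARY KEY, residue TEXT,
                              x REAL, y REAL, z REAL);
    CREATE TABLE chemical_shifts (atom_id INTEGER REFERENCES coordinates(atom_id),
                                  shift_ppm REAL);
    INSERT INTO coordinates VALUES (1, 'ALA', 0.0, 1.2, 3.4),
                                   (2, 'GLY', 2.1, 0.7, 1.9);
    INSERT INTO chemical_shifts VALUES (1, 52.3), (2, 45.1);
    """)

    # A combined query across both sources: coordinates plus shift per atom,
    # filtered on a shift threshold.
    rows = db.execute("""
        SELECT c.residue, c.x, s.shift_ppm
        FROM coordinates AS c
        JOIN chemical_shifts AS s ON s.atom_id = c.atom_id
        WHERE s.shift_ppm > 50
    """).fetchall()
    print(rows)  # [('ALA', 0.0, 52.3)]
    ```

    Searching for "combinations of information from different database sources" reduces, in relational terms, to exactly this kind of key join across tables populated from the PDB, BMRB, and SCOP.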

  3. PACSY, a relational database management system for protein structure and chemical shift analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States); Yu, Wookyung [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Kim, Suhkmann [Pusan National University, Department of Chemistry and Chemistry Institute for Functional Materials (Korea, Republic of); Chang, Iksoo [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Lee, Weontae, E-mail: wlee@spin.yonsei.ac.kr [Yonsei University, Structural Biochemistry and Molecular Biophysics Laboratory, Department of Biochemistry (Korea, Republic of); Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States)

    2012-10-15

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  4. Transport and Environment Database System (TRENDS): Maritime Air Pollutant Emission Modelling

    DEFF Research Database (Denmark)

    Georgakaki, Aliki; Coffey, Robert; Lock, Grahm

    2005-01-01

    This paper reports the development of the maritime module within the framework of the Transport and Environment Database System (TRENDS) project. A detailed database has been constructed for the calculation of energy consumption and air pollutant emissions. Based on an in-house database...... changes from findings reported in Methodologies for Estimating air pollutant Emissions from Transport (MEET). The database operates on statistical data provided by Eurostat, which describe vessel and freight movements from and towards EU 15 major ports. Data are at port to Maritime Coastal Area (MCA...... with a view to this purpose, are mentioned. Examples of the results obtained by the database are presented. These include detailed air pollutant emission calculations for bulk carriers entering the port of Helsinki, as an example of the database operation, and aggregate results for different types...
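
    An activity-based emission calculation of the kind such a database automates — fuel burned from engine power, load factor and time at sea, then pollutant mass via an emission factor — can be sketched as follows. All numbers and the function name are illustrative, not the factors used by TRENDS or MEET.

    ```python
    def voyage_emissions(engine_kw, load_factor, hours, sfoc_g_per_kwh, ef_g_per_kg_fuel):
        """Pollutant mass (kg) for one passage, from engine activity.

        energy -> fuel via specific fuel oil consumption (SFOC),
        fuel -> pollutant via a per-kg-fuel emission factor."""
        energy_kwh = engine_kw * load_factor * hours
        fuel_kg = energy_kwh * sfoc_g_per_kwh / 1000.0
        return fuel_kg * ef_g_per_kg_fuel / 1000.0

    # Bulk carrier example: 8 MW main engine at 80% load for a 24 h passage,
    # SFOC 200 g/kWh, NOx factor 90 g per kg of fuel (illustrative values).
    nox_kg = voyage_emissions(8000, 0.8, 24, 200, 90)
    print(round(nox_kg, 1))  # 2764.8
    ```

    Scaling this per-movement calculation over the Eurostat port-to-MCA movement statistics is what yields the aggregate results the paper reports.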

  5. PACSY, a relational database management system for protein structure and chemical shift analysis

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636

  6. PACSY, a relational database management system for protein structure and chemical shift analysis

    International Nuclear Information System (INIS)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L.

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  7. Field validation of food service listings: a comparison of commercial and online geographic information system databases.

    Science.gov (United States)

    Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian

    2012-08-01

    Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were located within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database; however, when matching criteria were more conservative, there were no observed differences in error between the databases.
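
    The positional-error computation behind such a validation — great-circle distance between each database location and its GPS-measured location, then error percentiles — can be sketched as below. The coordinate pairs are made up for illustration; they are not the study's data.

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two (lat, lon) points."""
        r = 6371000.0  # mean Earth radius, m
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # (db_lat, db_lon, gps_lat, gps_lon) for each matched food service place.
    pairs = [
        (44.2312, -76.4860, 44.2313, -76.4862),
        (44.2330, -76.4900, 44.2334, -76.4905),
        (44.2401, -76.5001, 44.2410, -76.5001),
    ]
    errors = sorted(haversine_m(*p) for p in pairs)
    median_error = errors[len(errors) // 2]
    print([round(e, 1) for e in errors], round(median_error, 1))
    ```

    With the full set of matched places, the 25th/50th/75th percentiles of `errors` correspond to the ~15 m / 25 m / 50 m figures the abstract reports.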

  8. The PEP II injection kicker system

    International Nuclear Information System (INIS)

    Pappas, G.C.; Donaldson, A.R.; Williams, D.

    1997-07-01

    PEP II or the B Factory consists of two asymmetric storage rings. The injection energy for electrons is 9 GeV, while that for positrons is 3.1 GeV. The bend angle into the high energy ring (HER) is 0.35 m-rad, and the angle into the low energy ring (LER) is 0.575 m-rad. The magnetic length for the HER kicker is 0.85 m, and 0.55 m for the LER kicker. The field produced by the magnet is therefore 123.5 G for the HER, and 132 G for the LER. Each ring has a kicker magnet upstream of the injection line which is used to distort the orbit of the stored beam. An identical magnet downstream of the injection line is used to restore the orbit of the stored beam and inject the incoming beam. The two magnets are driven in parallel by the modulator. The aperture of the magnets is 3.86 x 3.46 cm (H x V). Therefore the current required to drive the HER is 863 A, while for the LER it is 756 A. The inductance of the magnet is approximately 1.4 uH/m. The current pulse is a critically damped sinusoid with a rise time of less than 300 ns. A kicker system has been designed which can be used for injection of both beams by varying the charging voltage. The modulator uses a conjugate circuit to match the impedance of the magnet, and coupling to the beam chamber.
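
    The quoted HER field can be cross-checked with the standard thin-kick relation B·L = p·θ / 0.299792458 (B in tesla, L in metres, p in GeV/c, θ in radians), treating the 9 GeV beam as ultra-relativistic so energy ≈ momentum. This is a consistency check on the numbers in the abstract, not a design calculation.

    ```python
    # HER parameters quoted in the abstract.
    p_gev = 9.0        # injection energy ~ momentum, GeV/c
    theta = 0.35e-3    # bend angle into the HER, rad (0.35 m-rad)
    length = 0.85      # HER kicker magnetic length, m

    # B [T] = p [GeV/c] * theta [rad] / (0.299792458 * L [m])
    b_tesla = p_gev * theta / (0.299792458 * length)
    b_gauss = b_tesla * 1e4
    print(round(b_gauss, 1))  # 123.6, matching the quoted 123.5 G
    ```

    The same relation applied to the LER numbers does not reproduce the quoted 132 G as cleanly, so the LER figure presumably folds in details (effective length, margin) not stated in the abstract.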

  9. Database retrieval systems for nuclear and astronomical data

    International Nuclear Information System (INIS)

    Suda, Takuma; Korennov, Sergei; Otuka, Naohiko; Yamada, Shimako; Katsuta, Yutaka; Ohnishi, Akira; Kato, Kiyoshi; Fujimoto, Masayuki Y.

    2006-01-01

    Data retrieval and plot systems of nuclear and astronomical data are constructed on a common platform. Web-based systems will soon be opened to the users of both fields of nuclear physics and astronomy. (author)

  10. Facility information system 'SOINS-IIS'; Shisetsu joho kanri system 'SOINS-IIS'

    Energy Technology Data Exchange (ETDEWEB)

    Shimaoka, S.; Watanabe, M.; Mizuno, Y. [Fuji Electric Co. Ltd., Tokyo (Japan)

    1998-07-10

    With the increasing informatization of industry, office space is becoming the center of business activities. For facility control as well, a facility control system is required which adds information services for users and management support functions to the conventional system used mainly for equipment maintenance. Fuji Electric Co. developed a facility information control system, SOINS-IIS (social information system - infrastructure information system), which integrates the above-mentioned functions. The features of the system are presented through examples of its introduction at Ichibankan, the YRP (Yokosuka Research Park) Center and the R and D Center of NTT DoCoMo. The system broadly has an information service function for facility users, management functions for office staff such as tenant management and bill management, and management support functions for facility owners and planning departments. Besides these, in the case of the YRP Center, for example, the system has functions for managing reservations of meeting rooms and the like, terminal display of shared information, and many other management support functions. 10 figs.

  11. Comparison of VATS and Robotic Approaches For Clinical Stage I and II NSCLC Using the STS Database

    Science.gov (United States)

    Louie, Brian E.; Wilson, Jennifer L.; Kim, Sunghee; Cerfolio, Robert J.; Park, Bernard J.; Farivar, Alexander S.; Vallières, Eric; Aye, Ralph W.; Burfeind, William R.; Block, Mark I.

    2016-01-01

    Background: Data from selected centers show that robotic lobectomy (RL) is safe, effective, and has 30-day mortality comparable to video-assisted lobectomy (VATS). However, widespread adoption of RL is controversial. We used the STS-GTS Database to evaluate quality metrics for these two minimally invasive lobectomy techniques. Methods: A database query for primary clinical stage I or II NSCLC at high-volume centers from 2009 to 2013 identified 1,220 RLs and 12,378 VATS. Quality metrics evaluated included operative morbidity, 30-day mortality and nodal upstaging (NU), defined as cN0 to pN1. Multivariable logistic regression was used to evaluate NU. Results: RL patients were older, less active, less likely to be an ever smoker, and had higher BMI (all p<0.05). They were also more likely to have coronary heart disease or hypertension (all p<0.001) and to have had preoperative mediastinal staging (p<0.0001). RL operative times were longer (median 186 vs 173 min, p<0.001); all other operative parameters were similar. All postoperative outcomes were similar, including complications and 30-day mortality (RL 0.6% vs VATS 0.8%, p=0.4). Median length of stay was 4 days for both, but a higher proportion of RLs stayed < 4 days: 48% vs 39%, p<0.001. NU overall was similar (p=0.6), but with trends favoring VATS in the cT1b group and RL in the cT2a group. Conclusions: RL patients had more co-morbidities and RL operative times were longer, but quality outcome measures including complications, hospital stay, 30-day mortality, and NU suggest RL and VATS are equivalent. PMID:27209613
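
    The reported 30-day mortality comparison (RL 0.6% of 1,220 vs VATS 0.8% of 12,378, p = 0.4) can be sanity-checked with a two-proportion z-test. The event counts below are back-calculated from the rounded percentages, so this is only a rough reconstruction, not a re-analysis of the STS data, and the test used in the paper may differ.

    ```python
    import math

    # Back-calculate approximate death counts from the rounded percentages.
    x1, n1 = round(0.006 * 1220), 1220      # ~7 robotic deaths
    x2, n2 = round(0.008 * 12378), 12378    # ~99 VATS deaths

    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    print(round(p_value, 2))  # ~0.39, consistent with the reported p = 0.4
    ```

    The agreement shows the headline "similar mortality" conclusion follows directly from the raw counts, independent of modeling choices.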

  12. Advanced operating technique using the VR database system

    International Nuclear Information System (INIS)

    Lee, Il-Suk; Yoon, Sang-Hyuk; Suh, Kune Y.

    2003-01-01

    For a timely and competitive response to the rapidly changing energy environment of the twenty-first century, there is a growing need to build advanced nuclear power plants in the unlimited workspace of virtual reality (VR) prior to commissioning. One can then realistically evaluate their construction time and cost for the varying methods and options available from leading-edge technology. In particular, a great deal of effort has yet to be made on time- and cost-dependent plant simulation and dynamically coupled database construction in the VR space. The present work is proposed in three-dimensional space plus time plus cost coordinates, i.e. four-plus-dimensional (4+D) coordinates. The preliminary VR simulation capability enabled by the 4+D VR technology (TM) will supply vital information not only for the actual design and construction of the engineered structures but also for on-line design modification. Quite a few companies and research institutions have supplied various information services to the nuclear market. A great deal of the information exists in the form of reports, articles and books, which are simple texts and graphic images. But if very large and important information transfer methods are developed for nuclear plants by means of the 4+D technology database, they will greatly benefit designers, manufacturers, users and even the public. Moreover, one can understand clearly the total structure of a nuclear plant if the 4+D VR technology (TM) database operates together with a transient analysis simulator. This technique should be available for public information about the nuclear industry as well as nuclear plant structures and components. By using the 4+D VR technology (TM) one can supply to users information which could not have been expressed by existing technology. Users can not only spin or closely observe the structural elements by simple mouse control, but also know

  13. Asynchronous data change notification between database server and accelerator controls system

    International Nuclear Information System (INIS)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-01-01

    Database data change notification (DCN) is a commonly used feature, but not all database management systems (DBMSs) provide an explicit DCN mechanism. Even for those that support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work, which makes setting up DCN between a database server and interested clients tedious and time-consuming. In accelerator control systems, there are many well-established client/server software architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. The method works for any DBMS that provides database trigger functionality: ADCN is realized by combining the database trigger mechanism, supported by all major DBMSs, with server processes built on the client/server architectures familiar in the accelerator controls community. This approach makes the ADCN system easy to set up and to integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
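    The trigger-plus-reflection-server pattern described above can be sketched in miniature. The snippet below uses SQLite (not Oracle or MS SQL Server, and with no CDEV/EPICS/ADO layer) purely for illustration: an AFTER INSERT trigger invokes a registered function that stands in for the push to the reflection server. Table, column, and function names are invented.

```python
import sqlite3

# Queue standing in for the reflection server that forwards changes to clients.
notifications = []

def notify_clients(name, value):
    notifications.append((name, value))  # simulate the asynchronous push
    return 0

conn = sqlite3.connect(":memory:")
# Register a Python callable that the database trigger can invoke.
conn.create_function("notify_clients", 2, notify_clients)

conn.executescript("""
CREATE TABLE settings (name TEXT PRIMARY KEY, value REAL);
CREATE TRIGGER settings_changed AFTER INSERT ON settings
BEGIN
    SELECT notify_clients(NEW.name, NEW.value);
END;
""")

# A data change in the database now reaches the "clients" with no polling.
conn.execute("INSERT INTO settings VALUES ('magnet_current', 42.0)")
print(notifications)  # [('magnet_current', 42.0)]
```

    A production setup would replace the in-process queue with a server process speaking the controls system's SET/GET API, but the trigger-driven flow is the same.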

  14. Space-Ready Advanced Imaging System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase II effort Toyon will advance the state of the art in video/image systems. This will include digital image compression algorithms as well as system...

  15. 7th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2015)

    CERN Document Server

    Nguyen, Ngoc; Batubara, John: New Trends in Intelligent Information and Database Systems

    2015-01-01

    Intelligent information and database systems are two closely related subfields of modern computer science which have been known for over thirty years. They focus on the integration of artificial intelligence and classic database technologies to create the class of next-generation information systems. The book focuses on new trends in intelligent information and database systems and discusses topics addressing the foundations and principles of data, information, and knowledge models; methodologies for intelligent information and database systems analysis, design, and implementation; and their validation, maintenance and evolution. The contributions cover a broad spectrum of research topics, discussed from both practical and theoretical points of view, such as: intelligent information retrieval, natural language processing, semantic web, social networks, machine learning, knowledge discovery, data mining, uncertainty management and reasoning under uncertainty, intelligent optimization techniques in information systems, secu...

  16. Smart travel guide: from internet image database to intelligent system

    Science.gov (United States)

    Chareyron, Gaël; Da Rugna, Jérome; Cousin, Saskia

    2011-02-01

    To help the tourist discover a city, a region or a park, many options are provided by public tourism travel centers, by free online guides or by dedicated guidebooks. Nonetheless, these guides provide only mainstream information, which does not conform to a particular tourist's behavior. On the other hand, several online image databases allow users to upload their images and to localize each image on a map. These websites are representative of tourism practices and constitute a proxy for analyzing tourism flows. This work therefore intends to answer the question: knowing what I have visited and what other people have visited, where should I go now? This process requires profiling users, sites and photos. Our paper presents the acquired data and the relationships between photographers, sites and photos, and introduces the model designed to correctly estimate the interest of each tourism point. The third part shows an application of our schema: a smart travel guide on geolocated mobile devices. This Android application is a travel guide truly matching the user's wishes.

  17. Component configuration control system development at EBR-II

    International Nuclear Information System (INIS)

    Monson, L.R.; Stratton, R.C.

    1984-01-01

    One of the major programs being pursued by the EBR-II Division of Argonne National Laboratory is to improve the reliability of plant control and protection systems. This effort involves looking closely at the present state of the art and the needs associated with plant diagnostic, control, and protection systems. One of the areas of development at EBR-II is a component configuration control system (CCCS). This system is a computerized control and planning aid for the nuclear power operator.

  18. Database system of geological information for geological evaluation base of NPP sites (I)

    International Nuclear Information System (INIS)

    Lim, C. B.; Choi, K. R.; Sim, T. M.; No, M. H.; Lee, H. W.; Kim, T. K.; Lim, Y. S.; Hwang, S. K.

    2002-01-01

    This study aims to provide a database system for site suitability analyses of geological information and a processing program for domestic NPP site evaluation. The database system uses MapObjects provided by ESRI and the Spread 3.5 OCX component, and is coded in Visual Basic. Major functions of the database program include vector- and raster-format topographic maps, database design and application, geological symbol plotting, database search for plotted geological symbols, and so on. The program can also be applied to analyses not only of lineament trends but also of statistical treatments of geological site and laboratory information, using digital data sources and algorithms of the kind commonly used internationally.

  19. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    Science.gov (United States)

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.

  20. Development of nuclear power plants database system, (2)

    International Nuclear Information System (INIS)

    Izumi, Fumio; Ichikawa, Michio

    1984-06-01

    A nuclear power plant database system has been developed. The database contains a large amount of safety design information for nuclear power plants at the operating and planning stages in Japan. The information can be searched at high speed as necessary using this system. The present report is a user's guide for access to the information via a display unit of the JAERI computer network system. (author)

  1. A Coding System for Analysing a Spoken Text Database.

    Science.gov (United States)

    Cutting, Joan

    1994-01-01

    This paper describes a coding system devised to analyze conversations of graduate students in applied linguistics at Edinburgh University. The system was devised to test the hypothesis that, as shared knowledge among conversation participants grows, the text of in-group members contains more cues than that of strangers. The informal…

  2. Schema architectures and their relationship to transaction processing in distributed database systems

    NARCIS (Netherlands)

    Apers, Peter M.G.; Scheuermann, P.

    1991-01-01

    We discuss the different types of schema architectures which could be supported by distributed database systems, making a clear distinction between logical, physical, and federated distribution. We elaborate on the additional mapping information required in architectures based on logical distribution.

  3. Semantic-Based Concurrency Control for Object-Oriented Database Systems Supporting Real-Time Applications

    National Research Council Canada - National Science Library

    Lee, Juhnyoung; Son, Sang H

    1994-01-01

    .... This paper investigates major issues in designing semantic-based concurrency control for object-oriented database systems supporting real-time applications, and it describes approaches to solving...

  4. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    National Research Council Canada - National Science Library

    Ortega-Binderberger, Michael

    2002-01-01

    ... as a critical area of research. This thesis explores how to enhance database systems with content based search over arbitrary abstract data types in a similarity based framework with query refinement...

  5. Development of the severe accident risk information database management system SARD

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies

  6. Development of the severe accident risk information database management system SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies.

  7. Prototype system tests of the Belle II PXD DAQ system

    Energy Technology Data Exchange (ETDEWEB)

    Fleischer, Soeren; Gessler, Thomas; Kuehn, Wolfgang; Lange, Jens Soeren; Muenchow, David; Spruck, Bjoern [II. Physikalisches Institut, Justus-Liebig-Universitaet Giessen (Germany); Liu, Zhen'An; Xu, Hao; Zhao, Jingzhou [Institute of High Energy Physics, Chinese Academy of Sciences (China); Collaboration: Belle II PXD Collaboration

    2012-07-01

    The data acquisition system for the Belle II DEPFET Pixel Vertex Detector (PXD) is designed to cope with a high input data rate of up to 21.6 GB/s. The main hardware component will be AdvancedTCA-based Compute Nodes (CN) equipped with Xilinx Virtex-5 FX70T FPGAs. The design for the third Compute Node generation was completed recently. The xTCA-compliant system features a carrier board and 4 AMC daughter boards. First test results of a prototype board will be presented, including tests of (a) the high-speed optical links used for data input, (b) the two 2 GB DDR2 chips on the board, and (c) the output of data via Ethernet, using UDP and TCP/IP with both hardware and software protocol stacks.

  8. Search extension transforms Wiki into a relational system: a case for flavonoid metabolite database.

    Science.gov (United States)

    Arita, Masanori; Suwa, Kazuhiro

    2008-09-17

    In computer science, database systems are based on the relational model founded by Edgar Codd in 1970. On the other hand, in the area of biology the word 'database' often refers to loosely formatted, very large text files. Although such bio-databases may describe conflicts or ambiguities (e.g. that a protein pair does and does not interact, or unknown parameters) in a positive sense, the flexibility of the data format sacrifices a systematic query mechanism equivalent to the widely used SQL. To overcome this disadvantage, we propose embeddable string-search commands on a Wiki-based system and designed a half-formatted database. As proof of principle, a database of flavonoids with 6902 molecular structures from over 1687 plant species was implemented on MediaWiki, the background system of Wikipedia. Registered users can describe any information in an arbitrary format. The structured part is subject to text-string searches to realize relational operations. The system was written in PHP as an extension of MediaWiki. All modifications are open-source and publicly available. This scheme benefits from both the free-formatted Wiki style and the concise, structured relational-database style. MediaWiki supports multi-user environments for document management, and the cost of database maintenance is alleviated.
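    The idea of a 'half-formatted' database, where the structured part of a free-text page is queried by string search, can be sketched as follows. This is not the actual MediaWiki extension (which is written in PHP); the page texts, field names, and `select` helper are invented for illustration.

```python
import re

# Each "page" is free text, but template-style "| field = value" lines form a
# structured part that a plain text-string search can treat relationally.
pages = {
    "Quercetin": "Free-text notes...\n| class = flavonol\n| species = Allium cepa",
    "Naringenin": "More notes...\n| class = flavanone\n| species = Citrus paradisi",
    "Kaempferol": "| class = flavonol\n| species = Camellia sinensis",
}

def select(pages, field, value):
    """Relational selection implemented as a text-string search over pages."""
    pattern = re.compile(rf"^\|\s*{field}\s*=\s*{value}\s*$", re.MULTILINE)
    return sorted(name for name, text in pages.items() if pattern.search(text))

print(select(pages, "class", "flavonol"))  # ['Kaempferol', 'Quercetin']
```

    The free-text portion of each page stays unconstrained, while the formatted lines remain queryable, which is the trade-off the abstract describes.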

  9. A human friendly reporting and database system for brain PET analysis

    International Nuclear Information System (INIS)

    Jamzad, M.; Ishii, Kenji; Toyama, Hinako; Senda, Michio

    1996-01-01

    We have developed a human-friendly reporting and database system for clinical brain PET (Positron Emission Tomography) scans, which enables statistical analysis of the qualitative information obtained from image interpretation. Our system consists of a Brain PET Data (Input) Tool and a Report Writing Tool. In the Brain PET Data Tool, findings and interpretations are input by selecting menu icons in a window panel instead of writing free text. This method of input enables on-line data entry into and update of the database by means of pre-defined consistent words, which facilitates statistical data analysis. The Report Writing Tool generates a one-page report of natural English sentences semi-automatically by using the above input information and the patient information obtained from our PET center's main database. It also has a keyword selection function from the report text so that we can save a set of keywords in the database for further analysis. By means of this system, we can store the data related to patient information and visual interpretation of the PET examination while writing clinical reports in daily work. The database files in our system can be accessed by means of commercially available databases. We have used the 4th Dimension database that runs on a Macintosh computer and analyzed 95 cases of 18F-FDG brain PET studies. The results showed high specificity of parietal hypometabolism for Alzheimer's patients. (author)

  10. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.; Sasajima, K.; Matsugaki, N.; Suzuki, M.; Kosuge, T.; Wakatsuki, S.

    2004-01-01

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. Main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored (except raw X-ray data that are stored in a central data server) in a MySQL relational database. The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystal and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using the secure SSL connection using secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments

  11. Development of the LEP high level control system using ORACLE as an online database

    International Nuclear Information System (INIS)

    Bailey, R.; Belk, A.; Collier, P.; Lamont, M.; De Rijk, G.; Tarrant, M.

    1994-01-01

    A complete rewrite of the high level application software for the control of LEP has been carried out. ORACLE was evaluated and subsequently used as the on-line database in the implementation of the system. All control information and settings are stored in this database. This paper describes the project development cycle, the method used, the use of CASE, and the project management used by the team. The performance of the system and the database, and their impact on LEP performance, are discussed. ((orig.))

  12. Acceptance test procedure for the master equipment list (MEL) database system -- phase I

    International Nuclear Information System (INIS)

    Jech, J.B.

    1997-01-01

    The Waste Remediation System/.../Facilities Configuration Management Integration group has requested development of a system to help resolve many of the difficulties associated with managing master equipment list information. This project has been identified as the Master Equipment List (MEL) database system. Further definition is contained in the system requirements specification (SRS), reference 7

  13. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    OpenAIRE

    Seok-Hyoung Lee; Hwan-Min Kim; Ho-Seop Choe

    2012-01-01

    While science and technology information service portals and heterogeneous databases produced in Korea and other countries are being integrated, methods of connecting the unique classification systems applied to each database have been studied. The results of technologists' research, such as journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to ach...

  14. Massively Parallel Sort-Merge Joins in Main Memory Multi-Core Database Systems

    OpenAIRE

    Albutiu, Martina-Cezara; Kemper, Alfons; Neumann, Thomas

    2012-01-01

    Two emerging hardware trends will dominate the database system technology in the near future: increasing main memory capacities of several TB per server and massively parallel multi-core processing. Many algorithmic and control techniques in current database technology were devised for disk-based systems where I/O dominated the performance. In this work we take a new look at the well-known sort-merge join which, so far, has not been in the focus of research ...
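    For reference, the core sort-merge join that the abstract revisits can be sketched single-threaded; the paper's actual contribution (massive parallelism on multi-core, large-memory hardware) is not shown, and the relations and `sort_merge_join` helper below are invented.

```python
# Minimal single-threaded sort-merge join over (key, payload) tuples.
def sort_merge_join(left, right):
    left, right = sorted(left), sorted(right)   # sort phase
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):     # merge phase
        if left[i][0] < right[j][0]:
            i += 1
        elif left[i][0] > right[j][0]:
            j += 1
        else:
            key = left[i][0]
            i0, j0 = i, j
            while i < len(left) and left[i][0] == key:
                i += 1
            while j < len(right) and right[j][0] == key:
                j += 1
            # Emit the cross product of the matching runs on both sides.
            out.extend((key, l[1], r[1]) for l in left[i0:i] for r in right[j0:j])
    return out

rows = sort_merge_join([(1, "a"), (2, "b"), (2, "c")], [(2, "x"), (3, "y")])
print(rows)  # [(2, 'b', 'x'), (2, 'c', 'x')]
```

    In the in-memory setting the paper studies, the sort phase is the part that parallelizes across cores, while the merge phase scans each sorted run once.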

  15. Comparison of open source database systems (characteristics, limits of usage)

    OpenAIRE

    Husárik, Braňko

    2008-01-01

    The goal of this work is to compare selected open source database systems (Ingres, PostgreSQL, Firebird, MySQL). The first part of the work focuses on the history and present situation of the companies developing these products. The second part contains a comparison of a certain group of specific features and limits. A benchmark of some operations forms its own part. The possibilities of using the mentioned database systems are summarized at the end of the work.

  16. Coupling an Unstructured NoSQL Database with a Geographic Information System

    OpenAIRE

    Holemans, Amandine; Kasprzyk, Jean-Paul; Donnay, Jean-Paul

    2018-01-01

    The management of unstructured NoSQL (Not only Structured Query Language) databases has undergone great development in recent years, mainly thanks to Big Data. Nevertheless, the specificity of spatial information is not explicitly taken into account. To overcome this difficulty, we propose to couple a NoSQL database with a spatial Relational Database Management System (RDBMS). Exchanges of information between these two systems are illustrated with relevant examples ...

  17. HPC Colony II: FAST_OS II: Operating Systems and Runtime Systems at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Jose [IBM, Armonk, NY (United States)

    2013-11-13

    HPC Colony II has been a 36-month project focused on providing portable performance for leadership class machines—a task made difficult by the emerging variety of more complex computer architectures. The project attempts to move the burden of portable performance to adaptive system software, thereby allowing domain scientists to concentrate on their field rather than the fine details of a new leadership class machine. To accomplish our goals, we focused on adding intelligence into the system software stack. Our revised components include: new techniques to address OS jitter; new techniques to dynamically address load imbalances; new techniques to map resources according to architectural subtleties and application dynamic behavior; new techniques to dramatically improve the performance of checkpoint-restart; and new techniques to address membership service issues at scale.

  18. System/subsystem specifications for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB)

    Energy Technology Data Exchange (ETDEWEB)

    Rollow, J.P.; Shipe, P.C.; Truett, L.F. [Oak Ridge National Lab., TN (United States); Faby, E.Z.; Fluker, J.; Grubb, J.; Hancock, B.R. [Univ. of Tennessee, Knoxville, TN (United States); Ferguson, R.A. [Science Applications International Corp., Oak Ridge, TN (United States)

    1995-11-20

    A system is being developed by the Military Traffic Management Command (MTMC) to provide data integration and worldwide management and tracking of surface cargo movements. The Integrated Cargo Database (ICDB) will be a data repository for the WPS terminal-level system, will be a primary source of queries and cargo traffic reports, will receive data from and provide data to other MTMC and non-MTMC systems, will provide capabilities for processing Advance Transportation Control and Movement Documents (ATCMDs), and will process and distribute manifests. This System/Subsystem Specifications for the Worldwide Port System Regional ICDB documents the system/subsystem functions, provides details of the system/subsystem analysis in order to provide a communication link between developers and operational personnel, and identifies interfaces with other systems and subsystems. It must be noted that this report is being produced near the end of the initial development phase of ICDB, while formal software testing is being done. Following the initial implementation of the ICDB system, maintenance contractors will be in charge of making changes and enhancing software modules. Formal testing and user reviews may indicate the need for additional software units or changes to existing ones. This report describes the software units that are components of this ICDB system as of August 1995.

  19. Comparison of scientific and administrative database management systems

    Science.gov (United States)

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to be different for scientific and administrative data bases are identified and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are especially stringent for either the scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  20. Implementing real-time robotic systems using CHIMERA II

    Science.gov (United States)

    Stewart, David B.; Schmitz, Donald E.; Khosla, Pradeep K.

    1990-01-01

    A description is given of the CHIMERA II programming environment and operating system, which was developed for implementing real-time robotic systems. Sensor-based robotic systems contain both general- and special-purpose hardware, and thus the development of applications tends to be a time-consuming task. The CHIMERA II environment is designed to reduce the development time by providing a convenient software interface between the hardware and the user. CHIMERA II supports flexible hardware configurations which are based on one or more VME-backplanes. All communication across multiple processors is transparent to the user through an extensive set of interprocessor communication primitives. CHIMERA II also provides a high-performance real-time kernel which supports both deadline and highest-priority-first scheduling. The flexibility of CHIMERA II allows hierarchical models for robot control, such as NASREM, to be implemented with minimal programming time and effort.
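    The deadline scheduling mentioned above (earliest-deadline-first dispatch) can be sketched in a few lines. This is only an illustrative model, not CHIMERA II code; the task names and deadlines are invented, and the toy ignores preemption, periods, and admission control.

```python
# Illustrative earliest-deadline-first (EDF) dispatch: at each decision point,
# run the ready task with the earliest absolute deadline.
def next_task(ready):
    """Pick the ready task (name -> deadline) with the earliest deadline."""
    return min(ready, key=ready.get)

ready = {"servo_loop": 2, "trajectory_gen": 10, "sensor_read": 5}
order = []
while ready:
    task = next_task(ready)
    order.append(task)
    del ready[task]          # task runs to completion, then leaves the queue
print(order)  # ['servo_loop', 'sensor_read', 'trajectory_gen']
```

    A highest-priority-first policy, the other scheme the abstract names, would differ only in keying the `min` on a static priority instead of a deadline.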

  1. Designing the database for a reliability aware Model-Based System Engineering process

    International Nuclear Information System (INIS)

    Cressent, Robin; David, Pierre; Idasiak, Vincent; Kratz, Frederic

    2013-01-01

    This article outlines the need for a reliability database to implement model-based descriptions of component failure modes and dysfunctional behaviors. We detail the requirements such a database should honor and describe our own solution: the Dysfunctional Behavior Database (DBD). Through the description of its meta-model, the benefits of integrating the DBD into the system design process are highlighted. The main advantages depicted are the possibility of managing feedback knowledge at various granularity and semantic levels and of drastically easing the interactions between system engineering activities and reliability studies. The compliance of the DBD with other reliability databases such as FIDES is presented and illustrated. - Highlights: ► Model-Based System Engineering is increasingly used in industry. ► This creates a need for a reliability database able to deal with model-based descriptions of dysfunctional behavior. ► The Dysfunctional Behavior Database aims to fulfill that need. ► It helps manage feedback knowledge thanks to its structured meta-model. ► The DBD can benefit from other reliability databases such as FIDES.

  2. Data-based control trajectory planning for nonlinear systems

    International Nuclear Information System (INIS)

    Rhodes, C.; Morari, M.; Tsimring, L.S.; Rulkov, N.F.

    1997-01-01

    An open-loop trajectory planning algorithm is presented for computing an input sequence that drives an input-output system such that a reference trajectory is tracked. The algorithm utilizes only input-output data from the system to determine the proper control sequence, and does not require a mathematical or identified description of the system dynamics. From the input-output data, the controlled input trajectory is calculated in a 'one-step-ahead' fashion using local modeling. Since the algorithm is calculated in this fashion, the output trajectories to be tracked can be nonperiodic. The algorithm is applied to a driven Lorenz system and an experimental electrical circuit, and the results are analyzed. Issues of stability associated with the implementation of this open-loop scheme are also examined using an analytic example of a driven Hénon map; problems associated with inverse controllers are illustrated, and solutions to these problems are proposed. copyright 1997 The American Physical Society
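    The 'one-step-ahead' local-modeling idea can be illustrated with a toy sketch in which the controller consults only recorded input-output triples, never a model. The linear plant, the grid of recorded data, and the `control` helper below are all invented for illustration; the paper itself works with chaotic systems and local models rather than a plain nearest-record lookup.

```python
# Data-based one-step-ahead control: the controller never sees the plant
# equation; it only searches previously recorded (y, u, y_next) triples.
def plant(y, u):                  # "unknown" dynamics, used only to log data
    return 0.5 * y + u

# Record input-output data over a grid of states and inputs.
data = [(y0 / 20, u0 / 20, plant(y0 / 20, u0 / 20))
        for y0 in range(-40, 41) for u0 in range(-20, 21)]

def control(y, ref):
    """Choose the recorded input whose local context best matches (y, ref)."""
    _, u, _ = min(data, key=lambda t: abs(t[0] - y) + abs(t[2] - ref))
    return u

y, ref = 0.0, 0.8
for _ in range(5):                # closed loop: apply the looked-up input
    y = plant(y, control(y, ref))
print(round(y, 6))  # 0.8  (the reference is tracked from data alone)
```

    Denser recorded data gives a smaller one-step tracking error, which is the trade-off the data-based approach rests on.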

  3. The Cronus Distributed DBMS (Database Management System) Project

    Science.gov (United States)

    1989-10-01

    projects, e.g., HiPAC [Dayal 88] and Postgres [Stonebraker 86]. Although we expect to use these techniques, they have been developed for centralized...Computing Systems, June 1989. (To appear). [Stonebraker 86] Stonebraker, M. and Rowe, L. A., "The Design of POSTGRES," Proceedings ACM SIGMOD Annual

  4. Development of radiation oncology learning system combined with multi-institutional radiotherapy database (ROGAD)

    International Nuclear Information System (INIS)

    Takemura, Akihiro; Iinuma, Masahiro; Kou, Hiroko; Harauchi, Hajime; Inamura, Kiyonari

    1999-01-01

    We have constructed and have been operating the multi-institutional radiotherapy database ROGAD (Radiation Oncology Greater Area Database) since 1992. One of its purposes is 'to optimize individual radiotherapy plans'. We developed the 'Radiation oncology learning system combined with ROGAD', which conforms to that purpose. Several medical doctors evaluated our system. According to those evaluations, we are now confident that our system is able to contribute to the improvement of radiotherapy results. Our final target is to generate a good cyclic relationship among three components: radiotherapy results according to the 'Radiation oncology learning system combined with ROGAD'; the growth of ROGAD; and the radiation oncology learning system. (author)

  5. CardioTF, a database of deconstructing transcriptional circuits in the heart system.

    Science.gov (United States)

    Zhen, Yisong

    2016-01-01

    Information on cardiovascular gene transcription is fragmented and far behind the present requirements of the systems biology field. To create a comprehensive source of data for cardiovascular gene regulation and to facilitate a deeper understanding of genomic data, the CardioTF database was constructed. The purpose of this database is to collate information on cardiovascular transcription factors (TFs), position weight matrices (PWMs), and enhancer sequences discovered using the ChIP-seq method. The Naïve-Bayes algorithm was used to classify literature and identify all PubMed abstracts on cardiovascular development. The natural language learning tool GNAT was then used to identify corresponding gene names embedded within these abstracts. Local Perl scripts were used to integrate and dump data from public databases into the MariaDB management system (MySQL). In-house R scripts were written to analyze and visualize the results. Known cardiovascular TFs from humans and human homologs from fly, Ciona, zebrafish, frog, chicken, and mouse were identified and deposited in the database. PWMs from Jaspar, hPDI, and UniPROBE databases were deposited in the database and can be retrieved using their corresponding TF names. Gene enhancer regions from various sources of ChIP-seq data were deposited into the database and were able to be visualized by graphical output. Besides biocuration, mouse homologs of the 81 core cardiac TFs were selected using a Naïve-Bayes approach and then by intersecting four independent data sources: RNA profiling, expert annotation, PubMed abstracts and phenotype. The CardioTF database can be used as a portal to construct transcriptional network of cardiac development. Database URL: http://www.cardiosignal.org/database/cardiotf.html.
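
    The abstract-classification step can be illustrated with a toy word-level Naïve Bayes classifier. This is a generic sketch of the technique, not the authors' pipeline; the training snippets, labels, and add-one smoothing scheme are invented for illustration.

```python
from collections import Counter
import math

# Tiny labeled corpus standing in for PubMed abstracts (invented).
train = [("cardiac development heart", "cardio"),
         ("heart transcription factor", "cardio"),
         ("kidney development nephron", "other"),
         ("liver metabolism enzyme", "other")]

def fit(docs):
    """Count word occurrences per class."""
    counts = {"cardio": Counter(), "other": Counter()}
    for text, label in docs:
        counts[label].update(text.split())
    return counts

def predict(counts, text):
    """Score each class by summed log-likelihood with Laplace smoothing."""
    vocab = set(w for c in counts.values() for w in c)
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            math.log((c[w] + 1) / (total + len(vocab)))
            for w in text.split())
    return max(scores, key=scores.get)
```

    With uniform class priors (as here), the class whose vocabulary best matches the abstract wins; a production classifier would add priors and a larger feature set.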

  6. IMG: the integrated microbial genomes database and comparative analysis system

    Science.gov (United States)

    Markowitz, Victor M.; Chen, I-Min A.; Palaniappan, Krishna; Chu, Ken; Szeto, Ernest; Grechkin, Yuri; Ratner, Anna; Jacob, Biju; Huang, Jinghua; Williams, Peter; Huntemann, Marcel; Anderson, Iain; Mavromatis, Konstantinos; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2012-01-01

    The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp). PMID:22194640

  7. Harmonised information exchange between decentralised food composition database systems

    DEFF Research Database (Denmark)

    Pakkala, Heikki; Christensen, Tue; Martínez de Victoria, Ignacio

    2010-01-01

    documentation and by the use of standardised thesauri. Subjects/Methods: The data bank is implemented through a network of local FCD storages (usually national) under the control and responsibility of the local (national) EuroFIR partner. Results: The implementation of the system based on the EuroFIR specifications is under development. The data interchange happens through the EuroFIR Web Services interface, allowing the partners to implement their system using methods and software suitable for the local computer environment. The implementation uses common international standards, such as Simple Object Access Protocol, Web Service Description Language and Extensible Markup Language (XML). A specifically constructed EuroFIR search facility (eSearch) was designed for end users. The EuroFIR eSearch facility compiles queries using a specifically designed Food Data Query Language and sends a request...

  8. Ezilla Cloud Service with Cassandra Database for Sensor Observation System

    OpenAIRE

    Kuo-Yang Cheng; Yi-Lun Pan; Chang-Hsing Wu; His-En Yu; Hui-Shan Chen; Weicheng Huang

    2012-01-01

    The main mission of Ezilla is to provide a friendly interface to access the virtual machine and quickly deploy the high performance computing environment. Ezilla has been developed by Pervasive Computing Team at National Center for High-performance Computing (NCHC). Ezilla integrates the Cloud middleware, virtualization technology, and Web-based Operating System (WebOS) to form a virtual computer in distributed computing environment. In order to upgrade the dataset and sp...

  9. Background qualitative analysis of the European reference life cycle database (ELCD) energy datasets - part II: electricity datasets.

    Science.gov (United States)

    Garraín, Daniel; Fazio, Simone; de la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda; Mathieux, Fabrice

    2015-01-01

    The aim of this paper is to identify areas of potential improvement of the European Reference Life Cycle Database (ELCD) electricity datasets. The revision is based on the data quality indicators described by the International Life Cycle Data system (ILCD) Handbook, applied on a sectorial basis. These indicators evaluate the technological, geographical and time-related representativeness of the dataset and the appropriateness in terms of completeness, precision and methodology. Results show that the ELCD electricity datasets have a very good quality in general terms; nevertheless, some findings and recommendations to improve the quality of Life-Cycle Inventories have been derived. Moreover, these results confirm the quality of the electricity-related datasets for any LCA practitioner, and provide insights into the limitations and assumptions underlying the dataset modelling. Given this information, the LCA practitioner will be able to decide whether the use of the ELCD electricity datasets is appropriate based on the goal and scope of the analysis to be conducted. The methodological approach would also be useful for dataset developers and reviewers, in order to improve the overall Data Quality Requirements of databases.

  10. Extension of Generalized Fluid System Simulation Program's Fluid Property Database

    Science.gov (United States)

    Patel, Kishan

    2011-01-01

    This internship focused on the development of additional capabilities for the General Fluid Systems Simulation Program (GFSSP). GFSSP is a thermo-fluid code used to evaluate system performance by a finite volume-based network analysis method. The program was developed primarily to analyze the complex internal flow of propulsion systems and is capable of solving many problems related to thermodynamics and fluid mechanics. GFSSP is integrated with thermodynamic programs that provide fluid properties for sub-cooled, superheated, and saturation states. For fluids that are not included in the thermodynamic property program, look-up property tables can be provided. The look-up property tables of the current release version can only handle sub-cooled and superheated states. The primary purpose of the internship was to extend the look-up tables to handle saturated states. This involves a) generation of a property table using REFPROP, a thermodynamic property program that is widely used, and b) modifications of the Fortran source code to read in an additional property table containing saturation data for both saturated liquid and saturated vapor states. Also, a method was implemented to calculate the thermodynamic properties of user-fluids within the saturation region, given values of pressure and enthalpy. These additions required new code to be written, and older code had to be adjusted to accommodate the new capabilities. Ultimately, the changes will lead to the incorporation of this new capability in future versions of GFSSP. This paper describes the development and validation of the new capability.
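
    The saturation-region calculation described above (obtaining properties from pressure and enthalpy) reduces to computing the vapor quality and applying the lever rule between the saturated-liquid and saturated-vapor table entries. Below is a minimal sketch with an invented two-row table; the values roughly follow water but are illustrative, not actual REFPROP output, and the function name is an assumption.

```python
# Hypothetical saturation look-up table.
# P [kPa]: (hf, hg, vf, vg) -- enthalpy [kJ/kg], specific volume [m^3/kg]
SAT_TABLE = {
    100.0: (417.4, 2675.0, 0.001043, 1.694),
    200.0: (504.7, 2706.0, 0.001061, 0.8857),
}

def two_phase_props(p_kpa, h):
    """Given pressure and enthalpy inside the dome, return the vapor
    quality and the mixture specific volume from the lever rule."""
    hf, hg, vf, vg = SAT_TABLE[p_kpa]
    x = (h - hf) / (hg - hf)              # vapor quality from enthalpy
    if not 0.0 <= x <= 1.0:
        raise ValueError("state lies outside the saturation dome")
    v = vf + x * (vg - vf)                # lever rule for specific volume
    return x, v
```

    A full implementation would also interpolate between pressure rows and blend other properties (temperature, entropy, density) the same way.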

  11. Construction of database server system for fuel thermo-physical properties

    International Nuclear Information System (INIS)

    Park, Chang Je; Kang, Kwon Ho; Song, Kee Chan

    2003-12-01

    To perform the evaluation of various fuels in nuclear reactors, not only mechanical properties but also thermo-physical properties are required as some of the most important inputs for a fuel performance code system. The main objective of this study is to build a database system for fuel thermo-physical properties; a PC-based hardware system has been constructed for easy public use, with visualization provided through a web-based server system. This report deals with the hardware and software used in the database server system for nuclear fuel thermo-physical properties. Opening the database of fuel properties to the public is expected to make nuclear fuel data much easier to obtain and to support research and development of various fuels in the nuclear industry. Furthermore, the proposed models of nuclear fuel thermo-physical properties will be fully utilized in the fuel performance code system

  12. Development of the sorption and diffusion database system for safety assessment of geological disposal

    International Nuclear Information System (INIS)

    Tachi, Yukio; Tochigi, Yoshikatsu; Suyama, Tadahiro; Saito, Yoshihiko; Yui, Mikazu; Ochs, Michael

    2009-02-01

    Japan Atomic Energy Agency (JAEA) has been developing databases of sorption and diffusion parameters in buffer material (bentonite) and rock, which are key parameters for the safety assessment of geological disposal. These sorption and diffusion databases (SDB/DDB) were first developed as an important basis for the H12 performance assessment (PA) of high-level radioactive waste disposal in Japan, and have been provided through the Web. JAEA has been and is continuing to improve and update the SDB/DDB in view of potential future data needs, focusing on assuring the desired quality level and testing the usefulness of the existing databases for possible applications to parameter-setting for the deep geological environment. The new web-based sorption and diffusion database system (JAEA-SDB/DDB) has been developed to utilize a quality assurance procedure and to allow effective application to parameter setting, by adding the following functions to the existing database: - consistency and linkage between the sorption and diffusion databases; - effective utilization of the quality assurance (QA) guideline and categorized QA data; - additional functions for estimating parameters and graphing relations between parameters; - counting and summarizing functions for effective access to the respective data for parameter setting. In the present report, practical examples illustrate the applicability of the database system to parameter setting, using additional functions such as QA information and data estimation. This database system is expected to make it possible to obtain a quick overview of the available data from the database, and to have suitable access to the respective data for parameter setting for performance assessment and parameter derivation for mechanistic modeling in a traceable and transparent manner. (author)

  13. CONCEPTUAL DESIGN OF THE NSLS-II INJECTION SYSTEM.

    Energy Technology Data Exchange (ETDEWEB)

    SHAFTAN,T.; ROSE, T.; PINAYEV, I.; HEESE, R.; BENGTSSON, J.; SKARITKA, J.; MENG, W.; OZAKI, S.; MEIER, R.; STELMACH, C.; LITVINENKO, V.; PJEROV, S.; SHARMA, S.; GANETIS, G.; HSEUH, H.C.; JOHNSON, E.D.; TSOUPAS, N.; GUO, W.; BEEBE-WANG, J.; LUCCIO, A.U.; YU, L.H.; RAPARIA, D.; WANG, D.

    2007-06-25

    We present the conceptual design of the NSLS-II injection system [1,2]. The injection system consists of a low-energy linac, booster and transport lines. We review two different injection system configurations: a booster located in the storage ring tunnel and a booster housed in a separate building. We briefly discuss the main parameters and layout of the injection system components.

  14. Nonmaterialized Relations and the Support of Information Retrieval Applications by Relational Database Systems.

    Science.gov (United States)

    Lynch, Clifford A.

    1991-01-01

    Describes several aspects of the problem of supporting information retrieval system query requirements in the relational database management system (RDBMS) environment and proposes an extension to query processing called nonmaterialized relations. User interactions with information retrieval systems are discussed, and nonmaterialized relations are…

  15. Database usage for the CMS ECAL Laser Monitoring System

    CERN Document Server

    Timciuc, Vladlen

    2009-01-01

    The CMS detector at the LHC is equipped with a high precision electromagnetic crystal calorimeter (ECAL). The crystals experience a transparency change when exposed to radiation during LHC operation, which recovers in the absence of irradiation on the time scale of hours. This change of the crystal response is monitored with a laser system which performs a transparency measurement of each crystal of the ECAL within twenty minutes. The monitoring data is analyzed on a PC farm attached to the central data acquisition system of CMS. After analyzing the raw data, a reduced data set is stored in the Online Master Data Base (OMDS), which is connected to the online computing infrastructure of CMS. The data stored in OMDS, representing the largest data set stored in OMDS for ECAL, contains all necessary information to perform a detailed crystal response monitoring as well as an analysis of the dynamics of the transparency change. For the CMS physics event data reconstruction, only a reduced set of information from the transpa...

  16. The International Experimental Thermal Hydraulic Systems database – TIETHYS: A new NEA validation tool

    Energy Technology Data Exchange (ETDEWEB)

    Rohatgi, Upendra S.

    2018-07-22

    Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data is scattered in different locations and in different formats. Some of the data is in danger of being lost. A relational database is being developed to organize the international thermal hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, the data is organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert-developed PIRTs. The database will provide a summary of appropriate data, review of facility information, test description, instrumentation, references for the experimental data and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data, and is to be expanded to include references for molten salt reactors. There are placeholders for high temperature gas cooled reactors, CANDU and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database and currently resides at the Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/

  17. Failure and Maintenance Analysis Using Web-Based Reliability Database System

    International Nuclear Information System (INIS)

    Hwang, Seok Won; Kim, Myoung Su; Seong, Ki Yeoul; Na, Jang Hwan; Jerng, Dong Wook

    2007-01-01

    Korea Hydro and Nuclear Power Company has launched the development of a database system for PSA and Maintenance Rule implementation. It focuses on the easy processing of raw data into a credible and useful database for the risk-informed environment of nuclear power plant operation and maintenance. Even though KHNP had recently completed the PSA for all domestic NPPs as a requirement of the severe accident mitigation strategy, the component failure data were only gathered for quantification purposes for the relevant project. So, the data were not efficient enough for the Living PSA or other generic purposes. Another reason to build a real-time database is the newly adopted Maintenance Rule, which requests the utility to continuously monitor the plant risk based on its operation and maintenance performance. Furthermore, as one of the pre-conditions for Risk Informed Regulation and Application, the nuclear regulatory agency of Korea requests the development and management of a domestic database system. KHNP has been accumulating operation and maintenance data in the Enterprise Resource Planning (ERP) system since its first opening in July 2003. But, so far, a systematic review has not been performed to apply the component failure and maintenance history to PSA and other reliability analyses. The data stored in PUMAS before the ERP system was introduced also need to be converted and managed into the new database structure and methodology. This reliability database system is a web-based interface on a UNIX server with an Oracle relational database. It is designed to be applicable to all domestic NPPs with a common database structure and web interfaces, so additional program development should not be necessary for data acquisition and processing in the near future. Categorization standards for systems and components have been implemented to analyze all domestic NPPs.
For example, SysCode (for a system code) and CpCode (for a component code) were newly

  18. The Make 2D-DB II package: conversion of federated two-dimensional gel electrophoresis databases into a relational format and interconnection of distributed databases.

    Science.gov (United States)

    Mostaguir, Khaled; Hoogland, Christine; Binz, Pierre-Alain; Appel, Ron D

    2003-08-01

    The Make 2D-DB tool has been previously developed to help build federated two-dimensional gel electrophoresis (2-DE) databases on one's own web site. The purpose of our work is to extend the strength of the first package and to build a more efficient environment. Such an environment should be able to fulfill the different needs and requirements arising from both the growing use of 2-DE techniques and the increasing amount of distributed experimental data.

  19. Distributed Database Semantic Integration of Wireless Sensor Network to Access the Environmental Monitoring System

    Directory of Open Access Journals (Sweden)

    Ubaidillah Umar

    2018-06-01

    Full Text Available A wireless sensor network (WSN) works continuously to gather information from sensors that generate large volumes of data to be handled and processed by applications. Current efforts in sensor networks focus more on networking and development services for a variety of applications and less on processing and integrating data from heterogeneous sensors. There is an increased need for information to become shareable across different sensors, database platforms, and applications that are not easily implemented in traditional database systems. To solve the issue of these large amounts of data from different servers and database platforms (including sensor data), a semantic sensor web service platform is needed to enable a machine to extract meaningful information from the sensor’s raw data. This additionally helps to minimize and simplify data processing and to deduce new information from existing data. This paper implements a semantic web data platform (SWDP) to manage the distribution of data sensors based on the semantic database system. SWDP uses sensors for temperature, humidity, carbon monoxide, carbon dioxide, luminosity, and noise. The system uses the Sesame semantic web database for data processing and a WSN to distribute, minimize, and simplify information processing. The sensor nodes are distributed in different places to collect sensor data. The SWDP generates context information in the form of a resource description framework. The experiment results demonstrate that the SWDP is more efficient than the traditional database system in terms of memory usage and processing time.
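
    The core idea of storing heterogeneous sensor readings as semantic (subject, predicate, object) triples can be shown with a toy in-memory triple store. This is a pure-Python stand-in for the paper's Sesame/RDF stack; the predicate names and readings are invented for illustration.

```python
# Toy triple store: every fact is a (subject, predicate, object) triple,
# so readings from different sensor types share one uniform schema.
triples = set()

def add(s, p, o):
    triples.add((s, p, o))

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

add("node1", "type", "TemperatureSensor")
add("node1", "hasValue", 27.5)
add("node2", "type", "HumiditySensor")
add("node2", "hasValue", 61.0)

# Pattern query: readings from all temperature sensors
temps = [match(s=s, p="hasValue")[0][2]
         for (s, _, _) in match(p="type", o="TemperatureSensor")]
```

    A real RDF store adds URIs, typed literals, and a SPARQL query engine on top of exactly this pattern-matching model.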

  20. BtoxDB: a comprehensive database of protein structural data on toxin-antitoxin systems.

    Science.gov (United States)

    Barbosa, Luiz Carlos Bertucci; Garrido, Saulo Santesso; Marchetto, Reinaldo

    2015-03-01

    Toxin-antitoxin (TA) systems are diverse and abundant genetic modules in prokaryotic cells that are typically formed by two genes encoding a stable toxin and a labile antitoxin. Because TA systems are able to repress growth or kill cells and are considered to be important actors in cell persistence (multidrug resistance without genetic change), these modules are considered potential targets for alternative drug design. In this scenario, structural information for the proteins in these systems is highly valuable. In this report, we describe the development of a web-based system, named BtoxDB, that stores all protein structural data on TA systems. The BtoxDB database was implemented as a MySQL relational database using PHP scripting language. Web interfaces were developed using HTML, CSS and JavaScript. The data were collected from the PDB, UniProt and Entrez databases. These data were appropriately filtered using specialized literature and our previous knowledge about toxin-antitoxin systems. The database provides three modules ("Search", "Browse" and "Statistics") that enable searches, acquisition of contents and access to statistical data. Direct links to matching external databases are also available. The compilation of all protein structural data on TA systems in one platform is highly useful for researchers interested in this content. BtoxDB is publicly available at http://www.gurupi.uft.edu.br/btoxdb. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A computer database system to calculate staff radiation doses and maintain records

    International Nuclear Information System (INIS)

    Clewer, P.

    1985-01-01

    A database has been produced to record the personal dose records of all employees monitored for radiation exposure in the Wessex Health Region. Currently there are more than 2000 personnel in 115 departments but the capacity of the database allows for expansion. The computer is interfaced to a densitometer for film badge reading. The hardware used by the database, which is based on a popular microcomputer, is described, as are the various programs that make up the software. The advantages over the manual card index system that it replaces are discussed. (author)

  2. Role of Database Management Systems in Selected Engineering Institutions of Andhra Pradesh: An Analytical Survey

    Directory of Open Access Journals (Sweden)

    Kutty Kumar

    2016-06-01

    Full Text Available This paper aims to analyze the function of database management systems from the perspective of librarians working in engineering institutions in Andhra Pradesh. Ninety-eight librarians from one hundred thirty engineering institutions participated in the study. The paper reveals that training by computer suppliers and software packages are the significant mode of acquiring DBMS skills by librarians; three-fourths of the librarians are postgraduate degree holders. Most colleges use database applications for automation purposes and content value. Electrical problems and untrained staff seem to be major constraints faced by respondents for managing library databases.

  3. A survey of the use of database management systems in accelerator projects

    CERN Document Server

    Poole, John

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accelerator projects and what they are being used for. Initially introduced to handle equipment builders' data, commercial DBMS are now being used in almost all areas of accelerators from on-line control to personnel data. A variety of commercial systems are being used in conjunction with a diverse selection of application software for data maintenance/manipulation and controls. This paper reviews the database activities known to IADBG.

  4. A role for relational databases in high energy physics software systems

    International Nuclear Information System (INIS)

    Lauer, R.; Slaughter, A.J.; Wolin, E.

    1987-01-01

    This paper presents the design and initial implementation of software which uses a relational database management system for storage and retrieval of real and Monte Carlo generated events from a charm and beauty spectrometer with a vertex detector. The purpose of the software is to graphically display and interactively manipulate the events, fit tracks and vertices and calculate physics quantities. The INGRES database forms the core of the system, while the DI3000 graphics package is used to plot the events. The paper introduces relational database concepts and their applicability to high energy physics data. It also evaluates the environment provided by INGRES, particularly its usefulness in code development and its Fortran interface. Specifics of the database design we have chosen are detailed as well. (orig.)

  5. Rapid storage and retrieval of genomic intervals from a relational database system using nested containment lists.

    Science.gov (United States)

    Wiley, Laura K; Sivley, R Michael; Bush, William S

    2013-01-01

    Efficient storage and retrieval of genomic annotations based on range intervals is necessary, given the amount of data produced by next-generation sequencing studies. The indexing strategies of relational database systems (such as MySQL) greatly inhibit their use in genomic annotation tasks. This has led to the development of stand-alone applications that are dependent on flat-file libraries. In this work, we introduce MyNCList, an implementation of the NCList data structure within a MySQL database. MyNCList enables the storage, update and rapid retrieval of genomic annotations from the convenience of a relational database system. Range-based annotations of 1 million variants are retrieved in under a minute, making this approach feasible for whole-genome annotation tasks. Database URL: https://github.com/bushlab/mynclist.
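
    The underlying task (retrieving annotations whose intervals overlap a query range from a relational database) can be sketched with a naive range query; it is precisely this pattern that standard B-tree indexing serves poorly at genome scale, which motivates the NCList structure. Table and column names are invented, and SQLite stands in for MySQL in this sketch.

```python
import sqlite3

# In-memory table of interval annotations (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE annot (chrom TEXT, start INT, stop INT, name TEXT)")
conn.executemany("INSERT INTO annot VALUES (?,?,?,?)", [
    ("chr1", 100, 500, "geneA"),
    ("chr1", 300, 400, "exonA1"),
    ("chr1", 900, 950, "geneB"),
])

# Two intervals overlap iff each starts before the other ends:
# annotations overlapping the query interval chr1:350-420.
rows = conn.execute(
    "SELECT name FROM annot WHERE chrom=? AND start <= ? AND stop >= ?",
    ("chr1", 420, 350),
).fetchall()
names = sorted(r[0] for r in rows)
```

    The two inequality predicates defeat a single B-tree index, so the database falls back to scanning; MyNCList replaces this with nested containment lists materialized as extra tables.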

  6. Retrieval program system of Chinese Evaluated (frequently useful) Nuclear Decay Database

    International Nuclear Information System (INIS)

    Huang Xiaolong; Zhou Chunmei

    1995-01-01

    The Chinese Evaluated (frequently useful) Nuclear Decay Database has been set up on a MICRO-VAX-11 computer at the Chinese Nuclear Data Center (CNDC). For users' convenience, a retrieval program system for the database has been written. Retrieval can be carried out for a single nucleus or for multiple nuclei. The retrieved results can be displayed on the terminal screen or output to the M3081 printer and laser printer in ENSDF format, table reports or scheme diagrams

  7. The new ENSDF search system NESSY: IBM/PC nuclear spectroscopy database

    International Nuclear Information System (INIS)

    Boboshin, I.N.; Varlamov, V.V.

    1996-01-01

    The universal relational nuclear structure and decay database NESSY (New ENSDF Search SYstem) developed for the IBM/PC and compatible PCs, and based on the international file ENSDF (Evaluated Nuclear Structure Data File), is described. The NESSY provides the possibility of high efficiency processing (the search and retrieval of any kind of physical data) of the information from ENSDF. The principles of the database development are described and examples of applications are presented. (orig.)

  8. Alternative pathways for angiotensin II generation in the cardiovascular system

    Directory of Open Access Journals (Sweden)

    C. Becari

    2011-09-01

    Full Text Available The classical renin-angiotensin system (RAS) consists of enzymes and peptides that regulate blood pressure and electrolyte and fluid homeostasis. Angiotensin II (Ang II) is one of the most important and extensively studied components of the RAS. The beneficial effects of angiotensin converting enzyme (ACE) inhibitors in the treatment of hypertension and heart failure, among other diseases, are well known. However, it has been reported that patients chronically treated with effective doses of these inhibitors do not show suppression of Ang II formation, suggesting the involvement of pathways alternative to ACE in the generation of Ang II. Moreover, the finding that the concentration of Ang II is preserved in the kidney, heart and lungs of mice with an ACE deletion indicates the important role of alternative pathways under basal conditions to maintain the levels of Ang II. Our group has characterized the serine protease elastase-2 as an alternative pathway for Ang II generation from Ang I in rats. A role for elastase-2 in the cardiovascular system was suggested by studies performed in heart and conductance and resistance vessels of normotensive and spontaneously hypertensive rats. This mini-review will highlight the pharmacological aspects of the RAS, emphasizing the role of elastase-2, an alternative pathway for Ang II generation.

  9. The Muon system of the run II D0 detector

    Energy Technology Data Exchange (ETDEWEB)

    Abazov, V.M.; Acharya, B.S.; Alexeev, G.D.; Alkhazov, G.; Anosov, V.A.; Baldin, B.; Banerjee, S.; Bardon, O.; Bartlett, J.F.; Baturitsky, M.A.; Beutel, D.; Bezzubov,; Bodyagin, V.; Butler, J.M.; Cease, H.; Chi, E.; Denisov, D.; Denisov, S.P.; Diehl, H.T.; Doulas, S.; Dugad, S.R.; /Beijing, Inst. High Energy Phys. /Charles U. /Prague, Tech.

    2005-03-01

    The authors describe the design, construction and performance of the upgraded D0 muon system for Run II of the Fermilab Tevatron collider. Significant improvements have been made to the major subsystems of the D0 muon detector: trigger scintillation counters, tracking detectors, and electronics. The Run II central muon detector has a new scintillation counter system inside the iron toroid and an improved scintillation counter system outside the iron toroid. In the forward region, new scintillation counter and tracking systems have been installed. Extensive shielding has been added in the forward region. A large fraction of the muon system electronics is also new.

  10. Received Signal Strength Database Interpolation by Kriging for a Wi-Fi Indoor Positioning System.

    Science.gov (United States)

    Jan, Shau-Shiun; Yeh, Shuo-Ju; Liu, Ya-Wen

    2015-08-28

    The main approach for a Wi-Fi indoor positioning system is based on received signal strength (RSS) measurements, and the fingerprinting method is utilized to determine the user position by matching the RSS values with a pre-surveyed RSS database. An RSS fingerprint database is essential for an RSS-based indoor positioning system, but building one requires considerable time and effort, and the labor grows with the size of the indoor environment. To provide better indoor positioning services while reducing the labor required to establish the positioning system, an appropriate spatial interpolation method is needed. The RSS approach exploits the fact that signal strength decays as the transmission distance increases, and this propagation characteristic is applied to an interpolated database with the Kriging algorithm in this paper. Using the distribution of reference points (RPs) at measured points, the signal propagation model of each Wi-Fi access point (AP) in the building can be built and expressed as a function. This function, capturing the spatial structure of the environment, can create the RSS database quickly in different indoor environments. Thus, in this paper, a Wi-Fi indoor positioning system based on the Kriging fingerprinting method is developed. As shown in the experiment results, with a 72.2% probability, the error of the RSS database extended with Kriging is less than 3 dBm compared to the surveyed RSS database. Importantly, the positioning error of the developed Wi-Fi indoor positioning system with Kriging is reduced by 17.9% on average compared to the system without Kriging.
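As a rough illustration of the interpolation step, the sketch below implements ordinary Kriging with an assumed Gaussian variogram. The variogram form, its parameters, and the reference-point RSS values are all invented for the example; the paper fits its own propagation model per access point.

```python
import numpy as np

def variogram(h, sill=1.0, rng=10.0):
    """Gaussian variogram model (an assumed form; the paper fits its own)."""
    return sill * (1.0 - np.exp(-(h / rng) ** 2))

def ordinary_kriging(xy, z, q):
    """Estimate the RSS value at query point q from reference points xy."""
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)  # pairwise distances
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, :] = 1.0          # unbiasedness constraint: weights sum to one
    A[:, n] = 1.0
    A[n, n] = 0.0          # Lagrange-multiplier entry
    b = np.empty(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - q, axis=1))
    b[n] = 1.0
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)  # weighted sum of the surveyed RSS values

# four surveyed reference points (positions in m, RSS in dBm) -- made up
rp = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
rss = np.array([-40.0, -55.0, -55.0, -70.0])
est = ordinary_kriging(rp, rss, np.array([5.0, 5.0]))  # a new grid point
```

Each interpolated grid point is a weighted average of the surveyed fingerprints, with weights chosen from the fitted spatial structure rather than from raw distance alone.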

  11. Spent fuel composition database system on WWW. SFCOMPO on WWW Ver.2

    Energy Technology Data Exchange (ETDEWEB)

    Mochizuki, Hiroki [Japan Research Institute, Ltd., Tokyo (Japan); Suyama, Kenya; Nomura, Yasushi; Okuno, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-08-01

    'SFCOMPO on WWW Ver.2' is an advanced version of 'SFCOMPO on WWW' ('Spent Fuel Composition Database System on WWW'), released in 1997. The new version adds database management through the relational database software 'PostgreSQL' and offers various searching methods. All of the data required for the calculation of isotopic composition are available from the web site of this system. This report describes the outline of the system and the searching methods available over the Internet. In addition, the isotopic composition data and reactor data of the 14 LWRs (7 PWR and 7 BWR) registered in the system are described. (author)

  12. Multi-dimensional database design and implementation of dam safety monitoring system

    Directory of Open Access Journals (Sweden)

    Zhao Erfeng

    2008-09-01

    Full Text Available To improve the effectiveness of dam safety monitoring database systems, the development process of a multi-dimensional conceptual data model was analyzed and a logical design was achieved in multi-dimensional database mode. The optimal data model was confirmed by identifying data objects, defining relations and reviewing entities. The conversion of relations among entities into foreign keys, and of entities and physical attributes into tables and fields, is interpreted completely. On this basis, a multi-dimensional database supporting the management and analysis of dam safety monitoring data has been established, for which fact tables and dimension tables have been designed. Finally, based on service design and user interface design, the dam safety monitoring system has been developed with Delphi as the development tool. This development project shows that the multi-dimensional database can simplify the development process and minimize hidden dangers in the database structure design. It is superior to other dam safety monitoring system development models and can provide a new research direction for system developers.
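The fact-table/dimension-table split behind such a multi-dimensional design can be sketched in SQL. The schema below (an instrument dimension, a time dimension, and a reading fact table) is a hypothetical illustration, not the paper's actual design:

```python
import sqlite3

# Hypothetical star schema: one fact table of readings keyed to
# dimension tables for instruments and time.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_instrument (
    instrument_id INTEGER PRIMARY KEY,
    kind TEXT,               -- e.g. piezometer, plumb line
    location TEXT
);
CREATE TABLE dim_time (
    time_id INTEGER PRIMARY KEY,
    day TEXT,
    season TEXT
);
CREATE TABLE fact_reading (
    instrument_id INTEGER REFERENCES dim_instrument(instrument_id),
    time_id INTEGER REFERENCES dim_time(time_id),
    value REAL
);
""")
con.executemany("INSERT INTO dim_instrument VALUES (?,?,?)",
                [(1, "piezometer", "left abutment"), (2, "plumb line", "crest")])
con.executemany("INSERT INTO dim_time VALUES (?,?,?)",
                [(1, "2008-06-01", "summer"), (2, "2008-12-01", "winter")])
con.executemany("INSERT INTO fact_reading VALUES (?,?,?)",
                [(1, 1, 12.3), (1, 2, 14.1), (2, 1, 0.8)])

# a typical dimensional query: mean reading per instrument kind
rows = con.execute("""
    SELECT i.kind, AVG(f.value)
    FROM fact_reading f JOIN dim_instrument i USING (instrument_id)
    GROUP BY i.kind ORDER BY i.kind
""").fetchall()
```

Analysis queries then slice the single fact table along any dimension (instrument, location, season) without restructuring the monitoring data.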

  13. PEP-II RF System Operation and Performance

    International Nuclear Information System (INIS)

    McIntosh, P.

    2005-01-01

    The Low Energy Ring (LER) and High Energy Ring (HER) RF systems have operated now on PEP-II since July 1998 and have assisted in breaking all design luminosity records back in June 2001. Luminosity on PEP-II has steadily increased since then as a consequence of larger e+ and e- beam currents being accumulated. This has meant that the RF systems have inevitably been driven harder, not only to achieve these higher stored beam currents, but also to reliably keep the beams circulating whilst at the same time minimizing the number of aborts due to RF system faults. This paper details the current PEP-II RF system configurations for both rings, as well as future upgrade plans spanning the next 3-5 years. Limitations of the current RF system configurations are presented, highlighting improvement projects which will target specific areas within the RF systems to ensure that adequate operating overheads are maintained and reliable operation is assured.

  14. ARACHNID: A prototype object-oriented database tool for distributed systems

    Science.gov (United States)

    Younger, Herbert; Oreilly, John; Frogner, Bjorn

    1994-01-01

    This paper discusses the results of a Phase 2 SBIR project sponsored by NASA and performed by MIMD Systems, Inc. A major objective of this project was to develop specific concepts for improved performance in accessing large databases. An object-oriented and distributed approach was used for the general design, while a geographical decomposition was used as a specific solution. The resulting software framework is called ARACHNID. The Faint Source Catalog developed by NASA was the initial database testbed. This is a database of many gigabytes, for which an order-of-magnitude improvement in query speed is being sought. It contains faint infrared point sources obtained from telescope measurements of the sky. A geographical decomposition of this database is an attractive way of dividing it into pieces: each piece can then be searched on an individual processor, with only a weak data linkage required between processors. As a further demonstration of the concepts implemented in ARACHNID, a tourist information system is discussed. This version of ARACHNID is the commercial result of the project. It is a distributed, networked database application where speed, maintenance, and reliability are important considerations. This paper focuses on the design concepts and technologies that form the basis for ARACHNID.
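The geographical-decomposition idea can be sketched as a cell index: sources are bucketed by sky position so a box query scans only the overlapping cells, and each cell (or group of cells) can live on its own processor. The cell size, coordinates, and API below are assumptions for illustration, not the actual ARACHNID design:

```python
from collections import defaultdict

CELL = 10.0  # assumed cell size in degrees

def cell_of(ra, dec):
    """Map a sky position to its containing cell."""
    return (int(ra // CELL), int(dec // CELL))

class CellIndex:
    def __init__(self):
        self.cells = defaultdict(list)

    def add(self, ra, dec, source_id):
        self.cells[cell_of(ra, dec)].append((ra, dec, source_id))

    def box_query(self, ra0, ra1, dec0, dec1):
        """Scan only the cells that overlap the query box."""
        hits = []
        for cx in range(int(ra0 // CELL), int(ra1 // CELL) + 1):
            for cy in range(int(dec0 // CELL), int(dec1 // CELL) + 1):
                for ra, dec, sid in self.cells.get((cx, cy), []):
                    if ra0 <= ra <= ra1 and dec0 <= dec <= dec1:
                        hits.append(sid)
        return sorted(hits)

idx = CellIndex()
idx.add(12.0, 5.0, "A")
idx.add(13.5, 7.0, "B")
idx.add(200.0, -30.0, "C")  # far away: its cell is never scanned below
found = idx.box_query(10.0, 15.0, 0.0, 10.0)
```

In a distributed setting the cells map naturally onto processors, and only the cell-to-processor table needs to be shared, which is the "weak data linkage" the text mentions.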

  15. Quantum theory of the nonconservative system II

    International Nuclear Information System (INIS)

    Yeon, K.H.

    1984-01-01

    Utilizing the propagator for a damped harmonic oscillator in a nonconservative system, we derive the corresponding wave function, energy expectation value, transition amplitude and uncertainty relation. (Author)

  16. Rapid Automated Mission Planning System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is an automated UAS mission planning system that will rapidly identify emergency (contingency) landing sites, manage contingency routing, and...

  17. Experience of MAPS in monitoring of personnel movement with on-line database management system

    International Nuclear Information System (INIS)

    Rajendran, T.S.; Anand, S.D.

    1992-01-01

    As a part of the physical protection system, an access control system has been installed in Madras Atomic Power Station (MAPS) to monitor and regulate the movement of persons within MAPS. The present system in its original form was meant only for security monitoring. A PC-based database management system was added to it to computerize the availability of the work force for actual work. (author). 2 annexures

  18. Removal of Pb(II), Cu(II) and Cd(II) from aqueous solution by some fungi and natural adsorbents in single and multiple metal systems

    International Nuclear Information System (INIS)

    Shoaib, A.; Badar, T.; Aslam, N.

    2011-01-01

    Six fungal and ten natural biosorbents were analyzed for their Cu(II), Cd(II) and Pb(II) uptake capacity from single, binary and ternary metal ion systems. Preliminary biosorption screening assays revealed that two fungi (Aspergillus niger and Cunninghamella echinulata) and three natural adsorbents [Cicer arietinum husk, Moringa oleifera flower and soil (clay)] hold considerably high adsorption efficiency and capacity for the three metal ions. Further biosorption trials with the five selected adsorbents showed a considerable reduction in metal uptake capability in the binary and ternary systems as compared to the single metal system. Cd(II) manifested the highest inhibitory effect on the biosorption of the other metal ions, followed by Pb(II) and Cu(II). In terms of metal preference, the selectivity order towards the studied biomass matrices was Pb(II) (40-90%) > Cd(II) (2-53%) > Cu(II) (2-30%). (author)

  19. 17th East European Conference on Advances in Databases and Information Systems and Associated Satellite Events

    CERN Document Server

    Cerquitelli, Tania; Chiusano, Silvia; Guerrini, Giovanna; Kämpf, Mirko; Kemper, Alfons; Novikov, Boris; Palpanas, Themis; Pokorný, Jaroslav; Vakali, Athena

    2014-01-01

    This book reports on state-of-the-art research and applications in the field of databases and information systems. It includes both fourteen selected short contributions, presented at the East-European Conference on Advances in Databases and Information Systems (ADBIS 2013, September 1-4, Genova, Italy), and twenty-six papers from the ADBIS 2013 satellite events. The short contributions from the main conference are collected in the first part of the book, which covers a wide range of topics, such as data management, similarity search, spatio-temporal and social network data, data mining, data warehousing, and data management on novel architectures such as graphics processing units, parallel database management systems, and cloud and MapReduce environments. The contributions from the satellite events are organized in five further parts, according to their respective ADBIS satellite event: BiDaTA 2013 (Special Session on Big Data: New Trends and Applications); GID 2013 – The Second International Workshop ...

  20. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslides produce large quantities of sediment that negatively affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but are also of great value when processed further. This study defined basic data formats and standards for the various types of data collected about these reservoirs and then provided a management platform based on those formats and standards. Meanwhile, for practicality and convenience, the large-scale landslide disaster database system both provides and receives information, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may become outdated at any time; in order to provide long-term service, the system therefore reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology, which makes this large-scale landslide disaster database system easy to handle and to develop further.

  1. Massively Parallel Sort-Merge Joins in Main Memory Multi-Core Database Systems

    OpenAIRE

    Albutiu, Martina-Cezara; Kemper, Alfons; Neumann, Thomas

    2012-01-01

    Two emerging hardware trends will dominate the database system technology in the near future: increasing main memory capacities of several TB per server and massively parallel multi-core processing. Many algorithmic and control techniques in current database technology were devised for disk-based systems where I/O dominated the performance. In this work we take a new look at the well-known sort-merge join which, so far, has not been in the focus of research in scalable massively parallel mult...
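The core idea of a massively parallel sort-merge join can be sketched as range-partitioning both inputs so that each partition pair is joined independently; the loop over partition pairs below is what a parallel implementation would distribute across cores. This is a minimal sequential sketch of the technique, not the paper's actual (NUMA-aware) algorithm:

```python
import bisect

def merge_join(r, s):
    """Merge-join two lists of (key, payload) pairs that are sorted by key."""
    out, i, j = [], 0, 0
    while i < len(r) and j < len(s):
        if r[i][0] < s[j][0]:
            i += 1
        elif r[i][0] > s[j][0]:
            j += 1
        else:
            k, i0, j0 = r[i][0], i, j
            while i < len(r) and r[i][0] == k:
                i += 1
            while j < len(s) and s[j][0] == k:
                j += 1
            # emit the cross product of the two equal-key runs
            for _, a in r[i0:i]:
                for _, b in s[j0:j]:
                    out.append((k, a, b))
    return out

def range_partition(rows, bounds):
    """Split rows into len(bounds)+1 key ranges and sort each range locally."""
    parts = [[] for _ in range(len(bounds) + 1)]
    for key, payload in rows:
        parts[bisect.bisect_right(bounds, key)].append((key, payload))
    for p in parts:
        p.sort()
    return parts

R = [(3, "r3"), (1, "r1"), (7, "r7"), (5, "r5")]
S = [(5, "s5"), (2, "s2"), (7, "s7a"), (7, "s7b")]
bounds = [4]  # two key ranges (<=4 and >4), one per (virtual) worker
joined = []
for pr, ps in zip(range_partition(R, bounds), range_partition(S, bounds)):
    # each partition pair is independent, so a parallel version can hand
    # every iteration of this loop to a separate core
    joined.extend(merge_join(pr, ps))
```

Because matching keys always land in the same partition, no cross-partition communication is needed during the merge phase, which is what makes the approach attractive on multi-core, large-memory machines.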

  2. Design of remote weather monitor system based on embedded web database

    International Nuclear Information System (INIS)

    Gao Jiugang; Zhuang Along

    2010-01-01

    The remote weather monitoring system is designed with embedded Web database technology, employing the S3C2410 microprocessor as its core. The monitoring system can simultaneously monitor multi-channel sensor signals and can display various types of meteorological information on dynamic Web pages on a remote computer. An elaborated introduction is given to the construction and application of the Web database under embedded Linux. Test results show that the client accesses the Web page via GPRS or the Internet, acquires the data, and displays the values of the various types of meteorological information in an intuitive graphical way. (authors)

  3. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program that performs obstetric calculations using data from ultrasonography was developed for the personal computer in the Clipper language. It was designed for fast assessment of fetal development and for prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance. It proved very useful in patient management with its convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women.
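As a purely illustrative sketch of the kind of calculation such a program performs, the code below fits a growth curve to synthetic biparietal-diameter/gestational-age pairs and evaluates it at a new measurement. All numbers are invented for the example; a real program would use clinically validated regression equations, which are not reproduced here.

```python
import numpy as np

# synthetic (biparietal diameter in mm, gestational age in weeks) pairs
bpd_mm = np.array([20.0, 40.0, 60.0, 80.0, 95.0])
ga_weeks = np.array([12.5, 17.5, 24.0, 31.5, 38.0])

# fit a quadratic growth curve by least squares
coef = np.polyfit(bpd_mm, ga_weeks, 2)

def estimate_ga(bpd):
    """Predicted gestational age (weeks) for a measured BPD (mm)."""
    return float(np.polyval(coef, bpd))

est = estimate_ga(70.0)  # a new ultrasonographic measurement
```

The same fit-then-evaluate pattern covers the other biometric measurements (femur length, abdominal diameter, etc.), each with its own curve.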

  4. Electronic construction collaboration system -- phase II.

    Science.gov (United States)

    2010-06-01

    During the first year of research, work was completed to identify Iowa DOT needs for web-based project management system (WPMS) : and evaluate how commercially available solutions could meet these needs. Researchers also worked to pilot test custom d...

  5. Mars Aqueous Processing System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The Mars Aqueous Processing System (MAPS) is a novel technology for recovering oxygen, iron, and other constituents from lunar and Mars soils. The closed-loop...

  6. Wearable Health Monitoring Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposal is to demonstrate the feasibility of producing a wearable health monitoring system for the human body that is functional, comfortable,...

  7. Tactile Data Entry System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Building on our successful Phase I Tactile Data Entry program, Barron Associates proposes development of a Glove-Enabled Computer Operations (GECO) system to permit...

  8. Advanced Green Micropropulsion System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Systima in collaboration with University of Washington is developing a high performance injection system for advanced green monopropellant AF-M315E micropropulsion...

  9. Enhanced Brine Dewatering System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the Enhanced Brine Dewatering System (EBDS) is to provide a scalable means of completely recovering usable water from byproducts created by reverse...

  10. Preliminary study for unified management of CANDU safety codes and construction of database system

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

    A Graphical User Interface (GUI) for the unified management of the CANDU safety codes needs to be developed, and a database system for the validation of the safety codes needs to be constructed; the preliminary study for both is done in the first stage of the present work. The input and output structures and data flow of CATHENA and PRESCON2 are investigated, and the interaction of variables between CATHENA and PRESCON2 is identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their calculation results with those from an HP workstation or from the FSAR (Final Safety Analysis Report). A preliminary study on the GUI for the safety codes in the unified management system is done, and a sample GUI program is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments executed inside and outside the country are collected and classified following the structure of the database system, of which two types are considered for the final web-based database system. The preliminary GUI programming for the database system is demonstrated and will be updated in future work.

  11. HYLIFE-II tritium management system

    International Nuclear Information System (INIS)

    Longhurst, G.R.; Dolan, T.J.

    1993-06-01

    The tritium management system performs seven functions: (1) tritium gas removal from the blast chamber, (2) tritium removal from the Flibe, (3) tritium removal from helium sweep gas, (4) tritium removal from room air, (5) hydrogen isotope separation, (6) release of non-hazardous gases through the stack, and (7) fixation and disposal of hazardous effluents. About 2 TBq/s (5 MCi/day) of tritium is bred in the Flibe (Li2BeF4) molten salt coolant by neutron absorption. Tritium removal is accomplished by a two-stage vacuum disengager in each of three steam generator loops. Each stage consists of a spray of 0.4 mm diameter, hot Flibe droplets into a vacuum chamber 4 m in diameter and 7 m tall. As the droplets fall downward into the vacuum, most of the tritium diffuses out and is pumped away. A fraction Φ ∼ 10^-5 of the tritium remains in the Flibe as it leaves the second stage of the vacuum disengager, and about 24% of the remaining tritium penetrates through the steam generator tubes per pass, so the net leakage into the steam system is about 4.7 MBq/s (11 Ci/day). The required Flibe pumping power for the vacuum disengager system is 6.6 MW. With Flibe primary coolant and a vacuum disengager, an intermediate coolant loop is not needed to prevent tritium from leaking into the steam system. An experiment is needed to demonstrate vacuum disengager operation with Flibe. A secondary containment shell with helium sweep gas captures the tritium permeating out of the Flibe ducts, limiting leaks there to about 1 Ci/day. The tritium inventory in the reactor is about 190 g, residing mostly in the large Flibe recirculation duct walls. The total cost of the tritium management system is 92 M$, of which the vacuum disengagers account for about 56%, the blast chamber vacuum system about 15%, the cryogenic plant about 9%, the emergency air cleanup and waste treatment systems each about 6%, the protium removal system about 3%, and the fuel storage system and inert gas system each about 2%.
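The quoted leakage figure can be cross-checked from the numbers above: of the roughly 5 MCi/day bred in the Flibe, a fraction of about 10^-5 survives the two-stage vacuum disengager, and about 24% of that penetrates the steam generator tubes per pass.

```python
# cross-check of the steam-system leakage quoted in the abstract
bred_ci_per_day = 5.0e6   # ~5 MCi/day of tritium bred in the Flibe
survives = 1.0e-5         # fraction remaining after the vacuum disengager
penetrates = 0.24         # fraction crossing the steam generator tubes per pass
leak_ci_per_day = bred_ci_per_day * survives * penetrates
# 5e6 * 1e-5 * 0.24 = 12 Ci/day, consistent with the quoted ~11 Ci/day
```

(11 Ci/day in turn corresponds to about 11 × 3.7e10 Bq / 86400 s ≈ 4.7 MBq/s, matching the other unit given.)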

  12. Detection of criticality accidents. The Intertechnique EDAC II system

    International Nuclear Information System (INIS)

    Prigent, R.

    1991-01-01

    The chief aim of the new-generation EDAC II criticality accident detection system is to reduce the risks associated with the handling of fissile material by providing a swift and safe warning of the development of any criticality accident. To this function, already performed by the EDAC system of the previous generation, the EDAC II adds the possibility of storing the characteristics of the accident in memory, providing a daily follow-up of notable events in the system through the print-out of a log book, and assisting the operators during the periodical tests. (Author)

  13. Development of FBR cycle data base system (II)

    International Nuclear Information System (INIS)

    Kubota, Sadae; Ohtaki, Akira; Hirao, Kazuhiro

    2003-05-01

    In the 'Feasibility Study on Commercialized FBR Cycle Systems (F/S)', scenario evaluations, cost-benefit evaluations and system characteristic evaluations that concretely show the significance of introducing the FBR cycle system are performed, while design studies for FBR plants, reprocessing systems and fabrication systems are conducted. In these evaluations, future societies under various conditions and situations are assumed, and the needs and social effects of the FBR cycle are investigated and analyzed. In this study, promising FBR cycle concepts are suggested by taking into consideration, in addition to system design information, information such as domestic and foreign policies and bills, economic predictions, supply and demand predictions for resources, and technology development projects. Development of the FBR Cycle Database introduced in this report started in fiscal year 1999, to enable unified management and retrieval of the reference information used for the above scenario, cost-benefit and system characteristic evaluations. In fiscal year 2000, a prototype was made and used tentatively, and the problems in its operation and functions were extracted; in fiscal year 2001, entry and search systems using Web pages were made in order to solve the problems of the prototype, and use within our group was started. Moreover, in fiscal year 2002, we expanded and improved the search system, promoted the efficiency of management work, and started use of the database throughout JNC via the intranet. As a result of entering about 350 data items in fiscal year 2002, the database had reached about 7,250 entries by the end of March 2003. We will continue to enter information related to the various evaluations in F/S phase 2, and will examine improving the convenience of the search system and cooperation with the economy database. (author)

  14. Development of a database system for operational use in the selection of titanium alloys

    Science.gov (United States)

    Han, Yuan-Fei; Zeng, Wei-Dong; Sun, Yu; Zhao, Yong-Qing

    2011-08-01

    The selection of titanium alloys has become a complex decision-making task due to the growing number of titanium alloys being created and utilized, each having its own characteristics, advantages, and limitations. In choosing the most appropriate titanium alloy, it is essential to offer a reasonable and intelligent service to technical engineers. One possible solution to this problem is to develop a database system (DS) that helps retrieve rational proposals from different databases and information sources and analyzes them to provide useful and explicit information. For this purpose, a design strategy based on the fuzzy set theory is proposed, and a distributed database system is developed. Through ranking of the candidate titanium alloys, the most suitable material is determined. The selection results are found to be in good agreement with the practical situation.
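A minimal sketch of fuzzy-set-based ranking is shown below. The criteria, weights, alloy data, and the simple min-max membership function are all assumptions for illustration; the paper's actual fuzzy model is more involved.

```python
import numpy as np

alloys = ["Ti-6Al-4V", "Ti-5553", "Ti-1023"]  # hypothetical candidates
# columns: strength (MPa, higher better), density (g/cm^3, lower better),
#          cost index (lower better) -- invented numbers
data = np.array([[950.0, 4.43, 3.0],
                 [1236.0, 4.65, 6.0],
                 [1105.0, 4.65, 5.0]])
benefit = np.array([True, False, False])      # maximize vs minimize
weights = np.array([0.6, 0.2, 0.2])           # assumed criterion weights

# min-max normalization gives a fuzzy membership degree in [0, 1]
lo, hi = data.min(axis=0), data.max(axis=0)
norm = (data - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]   # invert the cost criteria

scores = norm @ weights                       # weighted aggregate membership
best = alloys[int(scores.argmax())]
```

Under these invented weights the strength criterion dominates, so the highest-strength candidate ranks first; changing the weights reorders the list, which is exactly the kind of trade-off exploration the database system is meant to support.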

  15. Implementation of dragon-I database system based on B/S model

    International Nuclear Information System (INIS)

    Jiang Wei; Lai Qinggui; Chen Nan; Gao Feng

    2010-01-01

    B/S architecture is utilized in the database system of 'Dragon-I'. The dynamic web software is designed with ASP.NET technology and is divided into three main tiers: a user interface tier, a business logic tier and a data access tier. The accelerator status data and the data generated in experiment processes are managed with the SQL Server DBMS, and the database is accessed with ADO.NET technology. The status of the facility, control parameters and test waveforms can be queried by experiment number and experiment time. The demands of storage, management, browsing, querying and offline analysis are all met by this database system based on the B/S architecture. (authors)

  16. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
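The two-tier result structure might look like the following server-side sketch, where the first tier is the control-system category and the second tier the matching records. The category names echo the viewers listed above; the record names and the function itself are hypothetical, not actual IRMIS content or API.

```python
# hypothetical backing data: category -> record names
RECORDS = {
    "Process Variables": ["S35DCCT:currentCC", "S35DCCT:lifetime"],
    "IOC": ["iocbpm1", "iocps2"],
    "Cables": ["CBL-1201", "CBL-1300"],
}

def global_search(term):
    """Case-insensitive substring search across all categories.

    Returns a two-tier structure: tier one is the category,
    tier two the list of matching records within it.
    """
    term = term.lower()
    tiers = {}
    for category, names in RECORDS.items():
        hits = [n for n in names if term in n.lower()]
        if hits:
            tiers[category] = hits
    return tiers

result = global_search("dcct")
```

A client-side AJAX request would call such an endpoint asynchronously and render tier one as collapsible category headings, expanding to the tier-two record lists on demand.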

  17. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    International Nuclear Information System (INIS)

    Waters, Michael; Jackson, Marcus

    2008-01-01

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  18. A Systematic Review of Coding Systems Used in Pharmacoepidemiology and Database Research.

    Science.gov (United States)

    Chen, Yong; Zivkovic, Marko; Wang, Tongtong; Su, Su; Lee, Jianyi; Bortnichak, Edward A

    2018-02-01

    Clinical coding systems have been developed to translate real-world healthcare information such as prescriptions, diagnoses and procedures into standardized codes appropriate for use in large healthcare datasets. Due to the lack of information on coding system characteristics and insufficient uniformity in coding practices, there is a growing need for better understanding of coding systems and their use in pharmacoepidemiology and observational real world data research. To determine: 1) the number of available coding systems and their characteristics, 2) which pharmacoepidemiology databases are they adopted in, 3) what outcomes and exposures can be identified from each coding system, and 4) how robust they are with respect to consistency and validity in pharmacoepidemiology and observational database studies. Electronic literature database and unpublished literature searches, as well as hand searching of relevant journals were conducted to identify eligible articles discussing characteristics and applications of coding systems in use and published in the English language between 1986 and 2016. Characteristics considered included type of information captured by codes, clinical setting(s) of use, adoption by a pharmacoepidemiology database, region, and available mappings. Applications articles describing the use and validity of specific codes, code lists, or algorithms were also included. Data extraction was performed independently by two reviewers and a narrative synthesis was performed. A total of 897 unique articles and 57 coding systems were identified, 17% of which included country-specific modifications or multiple versions. Procedures (55%), diagnoses (36%), drugs (38%), and site of disease (39%) were most commonly and directly captured by these coding systems. The systems were used to capture information from the following clinical settings: inpatient (63%), ambulatory (55%), emergency department (ED, 34%), and pharmacy (13%). More than half of all coding

  19. NVST Data Archiving System Based On FastBit NoSQL Database

    Science.gov (United States)

    Liu, Ying-bo; Wang, Feng; Ji, Kai-fan; Deng, Hui; Dai, Wei; Liang, Bo

    2014-06-01

    The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high-resolution imaging and spectral observations, including measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012 and produces up to 120 thousand observational records per day. Given this large number of files, the effective archiving and retrieval of files becomes a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the FastBit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (i.e., MySQL; My Structured Query Language), the FastBit database shows distinct advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the FastBit database is about 15 times faster and fully meets the requirements of the NVST. Our study offers a new approach to massive astronomical data archiving and could contribute to the design of data management systems for other astronomical telescopes.
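FastBit's query speed comes from bitmap indexes: each distinct value of a field maps to a bitmap over the record positions, and a multi-field combined query is answered by bitwise-ANDing the per-field bitmaps. The following toy Python sketch illustrates only the idea, not the FastBit API; the field names are invented and are not the NVST schema.

```python
# Toy bitmap-index sketch of the FastBit-style multi-field combined query.
# Field names ("instrument", "quality") are illustrative assumptions.

def build_bitmap_index(records, field):
    """Map each distinct value of `field` to a bitmask over record positions."""
    index = {}
    for i, rec in enumerate(records):
        index[rec[field]] = index.get(rec[field], 0) | (1 << i)
    return index

def query(indexes, **criteria):
    """AND the per-field bitmaps together, return matching record positions."""
    mask = -1  # all bits set
    for field, value in criteria.items():
        mask &= indexes[field].get(value, 0)
    return [i for i in range(mask.bit_length()) if (mask >> i) & 1]

records = [
    {"instrument": "TiO", "quality": "good"},
    {"instrument": "Ha",  "quality": "good"},
    {"instrument": "TiO", "quality": "bad"},
]
indexes = {f: build_bitmap_index(records, f) for f in ("instrument", "quality")}
print(query(indexes, instrument="TiO", quality="good"))  # [0]
```

Real bitmap-index engines compress the bitmaps (FastBit uses word-aligned hybrid compression), which keeps the AND operation fast even at tens of millions of records.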

  20. Complement System Part II: Role in Immunity

    Science.gov (United States)

    Merle, Nicolas S.; Noe, Remi; Halbwachs-Mecarelli, Lise; Fremeaux-Bacchi, Veronique; Roumenina, Lubka T.

    2015-01-01

    The complement system has been considered for a long time as a simple lytic cascade, aimed at killing bacteria infecting the host organism. Nowadays, this vision has changed and it is well accepted that complement is a complex innate immune surveillance system, playing a key role in host homeostasis, inflammation, and in the defense against pathogens. This review discusses recent advances in the understanding of the role of complement in physiology and pathology. It starts with a description of complement contribution to the normal physiology (homeostasis) of a healthy organism, including the silent clearance of apoptotic cells and maintenance of cell survival. In pathology, complement can be a friend or a foe. It acts as a friend in the defense against pathogens, by inducing opsonization and a direct killing by the C5b–9 membrane attack complex and by triggering inflammatory responses with the anaphylatoxins C3a and C5a. Opsonization also plays a major role in the mounting of an adaptive immune response, involving antigen-presenting cells and T- and B-lymphocytes. Nevertheless, it can also be an enemy, when pathogens hijack complement regulators to protect themselves from the immune system. Inadequate complement activation becomes a cause of disease, as in atypical hemolytic uremic syndrome, C3 glomerulopathies, and systemic lupus erythematosus. Age-related macular degeneration and cancer will be described as examples showing that complement contributes to a large variety of conditions, far exceeding the classical examples of diseases associated with complement deficiencies. Finally, we discuss complement as a therapeutic target. PMID:26074922

  1. A New Clinical HIFU System (Teleson II)

    Science.gov (United States)

    Ma, Yixin; Symonds-Tayler, Richard; Rivens, Ian H.; ter Haar, Gail R.

    2007-05-01

    Previous clinical trials with our first prototype HIFU system (Teleson I) for the treatment of liver tumors demonstrated that a major challenge is the treatment of tumors located behind the ribs. We have designed a new multi-element transducer for rib sparing. Initial simulation and experimental results (using a single-channel power amplifier) are very encouraging. A new clinical HIFU system which can drive the multi-element transducer and control each channel independently is being designed and constructed. This second version of a clinical prototype HIFU system consists of a 3D motorised gantry, a multi-channel signal generator, a multi-channel power amplifier, a user interface PC, an embedded controller, auxiliary circuits for real-time interleaving/synchronization control and a to-be-implemented safety monitoring and data logging unit. For multi-element transducers, each element can be individually switched on and off for rib sparing, and phase- and amplitude-modulated for potential phased-array applications. The multi-channel power amplifier can be switched on/off very rapidly at required intervals to interleave with ultrasound B-scan imaging for HIFU monitoring or radiation force elastography imaging via a dedicated interleaving/timing module. The gantry movement can also be synchronised with power amplifier on/off switching and phase/amplitude updating for lesion generation under a wide variety of conditions, including single lesions, lesion arrays and lesion "tracks" created whilst translating the active transducer. Results from testing the system using excised tissue will be presented.

  2. Overview of the TJ-II remote participation system

    International Nuclear Information System (INIS)

    Vega, J.; Sanchez, E.; Portas, A.; Pereira, A.; Mollinedo, A.; Munoz, J.A.; Ruiz, M.; Barrera, E.; Lopez, S.; Machon, D.; Castro, R.; Lopez, D.

    2006-01-01

    The TJ-II remote participation system (RPS) is focused on providing remote access to elements that depend exclusively on characteristics of the TJ-II environment: data acquisition, diagnostics control systems and TJ-II operation tracking. Four key points were taken into account prior to starting the software design: access security, software execution platforms, software maintenance and distribution, and delivery of operation events. The first, access security, was addressed by means of a distributed authentication and authorization system, PAPI. Regarding the other points, the development was based on the use of web servers (due to their standard character, flexibility and scalability) and Java technologies (due to their open nature, security properties and technological maturity). Software deployment was prepared to make use of the Java Network Launching Protocol (JNLP). On-line message distribution was planned according to a message-oriented middleware. At present, the TJ-II RPS manages over 1000 digitization channels and 20 diagnostic control systems. The TJ-II RPS architecture is flexible, scalable and powerful enough to be applied to distributed environments and, in particular, it could be used in the ITER environment.

  3. Overview of the TJ-II remote participation system

    Energy Technology Data Exchange (ETDEWEB)

    Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense, 22, 28040 Madrid (Spain)]. E-mail: jesus.vega@ciemat.es; Sanchez, E. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense, 22, 28040 Madrid (Spain); Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense, 22, 28040 Madrid (Spain); Pereira, A. [Asociacion EURATOM/CIEMAT para Fusion, Avda. Complutense, 22, 28040 Madrid (Spain); Mollinedo, A. [Computer Centre, CIEMAT, Avda. Complutense, 22, 28040 Madrid (Spain); Munoz, J.A. [Computer Centre, CIEMAT, Avda. Complutense, 22, 28040 Madrid (Spain); Ruiz, M. [Dpto. De Sistemas Electronicos y de Control, UPM, Campus Sur, Ctra. Valencia km 7, 28031 Madrid (Spain); Barrera, E. [Dpto. De Sistemas Electronicos y de Control, UPM, Campus Sur, Ctra. Valencia km 7, 28031 Madrid (Spain); Lopez, S. [Dpto. De Sistemas Electronicos y de Control, UPM, Campus Sur, Ctra. Valencia km 7, 28031 Madrid (Spain); Machon, D. [Dpto. De Sistemas Electronicos y de Control, UPM, Campus Sur, Ctra. Valencia km 7, 28031 Madrid (Spain); Castro, R. [Red.es-RedIRIS, Edificio Bronce, Plaza Manuel Gomez Moreno s/n, 28020 Madrid (Spain); Lopez, D. [Red.es-RedIRIS, Edificio Bronce, Plaza Manuel Gomez Moreno s/n, 28020 Madrid (Spain)

    2006-07-15

    The TJ-II remote participation system (RPS) is focused on providing remote access to elements that depend exclusively on characteristics of the TJ-II environment: data acquisition, diagnostics control systems and TJ-II operation tracking. Four key points were taken into account prior to starting the software design: access security, software execution platforms, software maintenance and distribution and delivery of operation events. The first, access security, was addressed by means of a distributed authentication and authorization system, PAPI. Regarding the other points, the development was based on the use of web servers (due to their standard character, flexibility and scalability) and Java technologies (due to their open nature, security properties and technological maturity). Software deployment was prepared to make use of the Java Network Launching Protocol (JNLP). On-line message distribution was planned according to a message oriented middleware. At present, the TJ-II RPS manages over 1000 digitization channels and 20 diagnostic control systems. The TJ-II RPS architecture is flexible, scalable and powerful enough to be applied to distributed environments and, in particular, it could be used in the ITER environment.

  4. MRI findings in central nervous system of neurofibromatosis-II

    International Nuclear Information System (INIS)

    Chen Maoen; Huang Suiqiao; Shen Jun; Hong Guobin; Wu Zhuo; Lin Xiaofeng

    2007-01-01

    Objective: To investigate the diagnostic value of MR imaging in central nervous system involvement of neurofibromatosis II. Methods: Seven patients with surgically and pathologically proven neurofibromatosis II were included. Their MR imaging findings and clinical features were retrospectively analyzed. Results: The main findings of the 7 cases of neurofibromatosis II on MR imaging included bilateral acoustic neurilemoma, multiple neurofibroma, meningioma and schwannoma. Among the 7 patients, T1-weighted imaging after contrast enhancement displayed additional lesions which had been missed on un-enhanced scans. Conclusion: MR imaging has advantages in the detection of central nervous system involvement of neurofibromatosis II with regard to its ability to show the lesions well, while displaying their size, morphology and signal features clearly. (authors)

  5. Development of radiation oncology learning system combined with multi-institutional radiotherapy database (ROGAD)

    Energy Technology Data Exchange (ETDEWEB)

    Takemura, Akihiro; Iinuma, Masahiro; Kou, Hiroko [Kanazawa Univ. (Japan). School of Medicine; Harauchi, Hajime; Inamura, Kiyonari

    1999-09-01

    We have constructed and have been operating the multi-institutional radiotherapy database ROGAD (Radiation Oncology Greater Area Database) since 1992. One of its purposes is 'to optimize individual radiotherapy plans'. We developed the 'Radiation oncology learning system combined with ROGAD', which conforms to that purpose. Several medical doctors evaluated our system. According to those evaluations, we are now confident that our system is able to contribute to the improvement of radiotherapy results. Our final target is to generate a good cyclic relationship among three components: radiotherapy results obtained with the 'Radiation oncology learning system combined with ROGAD'; the growth of ROGAD; and the radiation oncology learning system. (author)

  6. A computer network system for mutual usage four databases of nuclear materials (Data-Free-Way)

    International Nuclear Information System (INIS)

    Fujita, M.; Kurihara, Y.; Shindou, M.; Yokoyama, N.; Tachi, Y.; Kano, S.; Iwata, S.

    1996-01-01

    A distributed database system named 'Data-Free-Way' for advanced nuclear materials has been developed by the National Research Institute for Metals (NRIM), the Japan Atomic Energy Research Institute (JAERI) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) under a cooperation agreement between these three organizations. In the paper, features and functions of the system, including input data, are described together with the method of sharing the database among the three organizations, as well as examples of easily accessible searches of material properties. Results of the analysis of tensile and creep property data on type 316 stainless steel, collected by the different organizations and stored in the present system, are also introduced as an example of attractive utilization of the system. Moreover, with a view to the system in the near future, some trials of WWW servers at several sites in 'Data-Free-Way' to supply information on nuclear materials to the Internet are introduced. (author)

  7. Design and Implementation of an Embedded NIOS II System for JPEG2000 Tier II Encoding

    Directory of Open Access Journals (Sweden)

    John M. McNichols

    2013-01-01

    Full Text Available This paper presents a novel implementation of the JPEG2000 standard as a system on a chip (SoC). While most of the research in this field centers on acceleration of the EBCOT Tier I encoder, this work focuses on an embedded solution for EBCOT Tier II. Specifically, this paper proposes using an embedded softcore processor to perform Tier II processing as the back end of an encoding pipeline. The Altera NIOS II processor is chosen for the implementation and is coupled with existing embedded processing modules to realize a fully embedded JPEG2000 encoder. The design is synthesized on a Stratix IV FPGA and is shown to outperform other comparable SoC implementations by 39% in computation time.

  8. The Establishment of the SAR images database System Based on Oracle and ArcSDE

    International Nuclear Information System (INIS)

    Zhou, Jijin; Li, Zhen; Chen, Quan; Tian, Bangsen

    2014-01-01

    Synthetic aperture radar (SAR) is a kind of microwave imaging system with the advantages of multi-band, multi-polarization and multi-angle observation. At present, there is no SAR image database system based on typical features. To solve problems in interpretation and identification, a new SAR image database system of typical features is an urgent need of current development. In this article, a SAR image database system based on Oracle and ArcSDE was constructed. The main works involved are as follows: (1) The SAR image data were calibrated and corrected radiometrically and geometrically. Besides, the fully polarimetric images were processed into the coherency matrix [T] to preserve the polarimetric information. (2) After analyzing multiple spaceborne SAR images, the metadata table was defined as: IMAGEID; name of features; latitude and longitude; sensor name; range and azimuth resolution, etc. (3) Through a comparison between GeoRaster and ArcSDE, results showed that ArcSDE is the more appropriate technology for storing images in a central database. The system stores and manages multisource SAR image data well, reflects scattering, geometry, polarization, band and angle characteristics, and combines analysis of the managed objects and service objects of the database while focusing on constructing the SAR image system in the aspects of data browsing and data retrieval. According to the analysis of characteristics of SAR images such as scattering, polarization, incidence angle and wave band information, different weights can be given to these characteristics. An interpretation tool is then formed to provide an efficient platform for interpretation.
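The metadata table in step (2) can be sketched in a few lines of SQL. The following minimal Python sketch uses SQLite in place of Oracle/ArcSDE; the column names are adapted from the abstract, and the extra characteristic columns (polarization, incidence angle, band) are assumptions drawn from the closing discussion.

```python
# Illustrative sketch of the SAR metadata table; SQLite stands in for
# Oracle/ArcSDE, and column names are assumptions based on the abstract.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sar_metadata (
        image_id           INTEGER PRIMARY KEY,
        feature_name       TEXT,    -- typical feature covered by the scene
        latitude           REAL,
        longitude          REAL,
        sensor_name        TEXT,
        range_resolution   REAL,    -- metres
        azimuth_resolution REAL,    -- metres
        polarization       TEXT,    -- e.g. HH, VV, full-pol
        incidence_angle    REAL,    -- degrees
        band               TEXT     -- e.g. L, C, X
    )
""")
conn.execute(
    "INSERT INTO sar_metadata VALUES "
    "(1, 'airport', 39.9, 116.4, 'Radarsat-2', 3.0, 3.0, 'HH', 35.0, 'C')"
)
rows = conn.execute(
    "SELECT feature_name, sensor_name FROM sar_metadata WHERE band = 'C'"
).fetchall()
print(rows)  # [('airport', 'Radarsat-2')]
```

In the production system the raster payload itself would live in ArcSDE-managed storage, with this relational table carrying only the searchable characteristics.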

  9. A database system for the management of severe accident risk information, SARD

    International Nuclear Information System (INIS)

    Ahn, K. I.; Kim, D. H.

    2003-01-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at the Korea Atomic Energy Research Institute for automatic management and search of severe accident risk information. The main functions of the present database system are implemented by three closely related but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from representative severe accident codes (e.g., base case and sensitivity analysis results, and summaries of key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used to support the Level 2 PSA of similar plants and to develop plant-specific severe accident management strategies.

  10. Database System Design and Implementation for Marine Air-Traffic-Controller Training

    Science.gov (United States)

    2017-06-01

    units used larger applications such as Microsoft Access or MySQL. These systems have outdated platforms, and individuals currently maintaining these... Oracle Database 12c was version 12.2.0.20.96, IDE version 12.2.1.0.42.151001.0541. SQL Developer was version 4.1.3.20.96, which used Java platform

  11. Forest Vegetation Simulator translocation techniques with the Bureau of Land Management's Forest Vegetation Information system database

    Science.gov (United States)

    Timothy A. Bottomley

    2008-01-01

    The BLM uses a database, called the Forest Vegetation Information System (FORVIS), to store, retrieve, and analyze forest resource information on a majority of their forested lands. FORVIS also has the capability of easily transferring appropriate data electronically into Forest Vegetation Simulator (FVS) for simulation runs. Only minor additional data inputs or...

  12. MAPS: The Organization of a Spatial Database System Using Imagery, Terrain, and Map Data

    Science.gov (United States)

    1983-06-01

    segments which share the same pixel position. Finally, in any large system, a logical partitioning of the database must be performed in order to avoid...

  13. Design and implementation of Web-based SDUV-FEL engineering database system

    International Nuclear Information System (INIS)

    Sun Xiaoying; Shen Liren; Dai Zhimin; Xie Dong

    2006-01-01

    A design of the Web-based SDUV-FEL engineering database and its implementation are introduced. This system saves and serves static and archived SDUV-FEL data, and builds a proper and effective platform for sharing SDUV-FEL data. It offers usable and reliable SDUV-FEL data to operators and scientists. (authors)

  14. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    Science.gov (United States)

    Lawrence N. Hudson; Joseph Wunderle M.; And Others

    2016-01-01

    The PREDICTS project—Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)—has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to...

  15. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    NARCIS (Netherlands)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Alhusseini, Tamera I; Bedford, Felicity E; Bennett, Dominic J; Booth, Hollie; Burton, Victoria J; Chng, Charlotte W T; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Emerson, Susan R; Gao, Di; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; Pask-Hale, Gwilym D; Pynegar, Edwin L; Robinson, Alexandra N; Sanchez-Ortiz, Katia; Senior, Rebecca A; Simmons, Benno I; White, Hannah J; Zhang, Hanbin; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Albertos, Belén; Alcala, E L; Del Mar Alguacil, Maria; Alignier, Audrey; Ancrenaz, Marc; Andersen, Alan N; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Arroyo-Rodríguez, Víctor; Aumann, Tom; Axmacher, Jan C; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Bakayoko, Adama; Báldi, András; Banks, John E; Baral, Sharad K; Barlow, Jos; Barratt, Barbara I P; Barrico, Lurdes; Bartolommei, Paola; Barton, Diane M; Basset, Yves; Batáry, Péter; Bates, Adam J; Baur, Bruno; Bayne, Erin M; Beja, Pedro; Benedick, Suzan; Berg, Åke; Bernard, Henry; Berry, Nicholas J; Bhatt, Dinesh; Bicknell, Jake E; Bihn, Jochen H; Blake, Robin J; Bobo, Kadiri S; Bóçon, Roberto; Boekhout, Teun; Böhning-Gaese, Katrin; Bonham, Kevin J; Borges, Paulo A V; Borges, Sérgio H; Boutin, Céline; Bouyer, Jérémy; Bragagnolo, Cibele; Brandt, Jodi S; Brearley, Francis Q; Brito, Isabel; Bros, Vicenç; Brunet, Jörg; Buczkowski, Grzegorz; Buddle, Christopher M; Bugter, Rob; Buscardo, Erika; Buse, Jörn; Cabra-García, Jimmy; Cáceres, Nilton C; Cagle, Nicolette L; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Caparrós, Rut; Cardoso, Pedro; Carpenter, Dan; Carrijo, Tiago F; Carvalho, Anelena L; Cassano, Camila R; Castro, Helena; Castro-Luna, Alejandro A; Rolando, Cerda B; Cerezo, 
Alexis; Chapman, Kim Alan; Chauvat, Matthieu; Christensen, Morten; Clarke, Francis M; Cleary, Daniel F R; Colombo, Giorgio; Connop, Stuart P; Craig, Michael D; Cruz-López, Leopoldo; Cunningham, Saul A; D'Aniello, Biagio; D'Cruze, Neil; da Silva, Pedro Giovâni; Dallimer, Martin; Danquah, Emmanuel; Darvill, Ben; Dauber, Jens; Davis, Adrian L V; Dawson, Jeff; de Sassi, Claudio; de Thoisy, Benoit; Deheuvels, Olivier; Dejean, Alain; Devineau, Jean-Louis; Diekötter, Tim; Dolia, Jignasu V; Domínguez, Erwin; Dominguez-Haydar, Yamileth; Dorn, Silvia; Draper, Isabel; Dreber, Niels; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Eggleton, Paul; Eigenbrod, Felix; Elek, Zoltán; Entling, Martin H; Esler, Karen J; de Lima, Ricardo F; Faruk, Aisyah; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Fensham, Roderick J; Fernandez, Ignacio C; Ferreira, Catarina C; Ficetola, Gentile F; Fiera, Cristina; Filgueiras, Bruno K C; Fırıncıoğlu, Hüseyin K; Flaspohler, David; Floren, Andreas; Fonte, Steven J; Fournier, Anne; Fowler, Robert E; Franzén, Markus; Fraser, Lauchlan H; Fredriksson, Gabriella M; Freire, Geraldo B; Frizzo, Tiago L M; Fukuda, Daisuke; Furlani, Dario; Gaigher, René; Ganzhorn, Jörg U; García, Karla P; Garcia-R, Juan C; Garden, Jenni G; Garilleti, Ricardo; Ge, Bao-Ming; Gendreau-Berthiaume, Benoit; Gerard, Philippa J; Gheler-Costa, Carla; Gilbert, Benjamin; Giordani, Paolo; Giordano, Simonetta; Golodets, Carly; Gomes, Laurens G L; Gould, Rachelle K; Goulson, Dave; Gove, Aaron D; Granjon, Laurent; Grass, Ingo; Gray, Claudia L; Grogan, James; Gu, Weibin; Guardiola, Moisès; Gunawardene, Nihara R; Gutierrez, Alvaro G; Gutiérrez-Lamus, Doris L; Haarmeyer, Daniela H; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hassan, Shombe N; Hatfield, Richard G; Hawes, Joseph E; Hayward, Matt W; Hébert, Christian; Helden, Alvin J; Henden, John-André; Henschel, Philipp; Hernández, Lionel; Herrera, James P; Herrmann, Farina; Herzog, Felix; Higuera-Diaz, 
Diego; Hilje, Branko; Höfer, Hubert; Hoffmann, Anke; Horgan, Finbarr G; Hornung, Elisabeth; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishida, Hiroaki; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Hernández, F Jiménez; Johnson, McKenzie F; Jolli, Virat; Jonsell, Mats; Juliani, S Nur; Jung, Thomas S; Kapoor, Vena; Kappes, Heike; Kati, Vassiliki; Katovai, Eric; Kellner, Klaus; Kessler, Michael; Kirby, Kathryn R; Kittle, Andrew M; Knight, Mairi E; Knop, Eva; Kohler, Florian; Koivula, Matti; Kolb, Annette; Kone, Mouhamadou; Kőrösi, Ádám; Krauss, Jochen; Kumar, Ajith; Kumar, Raman; Kurz, David J; Kutt, Alex S; Lachat, Thibault; Lantschner, Victoria; Lara, Francisco; Lasky, Jesse R; Latta, Steven C; Laurance, William F; Lavelle, Patrick; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Lehouck, Valérie; Lencinas, María V; Lentini, Pia E; Letcher, Susan G; Li, Qi; Litchwark, Simon A; Littlewood, Nick A; Liu, Yunhui; Lo-Man-Hung, Nancy; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Luskin, Matthew S; MacSwiney G, M Cristina; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Malone, Louise A; Malonza, Patrick K; Malumbres-Olarte, Jagoba; Mandujano, Salvador; Måren, Inger E; Marin-Spiotta, Erika; Marsh, Charles J; Marshall, E J P; Martínez, Eliana; Martínez Pastur, Guillermo; Moreno Mateos, David; Mayfield, Margaret M; Mazimpaka, Vicente; McCarthy, Jennifer L; McCarthy, Kyle P; McFrederick, Quinn S; McNamara, Sean; Medina, Nagore G; Medina, Rafael; Mena, Jose L; Mico, Estefania; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Miranda-Esquivel, Daniel R; Moir, Melinda L; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Mudri-Stojnic, Sonja; Munira, A Nur; Muoñz-Alonso, Antonio; Munyekenye, B F; Naidoo, Robin; Naithani, A; Nakagawa, Michiko; Nakamura, Akihiro; Nakashima, Yoshihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario 
A; Navarro-Iriarte, Luis; Ndang'ang'a, Paul K; Neuschulz, Eike L; Ngai, Jacqueline T; Nicolas, Violaine; Nilsson, Sven G; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Norton, David A; Nöske, Nicole M; Nowakowski, A Justin; Numa, Catherine; O'Dea, Niall; O'Farrell, Patrick J; Oduro, William; Oertli, Sabine; Ofori-Boateng, Caleb; Oke, Christopher Omamoke; Oostra, Vicencio; Osgathorpe, Lynne M; Otavo, Samuel Eduardo; Page, Navendu V; Paritsis, Juan; Parra-H, Alejandro; Parry, Luke; Pe'er, Guy; Pearman, Peter B; Pelegrin, Nicolás; Pélissier, Raphaël; Peres, Carlos A; Peri, Pablo L; Persson, Anna S; Petanidou, Theodora; Peters, Marcell K; Pethiyagoda, Rohan S; Phalan, Ben; Philips, T Keith; Pillsbury, Finn C; Pincheira-Ulbrich, Jimmy; Pineda, Eduardo; Pino, Joan; Pizarro-Araya, Jaime; Plumptre, A J; Poggio, Santiago L; Politi, Natalia; Pons, Pere; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Rader, Romina; Ramesh, B R; Ramirez-Pinilla, Martha P; Ranganathan, Jai; Rasmussen, Claus; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Rey Benayas, José M; Rey-Velasco, Juan Carlos; Reynolds, Chevonne; Ribeiro, Danilo Bandini; Richards, Miriam H; Richardson, Barbara A; Richardson, Michael J; Ríos, Rodrigo Macip; Robinson, Richard; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rös, Matthias; Rosselli, Loreta; Rossiter, Stephen J; Roth, Dana S; Roulston, T'ai H; Rousseau, Laurent; Rubio, André V; Ruel, Jean-Claude; Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Sam, Katerina; Samnegård, Ulrika; Santana, Joana; Santos, Xavier; Savage, Jade; Schellhorn, Nancy A; Schilthuizen, Menno; Schmiedel, Ute; Schmitt, Christine B; Schon, Nicole L; Schüepp, Christof; Schumann, Katharina; Schweiger, Oliver; Scott, Dawn M; Scott, Kenneth A; Sedlock, Jodi L; Seefeldt, Steven S; Shahabuddin, Ghazala; Shannon, Graeme; Sheil, Douglas; Sheldon, Frederick H; Shochat, Eyal; Siebert, Stefan 
J; Silva, Fernando A B; Simonetti, Javier A; Slade, Eleanor M; Smith, Jo; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Soto Quiroga, Grimaldo; St-Laurent, Martin-Hugues; Starzomski, Brian M; Stefanescu, Constanti; Steffan-Dewenter, Ingolf; Stouffer, Philip C; Stout, Jane C; Strauch, Ayron M; Struebig, Matthew J; Su, Zhimin; Suarez-Rubio, Marcela; Sugiura, Shinji; Summerville, Keith S; Sung, Yik-Hei; Sutrisno, Hari; Svenning, Jens-Christian; Teder, Tiit; Threlfall, Caragh G; Tiitsaar, Anu; Todd, Jacqui H; Tonietto, Rebecca K; Torre, Ignasi; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Uehara-Prado, Marcio; Urbina-Cardona, Nicolas; Vallan, Denis; Vanbergen, Adam J; Vasconcelos, Heraldo L; Vassilev, Kiril; Verboven, Hans A F; Verdasca, Maria João; Verdú, José R; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Virgilio, Massimiliano; Vu, Lien Van; Waite, Edward M; Walker, Tony R; Wang, Hua-Feng; Wang, Yanping; Watling, James I; Weller, Britta; Wells, Konstans; Westphal, Catrin; Wiafe, Edward D; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Wolters, Volkmar; Woodcock, Ben A; Wu, Jihua; Wunderle, Joseph M; Yamaura, Yuichi; Yoshikura, Satoko; Yu, Douglas W; Zaitsev, Andrey S; Zeidler, Juliane; Zou, Fasheng; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of

  16. Asynchronous data change notification between database server and accelerator control systems

    International Nuclear Information System (INIS)

    Wenge Fu; Seth Nemesure; Morris, J.

    2012-01-01

    Database data change notification (DCN) is a commonly used feature: it allows a client to be informed when data have been changed on the server side by another client. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time-consuming. In accelerator control systems, there are many well-established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. (authors)
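The data reflection pattern described above amounts to a small publish/subscribe store: a SET updates a named value and queues a notification, which a dispatcher thread delivers asynchronously to subscribed clients. The toy Python sketch below illustrates the pattern only; the class and method names are invented and do not reflect the CDEV, EPICS, or ADO APIs.

```python
# Toy data reflection server: SET/GET plus asynchronous change callbacks.
# In the paper's scheme, a database trigger would drive the set() calls.
import queue
import threading

class ReflectionServer:
    def __init__(self):
        self._data = {}
        self._subscribers = {}   # name -> list of callbacks
        self._events = queue.Queue()
        threading.Thread(target=self._dispatch, daemon=True).start()

    def set(self, name, value):
        # Store the value and queue a change event; subscribers are
        # notified asynchronously by the dispatcher thread.
        self._data[name] = value
        self._events.put((name, value))

    def get(self, name):
        return self._data.get(name)

    def subscribe(self, name, callback):
        self._subscribers.setdefault(name, []).append(callback)

    def _dispatch(self):
        while True:
            name, value = self._events.get()
            for cb in self._subscribers.get(name, []):
                cb(name, value)
            self._events.task_done()

server = ReflectionServer()
received = []
server.subscribe("beam_current", lambda name, value: received.append(value))
server.set("beam_current", 42.0)
server._events.join()   # wait until the notification has been delivered
print(server.get("beam_current"), received)  # 42.0 [42.0]
```

Because clients subscribe to the reflection server rather than to the DBMS, the database-specific trigger plumbing is written once, and every control-system client gets change notification through the API it already uses.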

  17. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    Science.gov (United States)

    2002-01-01

    to the OODBMS approach. The ORDBMS approach produced such research prototypes as Postgres [155], and Starburst [67] and commercial products such as...Kemnitz. The POSTGRES Next-Generation Database Management System. Communications of the ACM, 34(10):78–92, 1991. [156] Michael Stonebreaker and Dorothy

  18. Document control system as an integral part of RA documentation database application

    International Nuclear Information System (INIS)

    Steljic, M.M.; Ljubenov, V.Lj. E-mail address of corresponding author: milijanas@vin.bg.ac.yu

    2005-01-01

    The decision about the final shutdown of the RA research reactor at the Vinca Institute was made in 2002, and the preparations for its decommissioning have therefore begun. All activities are supervised by the International Atomic Energy Agency (IAEA), which also provides technical and expert support. This paper describes the document control system as an integral part of the existing RA documentation database. (author)

  19. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    DEFF Research Database (Denmark)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity ...

  20. A database system for the management of severe accident risk information, SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, K. I.; Kim, D. H. [KAERI, Taejon (Korea, Republic of)

    2003-10-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at the Korea Atomic Energy Research Institute for the automatic management and retrieval of severe accident risk information. The main functions of the present database system are implemented by three closely related but distinct modules: (1) fixing an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from representative severe accident codes (e.g., base case and sensitivity analysis results, and summaries of key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used to support the Level 2 PSA of similar plants and the development of plant-specific severe accident management strategies.
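
    As a hedged illustration of the storage and retrieval pattern described above (not SARD's actual schema), the sketch below keys code-analysis results to plant-specific accident sequences in a small relational store. All table, column, sequence, and plant names are invented:

```python
import sqlite3

# Hypothetical sketch: accident sequences plus sequence-specific analysis
# results, retrievable by sequence identifier. Names are illustrative only.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sequence (seq_id TEXT PRIMARY KEY, plant TEXT, description TEXT);
CREATE TABLE analysis (seq_id TEXT REFERENCES sequence(seq_id),
                       case_type TEXT,      -- 'base' or 'sensitivity'
                       summary TEXT);
""")
db.execute("INSERT INTO sequence VALUES ('TLOFW', 'Plant-A', 'Total loss of feedwater')")
db.execute("INSERT INTO analysis VALUES ('TLOFW', 'base', 'Vessel failure at 4.2 h')")
db.execute("INSERT INTO analysis VALUES ('TLOFW', 'sensitivity', 'Delayed by early depressurization')")

def retrieve(seq_id):
    """Retrieve all analysis results recorded for one accident sequence."""
    return db.execute(
        "SELECT case_type, summary FROM analysis WHERE seq_id = ?",
        (seq_id,)).fetchall()

print(retrieve('TLOFW'))
```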

  1. Inactive trials of transport systems: phase II

    International Nuclear Information System (INIS)

    Haberlin, M.M.; Hardy, A.R.; Kennedy, S.T.

    1986-11-01

    Progress made during 1984-85 is reviewed in four sections: the design and installation of a stainless steel working floor in the mock-up of a crate handling and size reduction facility; the detailed evaluation of a single air pad of the type used on commercial air-transporters; an experimental programme designed to examine the problems associated with the operation of a commercial air-transporter; the design, manufacture and commissioning trials of two powered conveyor units which when combined complete a remotely operated transfer system for transporting crated waste into and within the mock-up facility. (author)

  2. Establishing the user requirements for the research reactor decommissioning database system

    International Nuclear Information System (INIS)

    Park, S. K.; Park, H. S.; Lee, G. W.; Park, J. H.

    2002-01-01

    In general, a great deal of information and data is generated during decommissioning activities, and a systematic electronic system is needed to manage it. A database system for managing the decommissioning information and data from the KRR-1 and 2 decommissioning project is now under development. All information and data will be entered into this database system and will also be retrievable from it. To develop the DB system, the basic concept and user requirements were established, and a scheme for categorizing the information and data was set up. The entities of the tables for data input were identified, categorized, and converted into codes, and an ERD (Entity Relationship Diagram) was drawn up to show their relations. To develop the user interface system for data retrieval, the relation between the input and output data had to be analyzed. Through this study, the items of the output tables were established and categorized according to the requirements of the user interface system for the decommissioning information and data. These tables will be used for designing the prototype and will be refined through several feedback cycles to establish the decommissioning database system

  3. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    Directory of Open Access Journals (Sweden)

    Seok-Hyoung Lee

    2012-06-01

    Full Text Available As science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. The results of technologists' research, such as journal articles, patent specifications, and research reports, are organically related to each other. If the most basic and meaningful classification systems are not connected, it is difficult to achieve interoperability of the information and thus hard to implement meaningful science and technology information services through information convergence. This study addresses that issue by analyzing mapping systems between classification systems in order to design a structure connecting the variety of classification systems used in the academic information database of the Korea Institute of Science and Technology Information, which provides a science and technology information portal service. This study also aims to design a mapping system for the classification systems to be applied to actual science and technology information services and information management systems.
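
    A minimal sketch of such a multiple (one-to-many) mapping structure between classification systems is shown below; the schemes and codes are invented for illustration and are not KISTI's actual classifications:

```python
from collections import defaultdict

class ClassificationMapper:
    """Bidirectional one-to-many mapping between classification schemes."""
    def __init__(self):
        # (scheme, code) -> set of mapped (scheme, code) pairs
        self.links = defaultdict(set)

    def add_mapping(self, src, dst):
        self.links[src].add(dst)
        self.links[dst].add(src)   # keep mappings bidirectional

    def translate(self, code, target_scheme):
        """All codes in target_scheme mapped from the given code."""
        return sorted(c for (scheme, c) in self.links[code]
                      if scheme == target_scheme)

m = ClassificationMapper()
# One (invented) journal-article category maps to two patent-style classes.
m.add_mapping(("journal", "EE01"), ("patent", "H04L"))
m.add_mapping(("journal", "EE01"), ("patent", "G06F"))

print(m.translate(("journal", "EE01"), "patent"))  # ['G06F', 'H04L']
```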

  4. A Database-Based and Web-Based Meta-CASE System

    Science.gov (United States)

    Eessaar, Erki; Sgirka, Rünno

    Each Computer Aided Software Engineering (CASE) system provides support to a software process or to specific tasks or activities that are part of a software process. Each meta-CASE system allows us to create new CASE systems. The creators of a new CASE system have to specify the abstract syntax of the language that is used in the system, as well as the functionality and non-functional properties of the new system. Many meta-CASE systems record their data directly in files. In this paper, we introduce a meta-CASE system whose enabling technology is an object-relational database management system (ORDBMS). The system allows users to manage specifications of languages and to create models by using these languages. The system has a web-based, form-based user interface. We have created a proof-of-concept prototype of the system by using the PostgreSQL ORDBMS and the PHP scripting language.

  5. Dielectric susceptibility of classical Coulomb systems. II

    International Nuclear Information System (INIS)

    Choquard, Ph.; Piller, B.; Rentsch, R.

    1987-01-01

    This paper deals with the shape dependence of the dielectric susceptibility (equivalently defined, in a canonical ensemble, by the mean square fluctuation of the electric polarization or by the second moment of the charge-charge correlation function) of classical Coulomb systems. The concept of partial second moment is introduced with the aim of analyzing the contributions to the total susceptibility of pairs of particles of increasing separation. For a disk-shaped one-component plasma with coupling parameter γ=2 it is shown, numerically and algebraically for small and large systems, that (1) the correlation function of two particles close to the edge of the disk decays as the inverse of the square of their distance, and (2) the susceptibility is made up of a bulk contribution, which saturates rapidly toward the Stillinger-Lovett value, and of a surface contribution, which varies on the scale of the disk diameter and is described by a new law called the arc sine law. It is also shown that electrostatics and statistical mechanics with shape-dependent thermodynamic limits are consistent for the same model in a strip geometry, whereas the Stillinger-Lovett sum rule is verified for a boundary-free geometry such as the surface of a sphere. Some results of extensive computer simulations of one- and two-component plasmas in circular and elliptic geometries are shown. Anisotropy effects on the susceptibilities are clearly demonstrated and the arc sine law for a circular plasma is well confirmed.

  6. WARRIOR II, a high performance modular electric robot system

    International Nuclear Information System (INIS)

    Downton, G.C.

    1996-01-01

    Initially designed for in-reactor welding by the Central Electricity Generating Board, WARRIOR has been developed using the concept of modular technology to become a light-weight, high performance robotic system. Research work on existing machines for in-reactor inspection and repair and heavy duty hydraulic manipulators was progressed in order to develop WARRIOR II, a versatile in-reactor welding system usable at any nuclear power station light enough to be deployed by existing remote handling equipment. WARRIOR II can be significantly reconfigured quickly to pursue different ends. (UK)

  7. Site initialization, recovery, and back-up in a distributed database system

    International Nuclear Information System (INIS)

    Attar, R.; Bernstein, P.A.; Goodman, N.

    1982-01-01

    Site initialization is the problem of integrating a new site into a running distributed database system (DDBS). Site recovery is the problem of integrating an old site into a DDBS when the site recovers from failure. Site backup is the problem of creating a static backup copy of a database for archival or query purposes. We present an algorithm that solves the site initialization problem. By modifying the algorithm slightly, we get solutions to the other two problems as well. Our algorithm exploits the fact that a correct DDBS must run a serializable concurrency control algorithm. Our algorithm relies on the concurrency control algorithm to handle all inter-site synchronization

  8. System requirements and design description for the document basis database interface (DocBasis)

    International Nuclear Information System (INIS)

    Lehman, W.J.

    1997-01-01

    This document describes system requirements and the design description for the Document Basis Database Interface (DocBasis). The DocBasis application is used to manage procedures used within the tank farms. The application maintains information in a small database to track the document basis for a procedure, as well as the current version/modification level and the basis for the procedure. The basis for each procedure is substantiated by Administrative, Technical, Procedural, and Regulatory requirements. The DocBasis user interface was developed by Science Applications International Corporation (SAIC)

  9. Ultra-Structure database design methodology for managing systems biology data and analyses

    Directory of Open Access Journals (Sweden)

    Hemminger Bradley M

    2009-08-01

    Full Text Available Abstract Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research.
Conclusion We find
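
    The core Ultra-Structure idea (behaviour stored as rules in database tables, interpreted by small generic procedures) can be sketched as follows. The ruleform schema, condition format, and rule contents here are illustrative assumptions, not the system's actual design:

```python
import sqlite3

# Sketch of rules-as-data: editing rows in the rule table changes behaviour;
# the engine code never changes. All names and rules below are invented.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ruleform (feature TEXT, condition TEXT, action TEXT)")
rules = [
    ("peptide_map", "score >= 0.9", "accept"),
    ("peptide_map", "score < 0.9",  "review"),
]
db.executemany("INSERT INTO ruleform VALUES (?, ?, ?)", rules)

def decide(feature, **facts):
    """Generic engine: return the action of the first rule whose condition holds."""
    for cond, action in db.execute(
            "SELECT condition, action FROM ruleform WHERE feature = ?", (feature,)):
        if eval(cond, {}, facts):        # toy condition evaluator
            return action
    return None

print(decide("peptide_map", score=0.95))  # accept
print(decide("peptide_map", score=0.30))  # review
```

    End users would alter the `ruleform` rows (e.g., change the 0.9 threshold) without touching `decide` at all, which is the flexibility the abstract describes.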

  10. Control system for the Spanish Stellarator TJ-II

    International Nuclear Information System (INIS)

    Pacios, L.; Blaumoser, M.; Pena, A. de la; Carrasco, R.; Labrador, I.; Lapayese, F.; Diaz, J.C.; Laso, L.M.

    1995-01-01

    We describe the distributed control and monitoring system for the Spanish Stellarator TJ-II, which is under construction at CIEMAT in Madrid. It consists of one central UNIX workstation and several autonomous subsystems based on VME crates with embedded processors under OS-9 real-time operating system and PLCs. The system integrates the machine and discharge control. An operator can perform the control and plasma discharge by means of a user-friendly graphic interface. (orig.)

  11. EBR-II water-to-sodium leak detection system

    International Nuclear Information System (INIS)

    Wrightson, M.M.; McKinley, K.; Ruther, W.E.; Holmes, J.T.

    1976-01-01

    The water-to-sodium leak detection system installed at EBR-II in April, 1975, is described in detail. Topics covered include operational characteristics, maintenance problems, alarm functions, background hydrogen level data, and future plans for refinements to the system. Particular emphasis is given to the failures of eight of the ten leak detectors due to sodium-to-vacuum leakage, and the program anticipated for complete recovery of the system

  12. Mark-II Data Acquisition and Trigger system

    International Nuclear Information System (INIS)

    Breidenbach, M.

    1984-06-01

    The Mark-II Data Acquisition and Trigger system requirements and general solution are described. The solution takes advantage of the synchronous crossing times and low event rates of an electron positron collider to permit a very highly multiplexed analog scheme to be effective. The system depends on a two level trigger to operate with acceptable dead time. The trigger, multiplexing, data reduction, calibration, and CAMAC systems are described

  13. 16th East-European Conference on Advances in Databases and Information Systems (ADBIS 2012)

    CERN Document Server

    Wojciechowski, Marek; New Trends in Databases and Information Systems

    2013-01-01

    Database and information systems technologies have been rapidly evolving in several directions over the past years. New types and kinds of data, and new types of applications and information systems to support them, raise diverse challenges to be addressed. The so-called big data challenge, streaming data management and processing, social networks and other complex data analysis, and the inclusion of semantic reasoning in information systems supporting, for instance, trading, negotiations, and bidding mechanisms are just some of the emerging research topics. This volume contains papers contributed by six workshops: ADBIS Workshop on GPUs in Databases (GID 2012), Mining Complex and Stream Data (MCSD'12), International Workshop on Ontologies meet Advanced Information Systems (OAIS'2012), Second Workshop on Modeling Multi-commodity Trade: Data models and processing (MMT'12), 1st ADBIS Workshop on Social Data Processing (SDP'12), 1st ADBIS Workshop on Social and Algorithmic Issues in Business Support (SAIBS), and the Ph.D. Conso...

  14. Reliability of piping system components. Volume 4: The pipe failure event database

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, R; Erixon, S [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Tomic, B [ENCONET Consulting GmbH, Vienna (Austria); Lydell, B [RSA Technologies, Visat, CA (United States)

    1996-07-01

    Available public and proprietary databases on piping system failures were searched for relevant information. Using a relational database to identify groupings of piping failure modes and failure mechanisms, together with insights from published PSAs, the project team determined why, how and where piping systems fail. This report represents a compendium of technical issues important to the analysis of pipe failure events, and statistical estimation of failure rates. Inadequacies of traditional PSA methodology are addressed, with directions for PSA methodology enhancements. A 'data driven and systems oriented' analysis approach is proposed to enable assignment of unique identities to risk-significant piping system component failure. Sufficient operating experience does exist to generate quality data on piping failures. Passive component failures should be addressed by today's PSAs to allow for aging analysis and effective, on-line risk management. 42 refs, 25 figs.

  15. Database Foundation For The Configuration Management Of The CERN Accelerator Controls Systems

    CERN Document Server

    Zaharieva, Z; Peryt, M

    2011-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years in order to become nowadays the basis for the Configuration Management of the Controls System for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the Controls System. The configuration items are quite heterogeneous, depicting different areas of the Controls System – ranging from 3000 Front-End Computers, 75 000 software devices allowing remote control of the accelerators, to valid states of the Accelerators Timing System. The article will describe the different areas of the CCDB, their interdependencies and the challenges to establish the data model for such a diverse configuration management database, serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering of change management processes as well as providing status accounting and aud...

  16. Data-based fault-tolerant control for affine nonlinear systems with actuator faults.

    Science.gov (United States)

    Xie, Chun-Hua; Yang, Guang-Hong

    2016-09-01

    This paper investigates the fault-tolerant control (FTC) problem for unknown nonlinear systems with actuator faults including stuck, outage, bias and loss of effectiveness. The upper bounds of stuck faults, bias faults and loss of effectiveness faults are unknown. A new data-based FTC scheme is proposed. It consists of the online estimations of the bounds and a state-dependent function. The estimations are adjusted online to compensate automatically the actuator faults. The state-dependent function solved by using real system data helps to stabilize the system. Furthermore, all signals in the resulting closed-loop system are uniformly bounded and the states converge asymptotically to zero. Compared with the existing results, the proposed approach is data-based. Finally, two simulation examples are provided to show the effectiveness of the proposed approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Tailored patient information using a database system: Increasing patient compliance in a day surgery setting

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Grode, Louise; Steinsøe, Ulla

    rehabilitation. The hospital is responsible for providing the patients with accurate information enabling the patient to prepare for surgery. Often patients are overloaded with uncoordinated information, letters and leaflets. The contribution of this project is a database system enabling health professionals...... to empower patients through tailored individualized information. Performing 6500 operations per year at our Day Surgery Centre, health professionals need a computer-based system to create individualized information material. Health professionals must be able to adapt the information material quickly...... was established to support these requirements. A relational database system holds all information pieces in a granular, structured form. Each individual piece of information can be joined with other pieces, thus supporting the tailoring of information. A web service layer caters for integration with output systems...

  18. Reliability of piping system components. Volume 4: The pipe failure event database

    International Nuclear Information System (INIS)

    Nyman, R.; Erixon, S.; Tomic, B.; Lydell, B.

    1996-07-01

    Available public and proprietary databases on piping system failures were searched for relevant information. Using a relational database to identify groupings of piping failure modes and failure mechanisms, together with insights from published PSAs, the project team determined why, how and where piping systems fail. This report represents a compendium of technical issues important to the analysis of pipe failure events, and statistical estimation of failure rates. Inadequacies of traditional PSA methodology are addressed, with directions for PSA methodology enhancements. A 'data driven and systems oriented' analysis approach is proposed to enable assignment of unique identities to risk-significant piping system component failure. Sufficient operating experience does exist to generate quality data on piping failures. Passive component failures should be addressed by today's PSAs to allow for aging analysis and effective, on-line risk management. 42 refs, 25 figs

  19. Computer-aided diagnosis system for bone scintigrams from Japanese patients: importance of training database

    DEFF Research Database (Denmark)

    Horikoshi, Hiroyuki; Kikuchi, Akihiro; Onoguchi, Masahisa

    2012-01-01

    higher performance than the corresponding CAD software trained with a European database for the analysis of bone scans from Japanese patients. These results could at least partly be caused by the physical differences between Japanese and European patients, resulting in less influence of attenuation......Computer-aided diagnosis (CAD) software for bone scintigrams has recently been introduced as a clinical quality assurance tool. The purpose of this study was to compare the diagnostic accuracy of two CAD systems, one based on a European and one on a Japanese training database, in a group of bone... scans from Japanese patients. The two CAD systems are trained to interpret bone scans using training databases consisting of bone scans with the desired interpretation, metastatic disease or not. One system was trained using 795 bone scans from European patients and the other with 904 bone scans from...

  20. Data systems in FFTF and EBR-II

    International Nuclear Information System (INIS)

    Warrick, R.P.; Ritter, W.M.

    1980-02-01

    This paper describes the Data System used to monitor operation and collect experimental data in FFTF. This data system has evolved since initial inception from a relatively simple, single computer system monitoring a relatively few (approx. 1000) instrument channels important for operation to one which has increased capability to support the long-range testing needs in FFTF. The system, while still relatively simple, now contains multiple computers which normally perform independent functions. The computers, however, provide backup processing for certain simple tasks. Operator interfacing is provided through CRT's. The output capabilities of the system are described. A description of the Data System in EBR-II is also included

  1. Sensory systems II senses other than vision

    CERN Document Server

    Wolfe, Jeremy M

    1988-01-01

    This series of books, "Readings from the Encyclopedia of Neuroscience," consists of collections of subject-clustered articles taken from the Encyclopedia of Neuroscience. The Encyclopedia of Neuroscience is a reference source and compendium of more than 700 articles written by world authorities and covering all of neuroscience. We define neuroscience broadly as including all those fields that have as a primary goal the understanding of how the brain and nervous system work to mediate/control behavior, including the mental behavior of humans. Those interested in specific aspects of the neurosciences, particular subject areas or specialties, can of course browse through the alphabetically arranged articles of the Encyclopedia or use its index to find the topics they wish to read. However, for those readers-students, specialists, or others-who will find it useful to have collections of subject-clustered articles from the Encyclopedia, we issue this series of "Readings" in paperback. Students in neuroscienc...

  2. Applying cognitive load theory to the redesign of a conventional database systems course

    Science.gov (United States)

    Mason, Raina; Seton, Carolyn; Cooper, Graham

    2016-01-01

    Cognitive load theory (CLT) was used to redesign a Database Systems course for Information Technology students. The redesign was intended to address poor student performance and low satisfaction, and to provide a more relevant foundation in database design and use for subsequent studies and industry. The original course followed the conventional structure for a database course, covering database design first, then database development. Analysis showed the conventional course content was appropriate but the instructional materials used were too complex, especially for novice students. The redesign of instructional materials applied CLT to remove split attention and redundancy effects, to provide suitable worked examples and sub-goals, and included an extensive re-sequencing of content. The approach was primarily directed towards mid- to lower performing students and results showed a significant improvement for this cohort with the exam failure rate reducing by 34% after the redesign on identical final exams. Student satisfaction also increased and feedback from subsequent study was very positive. The application of CLT to the design of instructional materials is discussed for delivery of technical courses.

  3. Nuclear power plant control room crew task analysis database: SEEK system. Users manual

    International Nuclear Information System (INIS)

    Burgy, D.; Schroeder, L.

    1984-05-01

    The Crew Task Analysis SEEK Users Manual was prepared for the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission. It is designed for use with the existing computerized Control Room Crew Task Analysis Database. The SEEK system consists of a PR1ME computer with its associated peripherals and software augmented by General Physics Corporation SEEK database management software. The SEEK software programs provide the Crew Task Database user with rapid access to any number of records desired. The software uses English-like sentences to allow the user to construct logical sorts and outputs of the task data. Given the multiple-associative nature of the database, users can directly access the data at the plant, operating sequence, task or element level - or any combination of these levels. A complete description of the crew task data contained in the database is presented in NUREG/CR-3371, Task Analysis of Nuclear Power Plant Control Room Crews (Volumes 1 and 2)

  4. Development of the Lymphoma Enterprise Architecture Database: a caBIG Silver level compliant system.

    Science.gov (United States)

    Huang, Taoying; Shenoy, Pareen J; Sinha, Rajni; Graiser, Michael; Bumpers, Kevin W; Flowers, Christopher R

    2009-04-03

    Lymphomas are the fifth most common cancer in United States with numerous histological subtypes. Integrating existing clinical information on lymphoma patients provides a platform for understanding biological variability in presentation and treatment response and aids development of novel therapies. We developed a cancer Biomedical Informatics Grid (caBIG) Silver level compliant lymphoma database, called the Lymphoma Enterprise Architecture Data-system (LEAD), which integrates the pathology, pharmacy, laboratory, cancer registry, clinical trials, and clinical data from institutional databases. We utilized the Cancer Common Ontological Representation Environment Software Development Kit (caCORE SDK) provided by National Cancer Institute's Center for Bioinformatics to establish the LEAD platform for data management. The caCORE SDK generated system utilizes an n-tier architecture with open Application Programming Interfaces, controlled vocabularies, and registered metadata to achieve semantic integration across multiple cancer databases. We demonstrated that the data elements and structures within LEAD could be used to manage clinical research data from phase 1 clinical trials, cohort studies, and registry data from the Surveillance Epidemiology and End Results database. This work provides a clear example of how semantic technologies from caBIG can be applied to support a wide range of clinical and research tasks, and integrate data from disparate systems into a single architecture. This illustrates the central importance of caBIG to the management of clinical and biological data.

  5. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Science.gov (United States)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Built on a key-value database, the ND technique makes full use of the complement set of the observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
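
    A toy sketch of the negative-database idea follows: over a known, enumerable key domain, only the absent keys are stored, and present records are derived as the complement. This is an assumption-laden illustration, not MUSER's actual ND implementation:

```python
# Toy negative database: store what is MISSING, derive what EXISTS.
class NegativeDatabase:
    def __init__(self, domain):
        self.domain = set(domain)   # full keyspace, e.g. expected frame numbers
        self.absent = set(domain)   # initially everything is missing

    def record_observed(self, key):
        self.absent.discard(key)    # observing a record removes it from the ND

    def present(self):
        """Derive existing records as the complement of the negative set."""
        return sorted(self.domain - self.absent)

    def missing(self):
        return sorted(self.absent)

# Expect frames 0..9 from one observation cycle; frames 3 and 7 never arrive.
nd = NegativeDatabase(range(10))
for frame in [0, 1, 2, 4, 5, 6, 8, 9]:
    nd.record_observed(frame)

print(nd.missing())   # [3, 7]
```

    The storage saving appears when nearly all expected records do arrive: the negative set stays tiny while the derived positive set remains fully queryable.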

  6. Database system for management of health physics and industrial hygiene records

    International Nuclear Information System (INIS)

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene records. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection

  7. Modified Delphi study to determine optimal data elements for inclusion in an emergency management database system

    Directory of Open Access Journals (Sweden)

    A. Jabar

    2012-03-01

    Conclusion: The use of a modified Expert Delphi study achieved consensus on aspects of hospital institutional capacity that can be translated into practical recommendations for implementation by the local emergency management database system. Additionally, areas of non-consensus have been identified where further work is required. The purpose of this study is to contribute to and aid in the development of this new system.

  8. Demonstration of SLUMIS: a clinical database and management information system for a multi organ transplant program.

    OpenAIRE

    Kurtz, M.; Bennett, T.; Garvin, P.; Manuel, F.; Williams, M.; Langreder, S.

    1991-01-01

    Because of the rapid evolution of the heart, heart/lung, liver, kidney and kidney/pancreas transplant programs at our institution, and because of a lack of an existing comprehensive database, we were required to develop a computerized management information system capable of supporting both clinical and research requirements of a multifaceted transplant program. SLUMIS (ST. LOUIS UNIVERSITY MULTI-ORGAN INFORMATION SYSTEM) was developed for the following reasons: 1) to comply with the reportin...

  9. Characterization of optical systems for the ALPS II experiment

    International Nuclear Information System (INIS)

    Spector, Aaron D.; Baehre, Robin; Willke, Benno; Hannover Univ.

    2016-09-01

    ALPS II is a light-shining-through-a-wall experiment that will use the principle of resonant enhancement to boost the conversion and reconversion probabilities of photons to relativistic WISPs. This will require the use of long-baseline, low-loss optical cavities. Very high power build-up factors must be achieved in the cavities in order to reach the design sensitivity of ALPS II. This necessitates a number of sophisticated optical and control systems to maintain the resonance and ensure maximal coupling between the laser and the cavity. In this paper we report on the characterization of these optical systems with a 20 m cavity and discuss the results in the context of ALPS II.

  10. System modeling and simulation at EBR-II

    International Nuclear Information System (INIS)

    Dean, E.M.; Lehto, W.K.; Larson, H.A.

    1986-01-01

    The codes being developed and verified using EBR-II data are NATDEMO, DSNP, and CSYRED. NATDEMO is a variation of the Westinghouse DEMO code coupled to the NATCON code, previously used to simulate perturbations of reactor flow and inlet temperature and loss-of-flow transients leading to natural convection in EBR-II. CSYRED uses the Continuous System Modeling Program (CSMP) to simulate the EBR-II core, including power, temperature, control-rod movement reactivity effects, and flow, and is used primarily to model reactivity-induced power transients. The Dynamic Simulator for Nuclear Power Plants (DSNP) allows whole-plant, thermal-hydraulic simulation using specific component and system models called from libraries. It has been used to simulate flow-coastdown transients, reactivity insertion events, and balance-of-plant perturbations

  11. An Expert System Interfaced with a Database System to Perform Troubleshooting of Aircraft Carrier Piping Systems

    Science.gov (United States)

    1988-12-01

    interval of four feet, and are numbered sequentially bow to stern. * "wing tank" is a tank or void, outboard of the holding bulkhead, away from the center... system and DBMS simultaneously with a multi-processor, allowing queries to the DBMS without terminating the expert system. This method was judged... RECIRC). eductor_strip("Y") :- ask_ques_read_ans(OVBD, "ovbd dis open"), ovbd_dis_open(OVBD). eductor_strip("N") :- ask_ques_read_ans(LINEUP, "strip lineup

  12. The NASA F-15 Intelligent Flight Control Systems: Generation II

    Science.gov (United States)

    Buschbacher, Mark; Bosworth, John

    2006-01-01

    The Second Generation (Gen II) control system for the F-15 Intelligent Flight Control System (IFCS) program implements direct adaptive neural networks to demonstrate robust tolerance to faults and failures. The direct adaptive tracking controller integrates learning neural networks (NNs) with a dynamic inversion control law. The term direct adaptive is used because the error between the reference model and the aircraft response is directly compensated to minimize the error, without needing to know its cause. No parameter estimation is needed for this direct adaptive control system. In the Gen II design, the feedback errors are regulated with a proportional-plus-integral (PI) compensator. This basic compensator is augmented with an online NN that changes the system gains via an error-based adaptation law to improve aircraft performance at all times, including normal flight, system failures, mispredicted behavior, or changes in behavior resulting from damage.
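
    The error-based adaptation idea can be illustrated with a toy loop: a PI compensator whose gain is augmented by a term driven directly by the tracking error. The first-order plant, gains, and adaptation rate below are invented for illustration and are not the Gen II IFCS design values.

```python
# Toy illustration of direct adaptive control: a PI compensator whose
# gain is augmented by an error-driven adaptive term, with no parameter
# estimation of the plant. All numeric values are assumptions.

def simulate(steps=1000, dt=0.02, ref=1.0):
    kp, ki = 2.0, 1.0        # fixed PI gains (assumed)
    k_ad = 0.0               # adaptive gain augmentation, starts at zero
    gamma = 0.5              # adaptation rate (assumed)
    x, integ = 0.0, 0.0      # plant state and error integral
    for _ in range(steps):
        err = ref - x
        integ += err * dt
        # Error-based adaptation law: the gain is adjusted directly from
        # the tracking error, without identifying the cause of the error.
        k_ad += gamma * err * dt
        u = (kp + k_ad) * err + ki * integ
        x += dt * (-x + u)   # toy first-order plant: x' = -x + u
    return x

final_response = simulate()  # settles near the reference
```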

  13. Long Duration Exposure Facility (LDEF) optical systems SIG summary and database

    Science.gov (United States)

    Bohnhoff-Hlavacek, Gail

    1992-01-01

    The main objectives of the Long Duration Exposure Facility (LDEF) Optical Systems Special Investigative Group (SIG) Discipline are to develop a database of experimental findings on LDEF optical systems and elements hardware, and provide an optical system overview. Unlike the electrical and mechanical disciplines, the optics effort relies primarily on the testing of hardware at the various principal investigators' laboratories, since minimal testing of optical hardware was done at Boeing. This is because all space-exposed optics hardware are part of other individual experiments. At this time, all optical systems and elements testing by experiment investigator teams is not complete, and in some cases has hardly begun. Most experiment results to date document observations and measurements that 'show what happened'. Still to come from many principal investigators is a critical analysis to explain 'why it happened' and future design implications. The original optical system related concerns and the lessons learned at a preliminary stage in the Optical Systems Investigations are summarized. The design of the Optical Experiments Database and how to acquire and use the database to review the LDEF results are described.

  15. A SQL-Database Based Meta-CASE System and its Query Subsystem

    Science.gov (United States)

    Eessaar, Erki; Sgirka, Rünno

    Meta-CASE systems simplify the creation of CASE (Computer Aided System Engineering) systems. In this paper, we present a meta-CASE system that provides a web-based user interface and uses an object-relational database management system (ORDBMS) as its basis. The use of ORDBMSs allows us to integrate different parts of the system and simplify the creation of meta-CASE and CASE systems. ORDBMSs provide a powerful query mechanism. The proposed system allows developers to use queries to evaluate and gradually improve artifacts and to calculate values of software measures. We illustrate the use of the system with the SimpleM modeling language and discuss the use of SQL in the context of queries about artifacts. We have created a prototype of the meta-CASE system by using the PostgreSQL™ ORDBMS and the PHP scripting language.

  16. Kilowatt isotope power system phase II plan. Volume II: flight System Conceptual Design (FSCD)

    International Nuclear Information System (INIS)

    1978-03-01

    The Kilowatt Isotope Power System (KIPS) Flight System Conceptual Design (FSCD) is described. Included are a background, a description of the flight system conceptual design, configuration of components, flight system performance, Ground Demonstration System test results, and advanced development tests

  17. Open Architecture Standards and Information Systems (OASIS II ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Open Architecture Standards and Information Systems (OASIS II) - Developing Capacity, Sharing Knowledge and Good Principles Across eHealth in Africa. Health care across much of the African continent is hampered by meager resources and a growing burden of disease, with HIV/AIDS, tuberculosis (TB) and malaria ...

  18. A generic coordinate system and a set of generic variables for MFE database

    International Nuclear Information System (INIS)

    Miner, W.H. Jr.; Ross, D.W.; Solano, E.R.; Valanju, P.M.; Wiley, J.C.

    1993-01-01

    Over the last several years, profile data from nine different tokamaks have been stored in the magnetic fusion energy database (MFEDB). These data sets have come from a variety of sources, and most are given in different coordinate systems. In order to attempt any intermachine analysis, it is convenient to transform these data sets into one generic coordinate system and to choose a uniform set of variable names. The authors describe the data sets from each tokamak, indicating the source of the data and the coordinate system in which it is given. Next, they discuss the generic coordinate system that has been adopted and show how it is implemented for each tokamak. Finally, the generic naming convention that has been adopted is discussed; it follows closely that used by Christiansen et al. for the ITER Global Energy Confinement H-Mode Database. For further clarification, they discuss the characteristics of the magnetic geometry given a Fourier representation of the magnetic equilibria
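
    A minimal sketch of such a harmonization step, assuming a normalized minor-radius convention (rho = r/a) and an invented renaming table; neither is the MFEDB's actual definition.

```python
# Sketch: map machine-specific profile data onto a generic coordinate
# and generic variable names. The rename table, variable names, and the
# rho = r/a convention are illustrative assumptions.

def to_generic(profile, minor_radius_a, rename):
    """profile: {local_name: values sampled at radii 'r' (metres)}.
    Returns {generic_name: values} with 'r' replaced by rho = r/a."""
    out = {rename.get(k, k): v for k, v in profile.items()}
    out["rho"] = [r / minor_radius_a for r in out.pop("r")]
    return out

# Usage: a tokamak that reports 'te_ev' at radii in metres, with a = 0.5 m.
generic = to_generic(
    {"r": [0.0, 0.25, 0.5], "te_ev": [2000.0, 1200.0, 100.0]},
    minor_radius_a=0.5,
    rename={"te_ev": "TE"},
)
```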

  19. Development of database and QA systems for post closure performance assessment on a potential HLW repository

    International Nuclear Information System (INIS)

    Hwang, Y. S.; Kim, S. G.; Kang, C. H.

    2002-01-01

    In the TSPA of long-term post-closure radiological safety for permanent disposal of HLW in Korea, appropriate management of input and output data through QA is necessary. A robust QA system has been developed using the T2R3 principles applicable to the five major steps of R&D. The proposed system is implemented as a web-based system so that all participants in TSPA are able to access it. In addition, an internet-based input database for TSPA has been developed. Currently, data from literature surveys, domestic laboratory and field experiments, as well as expert elicitation, are applied for TSPA

  20. Development of Vision Based Multiview Gait Recognition System with MMUGait Database

    Directory of Open Access Journals (Sweden)

    Hu Ng

    2014-01-01

    Full Text Available This paper describes the acquisition setup and development of a new gait database, MMUGait. This database consists of 82 subjects walking under normal conditions and 19 subjects walking with 11 covariate factors, captured under two views. This paper also proposes a multiview model-based gait recognition system with a joint detection approach that performs well under different walking trajectories and covariate factors, including self-occluded or externally occluded silhouettes. In the proposed system, the process begins by enhancing the human silhouette to remove artifacts. Next, the width and height of the body are obtained. Subsequently, the joint angular trajectories are determined once the body joints are automatically detected. Lastly, the crotch height and step size of the walking subject are determined. The extracted features are smoothed by a Gaussian filter to eliminate the effect of outliers and normalized with linear scaling, followed by feature selection prior to the classification process. The classification experiments carried out on the MMUGait database were benchmarked against the SOTON Small DB from the University of Southampton. Results showed correct classification rates above 90% for all the databases. The proposed approach is found to outperform other approaches on the SOTON Small DB in most cases.
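
    The two feature-conditioning steps named above, Gaussian smoothing followed by linear (min-max) scaling, can be sketched as below; the kernel width and the toy joint-angle trajectory are illustrative assumptions, not the paper's parameters.

```python
# Sketch: Gaussian smoothing of a joint-angle trajectory to damp
# outliers, then linear scaling of the result to [0, 1].
import math

def gaussian_smooth(xs, sigma=1.0):
    radius = int(3 * sigma)
    kernel = [math.exp(-(i * i) / (2 * sigma * sigma))
              for i in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]      # normalize weights
    out = []
    for i in range(len(xs)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(xs) - 1)  # clamp edges
            acc += w * xs[idx]
        out.append(acc)
    return out

def linear_scale(xs, lo=0.0, hi=1.0):
    mn, mx = min(xs), max(xs)
    return [lo + (x - mn) * (hi - lo) / (mx - mn) for x in xs]

angles = [10.0, 12.0, 30.0, 11.0, 9.0, 8.0, 25.0, 10.0]  # spiky toy data
smoothed = gaussian_smooth(angles)
scaled = linear_scale(smoothed)
```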