WorldWideScience

Sample records for database system trends

  1. TRENDS: The aeronautical post-test database management system

    Science.gov (United States)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed, and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed, and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  2. A trending database for human performance events

    International Nuclear Information System (INIS)

    Harrison, D.

    1993-01-01

    An effective Operations Experience program includes a standardized methodology for the investigation of unplanned events and a tool capable of retaining investigation data for the purpose of trending analysis. A database used in conjunction with a formalized investigation procedure for the purpose of trending unplanned event data is described. The database follows the structure of INPO's Human Performance Enhancement System (HPES) for investigations, and its screens duplicate the HPES evaluation forms on-line. All information pertaining to investigations is collected, retained and entered into the database using these forms. The database will be used for trending analysis to determine whether any significant patterns exist, for tracking progress over time both within AECL and against industry standards, and for evaluating the success of corrective actions. Trending information will be used to help prevent similar occurrences.

  3. Transport and Environment Database System (TRENDS): Maritime Air Pollutant Emission Modelling

    DEFF Research Database (Denmark)

    Georgakaki, Aliki; Coffey, Robert; Lock, Grahm

    2005-01-01

    This paper reports the development of the maritime module within the framework of the Transport and Environment Database System (TRENDS) project. A detailed database has been constructed for the calculation of energy consumption and air pollutant emissions. Based on an in-house database ... changes from findings reported in Methodologies for Estimating air pollutant Emissions from Transport (MEET). The database operates on statistical data provided by Eurostat, which describe vessel and freight movements from and towards EU 15 major ports. Data are at port to Maritime Coastal Area (MCA) ... with a view to this purpose, are mentioned. Examples of the results obtained by the database are presented. These include detailed air pollutant emission calculations for bulk carriers entering the port of Helsinki, as an example of the database operation, and aggregate results for different types ...

  4. TRENDS: A flight test relational database user's guide and reference manual

    Science.gov (United States)

    Bondi, M. J.; Bjorkman, W. S.; Cross, J. L.

    1994-01-01

    This report is designed to be a user's guide and reference manual for users intending to access rotorcraft test data via TRENDS, the relational database system which was developed as a tool for the aeronautical engineer with no programming background. This report has been written to assist novice and experienced TRENDS users. TRENDS is a complete system for retrieving, searching, and analyzing both numerical and narrative data, and for displaying time history and statistical data in graphical and numerical formats. This manual provides a 'guided tour' and a 'user's guide' for the new and intermediate-skilled users. Examples of the use of each menu item within TRENDS are provided in the Menu Reference section of the manual, including full coverage of TIMEHIST, one of the key tools. This manual is written around the XV-15 Tilt Rotor database, but does include an appendix on the UH-60 Blackhawk database. This user's guide and reference manual establishes a referable source for the research community and augments NASA TM-101025, TRENDS: The Aeronautical Post-Test Database Management System, Jan. 1990, written by the same authors.

  5. Transport and Environment Database System (TRENDS): Maritime Air Pollutant Emission Modelling

    DEFF Research Database (Denmark)

    Georgakaki, Aliki; Coffey, R. A.; Lock, G.

    2003-01-01

    This paper reports the development of the maritime module within the framework of the TRENDS project. A detailed database has been constructed, which includes all stages of the energy consumption and air pollutant emission calculations. The technical assumptions and factors incorporated in the database ... short sea or deep-sea shipping. ... encountered, since the statistical data collection was not undertaken with a view to this purpose, are mentioned. Examples of the results obtained by the database are presented. These range from detailed air pollutant emission results per port and vessel type to aggregate results for different types of movements ... Key Words: Air Pollution, Maritime Transport, Air Pollutant Emissions

  6. Database Systems - Present and Future

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Database systems nowadays have an increasingly important role in the knowledge-based society, in which computers have penetrated all fields of activity and the Internet continues to develop worldwide. In the current informatics context, the development of database applications is the work of specialists, while using databases, accessing a database from various applications, and related concepts have become accessible to all categories of IT users. This paper aims to summarize the curricular area regarding the fundamental database systems issues which are necessary in order to train specialists in economic informatics higher education. Database systems integrate and interact with several informatics technologies and are therefore more difficult to understand and use. Thus, students should already know a minimum set of mandatory concepts and their practical implementation: computer systems, programming techniques, programming languages, data structures. The article also presents the current trends in the evolution of database systems in the context of economic informatics.

  7. Exploration of a Vision for Actor Database Systems

    DEFF Research Database (Denmark)

    Shah, Vivek

    ... of these services. Existing popular approaches to building these services either use an in-memory database system or an actor runtime. We observe that these approaches have complementary strengths and weaknesses. In this dissertation, we propose the integration of actor programming models in database systems. In doing so, we lay down a vision for a new class of systems called actor database systems. To explore this vision, this dissertation crystallizes the notion of an actor database system by defining its feature set in light of current application and hardware trends. In order to explore the viability of the outlined vision, a new programming model named Reactors has been designed to enrich classic relational database programming models with logical actor programming constructs. To support the reactor programming model, a high-performance in-memory multi-core OLTP database system named REACTDB has been built ...

  8. Land Condition Trend Analysis Avian Database: Ecological Guild-based Summaries

    National Research Council Canada - National Science Library

    Schreiber, Eric

    1998-01-01

    Land Condition Trend Analysis (LCTA) bird database documentation capabilities often are limited to the generation of installation-wide species checklists, estimates of relative abundance, and evidence of breeding activity...

  9. Trends in adolescent bariatric surgery evaluated by UHC database collection.

    Science.gov (United States)

    Pallati, Pradeep; Buettner, Shelby; Simorov, Anton; Meyer, Avishai; Shaligram, Abhijit; Oleynikov, Dmitry

    2012-11-01

    With increasing childhood obesity, adolescent bariatric surgery has been increasingly performed. We used a national database to analyze current trends in laparoscopic bariatric surgery in the adolescent population and related short-term outcomes. Discharge data from the University Health System Consortium (UHC) database were accessed using International Classification of Disease codes during a 36 month period. UHC is an alliance of more than 110 academic medical centers and nearly 250 affiliate hospitals. All adolescent patients between 13 and 18 years of age, with the assorted diagnoses of obesity, who underwent laparoscopic adjustable gastric banding (LAGB), sleeve gastrectomy (SG), or laparoscopic Roux-en-Y gastric bypass (LRYGB) were evaluated. The main outcome measures analyzed were morbidity, mortality, length of hospital stay (LOS), overall cost, intensive care unit (ICU) admission rate, and readmission rate. These outcomes were compared to those of adult bariatric surgery. Adolescent laparoscopic bariatric surgery was performed on 329 patients. At the same time, 49,519 adult bariatric surgeries were performed. One hundred thirty-six adolescent patients underwent LAGB, 47 had SG, and 146 patients underwent LRYGB. LAGB has shown a decreasing trend (n = 68, 34, and 34), while SG has shown an increasing trend (n = 8, 15, and 24) over the study years. LRYGB remained stable (n = 44, 60, and 42) throughout the study period. The individual and summative morbidity and mortality rates for these procedures were zero. Compared to adult bariatric surgery, 30 day in-hospital morbidity was lower for adolescent bariatric surgery (0 vs. 2.2 %), while the ICU admission rate was higher (9.78 vs. 6.30 %). Current trends in adolescent laparoscopic bariatric surgery reveal the increased use of sleeve gastrectomy, with adjustable gastric banding falling out of favor.

  10. 7th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2015)

    CERN Document Server

    Nguyen, Ngoc; Batubara, John: New Trends in Intelligent Information and Database Systems

    2015-01-01

    Intelligent information and database systems are two closely related subfields of modern computer science which have been known for over thirty years. They focus on the integration of artificial intelligence and classic database technologies to create the class of next generation information systems. The book focuses on new trends in intelligent information and database systems and discusses topics addressed to the foundations and principles of data, information, and knowledge models, methodologies for intelligent information and database systems analysis, design, and implementation, their validation, maintenance and evolution. They cover a broad spectrum of research topics discussed both from the practical and theoretical points of view such as: intelligent information retrieval, natural language processing, semantic web, social networks, machine learning, knowledge discovery, data mining, uncertainty management and reasoning under uncertainty, intelligent optimization techniques in information systems, secu...

  11. Massively Parallel Sort-Merge Joins in Main Memory Multi-Core Database Systems

    OpenAIRE

    Martina-Cezara Albutiu, Alfons Kemper, Thomas Neumann

    2012-01-01

    Two emerging hardware trends will dominate the database system technology in the near future: increasing main memory capacities of several TB per server and massively parallel multi-core processing. Many algorithmic and control techniques in current database technology were devised for disk-based systems where I/O dominated the performance. In this work we take a new look at the well-known sort-merge join which, so far, has not been in the focus of research ...
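
    The join algorithm named in this abstract is standard textbook material; as a rough illustration (a sequential Python sketch, not the authors' massively parallel, NUMA-aware implementation), a sort-merge join of two in-memory relations looks like this:

        def sort_merge_join(left, right, key_left, key_right):
            # Sort both inputs on the join key; in the paper's setting this step
            # would run in parallel on per-core partitions.
            left = sorted(left, key=key_left)
            right = sorted(right, key=key_right)
            i = j = 0
            result = []
            while i < len(left) and j < len(right):
                kl, kr = key_left(left[i]), key_right(right[j])
                if kl < kr:
                    i += 1
                elif kl > kr:
                    j += 1
                else:
                    # Emit the cross product of all tuples sharing this key.
                    j_end = j
                    while j_end < len(right) and key_right(right[j_end]) == kl:
                        j_end += 1
                    while i < len(left) and key_left(left[i]) == kl:
                        result.extend((left[i], r) for r in right[j:j_end])
                        i += 1
                    j = j_end
            return result

        # Tiny usage example with made-up relations joined on their first field.
        orders = [(1, "book"), (2, "pen"), (2, "ink")]
        customers = [(1, "Ada"), (2, "Bob")]
        print(sort_merge_join(orders, customers,
                              key_left=lambda t: t[0],
                              key_right=lambda t: t[0]))

    The contribution of the work lies in how the sort and merge phases are partitioned across cores and NUMA regions, which the sketch above deliberately omits.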

  12. Database system of geological information for geological evaluation base of NPP sites(I)

    International Nuclear Information System (INIS)

    Lim, C. B.; Choi, K. R.; Sim, T. M.; No, M. H.; Lee, H. W.; Kim, T. K.; Lim, Y. S.; Hwang, S. K.

    2002-01-01

    This study aims to provide a database system for site suitability analyses of geological information and a processing program for domestic NPP site evaluation. The database system program includes MapObject provided by ESRI and Spread 3.5 OCX, and is coded in the Visual Basic language. Major functions of the database program include vector and raster format topographic maps, database design and application, geological symbol plotting, database searches for the plotted geological symbols, and so on. The program can also be applied not only to analyzing lineament trends but also to statistical treatment of geological site and laboratory information and sources, in the digital forms and with the algorithms that are in common international use.

  13. Massively Parallel Sort-Merge Joins in Main Memory Multi-Core Database Systems

    OpenAIRE

    Albutiu, Martina-Cezara; Kemper, Alfons; Neumann, Thomas

    2012-01-01

    Two emerging hardware trends will dominate the database system technology in the near future: increasing main memory capacities of several TB per server and massively parallel multi-core processing. Many algorithmic and control techniques in current database technology were devised for disk-based systems where I/O dominated the performance. In this work we take a new look at the well-known sort-merge join which, so far, has not been in the focus of research in scalable massively parallel mult...

  14. Trends in Solar energy Driven Vertical Ground Source Heat Pump Systems in Sweden - An Analysis Based on the Swedish Well Database

    Science.gov (United States)

    Juhlin, K.; Gehlin, S.

    2016-12-01

    Sweden is a world leader in developing and using vertical ground source heat pump (GSHP) technology. GSHP systems extract passively stored solar energy in the ground and the Earth's natural geothermal energy. Geothermal energy has been recognized as a renewable energy source in Sweden since 2007 and is the third largest renewable energy source in the country today. The Geological Survey of Sweden (SGU) is the authority in Sweden that provides open access geological data of rock, soil and groundwater for the public. All wells drilled must be registered in the SGU Well Database, and it is the well driller's duty to submit registration of drilled wells. Both active and passive geothermal energy systems are in use. Large GSHP systems, with at least 20 boreholes, are active geothermal energy systems. Energy is stored in the ground, which allows both comfort heating and cooling to be extracted. Active systems are therefore relevant for larger properties and industrial buildings. Since 1978 more than 600 000 wells (water wells, GSHP boreholes, etc.) have been registered in the Well Database, with around 20 000 new registrations per year. Of these wells, an estimated 320 000 are registered as GSHP boreholes. The vast majority of these boreholes are single boreholes for single-family houses. The number of properties with registered vertical borehole GSHP installations amounts to approximately 243 000. Of these sites, between 300 and 350 are large GSHP systems with at least 20 boreholes. While the increase in the number of new registrations for smaller homes and households has slowed down after the rapid development in the 1980s and 1990s, the larger installations for commercial and industrial buildings have increased in number over the last ten years. This poster uses data from the SGU Well Database to quantify and analyze the trends in vertical GSHP systems reported between 1978-2015 in Sweden, with special focus on large systems. From the new aggregated data, conclusions can be drawn about ...
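
    The poster's distinction between single-borehole installations and large systems (at least 20 boreholes on one property) amounts, computationally, to a simple aggregation over well registrations. A minimal Python sketch using invented (property, borehole) records rather than the real SGU Well Database schema:

        from collections import Counter

        # Hypothetical registrations: one (property_id, borehole_id) pair per
        # registered GSHP borehole; the real SGU records carry many more fields.
        registrations = [
            ("property_A", 1), ("property_A", 2),
            ("property_B", 1),
        ] + [("campus_C", n) for n in range(1, 26)]   # a 25-borehole system

        boreholes_per_property = Counter(prop for prop, _ in registrations)

        LARGE_SYSTEM_THRESHOLD = 20   # threshold used in the abstract
        large_systems = {p: n for p, n in boreholes_per_property.items()
                         if n >= LARGE_SYSTEM_THRESHOLD}

        print(f"{len(boreholes_per_property)} properties with GSHP boreholes, "
              f"{len(large_systems)} of them large systems: {large_systems}")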

  15. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. The discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy ...

  16. Time trends in prostate cancer surgery: data from an Internet-based multicentre database.

    Science.gov (United States)

    Schostak, Martin; Baumunk, Daniel; Jagota, Anita; Klopf, Christian; Winter, Alexander; Schäfers, Sebastian; Kössler, Robert; Brennecke, Volker; Fischer, Tom; Hagel, Susanne; Höchel, Steffen; Jäkel, Dierk; Lehsnau, Mike; Krege, Susanne; Rüffert, Bernd; Pretzer, Jana; Becht, Eduard; Zegenhagen, Thomas; Miller, Kurt; Weikert, Steffen

    2012-02-01

    To report our experience with an Internet-based multicentre database that enables tumour documentation, as well as the collection of quality-related parameters and follow-up data, in surgically treated patients with prostate cancer. The system was used to assess the quality of prostate cancer surgery and to analyze possible time-dependent trends in the quality of care. An Internet-based database system enabled a standardized collection of treatment data and clinical findings from the participating urological centres for the years 2005-2009. An analysis was performed aiming to evaluate relevant patient characteristics (age, pathological tumour stage, preoperative International Index of Erectile Function-5 score), intra-operative parameters (operating time, percentage of nerve-sparing operations, complication rate, transfusion rate, number of resected lymph nodes) and postoperative parameters (hospitalization time, re-operation rate, catheter indwelling time). Mean values were calculated and compared for each annual cohort from 2005 to 2008. The overall survival rate was also calculated for a subgroup of the Berlin patients. A total of 914, 1120, 1434 and 1750 patients submitted to radical prostatectomy in 2005, 2006, 2007 and 2008 were documented in the database. The mean age at the time of surgery remained constant (66 years) during the study period. More than half the patients already had erectile dysfunction before surgery (median International Index of Erectile Function-5 score of 19-20). During the observation period, there was a decrease in the percentage of pT2 tumours (1% in 2005; 64% in 2008) and a slight increase in the percentage of patients with lymph node metastases (8% in 2005; 10% in 2008). No time trend was found for the operating time (142-155 min) or the percentage of nerve-sparing operations (72-78% in patients without erectile dysfunction). A decreasing frequency was observed for the parameters: blood transfusions (1.9% in 2005; 0.5% in 2008) ...

  17. Column-oriented database management systems

    OpenAIRE

    Možina, David

    2013-01-01

    In the following thesis I present column-oriented databases. Among other things, I answer the question of why there is a need for a column-oriented database. In recent years there has been a lot of attention paid to column-oriented databases, even though the existence of columnar database management systems dates back to the early seventies of the last century. I compare both systems for database management – a column-oriented database system and a row-oriented database system ...

  18. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

    This book constitutes the refereed proceedings of the 16th International Conference on Database and Expert Systems Applications, DEXA 2005, held in Copenhagen, Denmark, in August 2005. The 92 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 390 submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, XML schemata, query evaluation, semantic processing, information retrieval, temporal and spatial databases, querying XML, organisational aspects of databases, natural language processing, ontologies, Web data extraction, semantic Web, data stream management, data extraction, distributed database systems ...

  19. Database Software for the 1990s.

    Science.gov (United States)

    Beiser, Karl

    1990-01-01

    Examines trends in the design of database management systems for microcomputers and predicts developments that may occur in the next decade. Possible developments are discussed in the areas of user interfaces, database programing, library systems, the use of MARC data, CD-ROM applications, artificial intelligence features, HyperCard, and…

  20. Coordinating Mobile Databases: A System Demonstration

    OpenAIRE

    Zaihrayeu, Ilya; Giunchiglia, Fausto

    2004-01-01

    In this paper we present the Peer Database Management System (PDBMS). This system runs on top of a standard database management system and allows it to connect its database with other (peer) databases on the network. A particularity of our solution is that PDBMS allows conventional database technology to be effectively operational in mobile settings. We think of database mobility as a database network, where databases appear and disappear spontaneously and their network access point ...

  1. Towards Sensor Database Systems

    DEFF Research Database (Denmark)

    Bonnet, Philippe; Gehrke, Johannes; Seshadri, Praveen

    2001-01-01

    ... These systems lack flexibility because data is extracted in a predefined way; also, they do not scale to a large number of devices because large volumes of raw data are transferred regardless of the queries that are submitted. In our new concept of sensor database system, queries dictate which data is extracted from the sensors. In this paper, we define the concept of sensor databases mixing stored data represented as relations and sensor data represented as time series. Each long-running query formulated over a sensor database defines a persistent view, which is maintained during a given time interval. We also describe the design and implementation of the COUGAR sensor database system.
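
    The central idea, a long-running query maintained as a persistent view over a time window, mixing a stored relation with sensor time series, can be pictured with a small sketch. The Python below is illustrative only (not the COUGAR implementation); the sensor table, threshold query and window are invented:

        import time

        # Stored data: an ordinary relation mapping sensor ids to locations.
        sensors = {1: "hall", 2: "reactor", 3: "roof"}

        class PersistentView:
            """Materializes a long-running query over sensor readings (time
            series) joined with the stored relation, for a sliding window."""
            def __init__(self, predicate, window_seconds):
                self.predicate = predicate
                self.window = window_seconds
                self.rows = []          # current contents of the view

            def on_reading(self, sensor_id, value, ts=None):
                ts = ts if ts is not None else time.time()
                if self.predicate(sensor_id, value):
                    self.rows.append((ts, sensor_id, sensors[sensor_id], value))
                # Drop rows that have fallen out of the time window.
                cutoff = ts - self.window
                self.rows = [r for r in self.rows if r[0] >= cutoff]

        # Query: "readings above 30 during the last hour", fed by new readings.
        hot = PersistentView(lambda sid, v: v > 30.0, window_seconds=3600)
        hot.on_reading(2, 35.2)
        hot.on_reading(1, 21.0)
        print(hot.rows)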

  2. OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM

    Science.gov (United States)

    Ackerson, T.

    1994-01-01

    The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser

  3. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  4. Current trends and new challenges of databases and web applications for systems driven biological research

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar eSreenivasaiah

    2010-12-01

    The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.

  5. The NCBI BioSystems database.

    Science.gov (United States)

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.

  6. Security Research on Engineering Database System

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The engine engineering database system is a CAD-oriented applied database management system with the capability of managing distributed data. The paper discusses the security issues of the engine engineering database management system (EDBMS). Through studying and analyzing database security, a series of security rules is derived which reaches the B1 level security standard, including discretionary access control (DAC), mandatory access control (MAC) and audit. The EDBMS implements functions of DAC, ...

  7. Database management system for large container inspection system

    International Nuclear Information System (INIS)

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    Large Container Inspection System (LCIS) based on radiation imaging technology is a powerful tool for the Customs to check the contents inside a large container without opening it. The author discusses a database application system, as a part of the Signal and Image System (SIS), for the LCIS. The basic requirements analysis was done first. Then the selection of computer hardware, operating system, and database management system was made according to the state of the technology and the products available on the market. Based on the above considerations, a database application system with central management and distributed operation features has been implemented.

  8. JT-60 database system, 2

    International Nuclear Information System (INIS)

    Itoh, Yasuhiro; Kurihara, Kenichi; Kimura, Toyoaki.

    1987-07-01

    The JT-60 central control system, ''ZENKEI'', collects the control and instrumentation data relevant to discharge as well as device status data for plant monitoring. The former, the engineering data, amounts to about 3 Mbytes per shot of discharge. The ''ZENKEI'' control system, which consists of seven minicomputers for on-line real-time control, has little capacity for handling such a large amount of data for physical and engineering analysis. In order to solve this problem, it was planned to establish the experimental database on the Front-end Processor (FEP) of the general purpose large computer in the JAERI Computer Center. The database management system (DBMS), therefore, has been developed for creating the database during the shot interval. The engineering data are shipped from ''ZENKEI'' to the FEP through a dedicated communication line after the shot. A hierarchical data model has been adopted in this database, which consists of data files in a tree structure with the three keys of system, discharge type and shot number. The JT-60 DBMS provides packages of data handling subroutines for interfacing the database with users' application programs. Subroutine packages supporting graphic processing and an access control function for security of the database are also provided in this DBMS. (author)
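
    The hierarchical model described here, data files addressed by the three keys system, discharge type and shot number, can be pictured as a nested-dictionary sketch (illustrative Python only, not the JAERI FEP implementation; the stored record is invented):

        from collections import defaultdict

        def tree():
            # An arbitrarily deep dictionary: each missing key creates a subtree.
            return defaultdict(tree)

        database = tree()

        def store(system, discharge_type, shot_number, data):
            database[system][discharge_type][shot_number] = data

        def fetch(system, discharge_type, shot_number):
            return database[system][discharge_type][shot_number]

        store("heating", "ohmic", 1234, {"Ip_MA": 2.1, "duration_s": 5.0})
        print(fetch("heating", "ohmic", 1234))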

  9. An XCT image database system

    International Nuclear Information System (INIS)

    Komori, Masaru; Minato, Kotaro; Koide, Harutoshi; Hirakawa, Akina; Nakano, Yoshihisa; Itoh, Harumi; Torizuka, Kanji; Yamasaki, Tetsuo; Kuwahara, Michiyoshi.

    1984-01-01

    In this paper, an expansion of the X-ray CT (XCT) examination history database to an XCT image database is discussed. The XCT examination history database has been constructed and used for daily examination and investigation in our hospital. This database consists of alpha-numeric information (locations, diagnoses and so on) of more than 15,000 cases, and for some of them we add tree-structured image data, which provides flexibility for various types of image data. This database system is written in the MUMPS database manipulation language. (author)

  10. Content And Multimedia Database Management Systems

    NARCIS (Netherlands)

    de Vries, A.P.

    1999-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. The main characteristic of the ‘database approach’ is that it increases the value of data by its emphasis on data

  11. Column-Oriented Database Systems (Tutorial)

    OpenAIRE

    Abadi, D.; Boncz, Peter; Harizopoulos, S.

    2009-01-01

    Column-oriented database systems (column-stores) have attracted a lot of attention in the past few years. Column-stores, in a nutshell, store each database table column separately, with attribute values belonging to the same column stored contiguously, compressed, and densely packed, as opposed to traditional database systems that store entire records (rows) one after the other. Reading a subset of a table’s columns becomes faster, at the potential expense of excessive disk-head s...
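
    The two layouts contrasted in the abstract can be pictured with a toy Python sketch (not tied to any particular system): the same table held row-wise and column-wise, where an aggregate over one attribute only touches that attribute's array in the columnar layout.

        # One logical table, two physical layouts.
        rows = [  # row-store: whole records stored one after the other
            {"id": 1, "name": "Ada", "salary": 5200},
            {"id": 2, "name": "Bob", "salary": 4800},
            {"id": 3, "name": "Cid", "salary": 5100},
        ]

        columns = {  # column-store: each attribute stored contiguously
            "id":     [1, 2, 3],
            "name":   ["Ada", "Bob", "Cid"],
            "salary": [5200, 4800, 5100],
        }

        # Scanning a single attribute reads only the "salary" array in the
        # columnar layout, which is why reading a subset of columns is cheaper.
        avg_row_store = sum(r["salary"] for r in rows) / len(rows)
        avg_col_store = sum(columns["salary"]) / len(columns["salary"])
        assert avg_row_store == avg_col_store
        print(avg_col_store)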

  12. Selection of nuclear power information database management system

    International Nuclear Information System (INIS)

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    Given the present state of database technology, in order to build the Chinese nuclear power information database (NPIDB) in the nuclear industry system efficiently and from a high starting point, an important task is to select a proper database management system (DBMS), which is crucial to building the database successfully. Therefore, this article explains how to build a practical information database about nuclear power, the functions of different database management systems, the reasons for selecting a relational database management system (RDBMS), the principles of selecting an RDBMS, the recommendation of the ORACLE management system as the software with which to build the database, and so on.

  13. Security aspects of database systems implementation

    OpenAIRE

    Pokorný, Tomáš

    2009-01-01

    The aim of this thesis is to provide a comprehensive overview of database systems security. The reader is introduced to the basics of information security and its development. The following chapter defines a concept of database system security using the ISO/IEC 27000 Standard. The findings from this chapter form a complex list of requirements on database security. One chapter also deals with legal aspects of this domain. The second part of this thesis offers a comparison of four object-relational database s...

  14. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: Service-level requirements and risk management Building and evolving an architecture for operational visibility ...

  15. A Relational Database System for Student Use.

    Science.gov (United States)

    Fertuck, Len

    1982-01-01

    Describes an APL implementation of a relational database system suitable for use in a teaching environment in which database development and database administration are studied, and discusses the functions of the user and the database administrator. An appendix illustrating system operation and an eight-item reference list are attached. (Author/JL)

  16. Jelly Views : Extending Relational Database Systems Toward Deductive Database Systems

    Directory of Open Access Journals (Sweden)

    Igor Wojnicki

    2004-01-01

    This paper describes the Jelly View technology, which provides a new, practical methodology for knowledge decomposition, storage, and retrieval within Relational Database Management Systems (RDBMS). Intensional Knowledge clauses (rules) are decomposed and stored in the RDBMS, forming reusable components. The results of the rule-based processing are visible as regular views, accessible through SQL. From the end-user point of view the processing capability becomes unlimited (arbitrarily complex queries can be constructed using Intensional Knowledge), while the most external queries are expressed in standard SQL. The RDBMS functionality becomes extended toward that of the Deductive Databases.
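
    The general pattern described here, a rule stored inside the RDBMS whose result is exposed as an ordinary view queried through SQL, can be mimicked with any relational engine. The sketch below uses Python's built-in sqlite3 module and an invented "grandparent" rule purely for illustration; it is not the Jelly View implementation.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE parent(parent TEXT, child TEXT);
            INSERT INTO parent VALUES ('ann','bob'), ('bob','cid'), ('bob','dee');

            -- The intensional rule grandparent(X,Z) :- parent(X,Y), parent(Y,Z)
            -- stored as a regular view, queryable with plain SQL.
            CREATE VIEW grandparent AS
                SELECT p1.parent AS grandparent, p2.child AS grandchild
                FROM parent p1 JOIN parent p2 ON p1.child = p2.parent;
        """)

        for row in conn.execute("SELECT * FROM grandparent"):
            print(row)   # ('ann', 'cid') and ('ann', 'dee')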

  17. Development of a PSA information database system

    International Nuclear Information System (INIS)

    Kim, Seung Hwan

    2005-01-01

    The need to develop a PSA information database for performing a PSA has been growing rapidly. For example, performing a PSA requires a lot of data to analyze, to evaluate the risk, to trace the process of results and to verify the results. A PSA information database is a system that stores all PSA-related information in a database and file system, with cross links to jump to the physical documents whenever they are needed. Korea Atomic Energy Research Institute is developing a PSA information database system, AIMS (Advanced Information Management System for PSA). The objective is to integrate and computerize all the distributed information of a PSA into one system and to enhance the accessibility of PSA information for all PSA-related activities. This paper describes how we implemented such a database-centered application in view of two areas: database design and data (document) services.

  18. Development of a personalized training system using the Lung Image Database Consortium and Image Database resource Initiative Database.

    Science.gov (United States)

    Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong

    2014-12-01

    The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability of dynamically selecting suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database. It provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool to enable trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful for enhancing the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
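
    The prediction step mentioned above, estimating how difficult an unseen case will be for a given trainee, can be approximated with a plain user-based collaborative filter. The toy Python sketch below uses invented difficulty ratings and cosine similarity, and omits the content-boosting part of CBCF:

        import math

        # difficulty[trainee][case] = observed difficulty score (1 easy .. 5 hard)
        difficulty = {
            "t1": {"case1": 2, "case2": 5, "case3": 4},
            "t2": {"case1": 1, "case2": 4},
            "t3": {"case1": 4, "case3": 5},
        }

        def cosine(a, b):
            # Similarity between two trainees' rating vectors.
            shared = set(a) & set(b)
            if not shared:
                return 0.0
            dot = sum(a[c] * b[c] for c in shared)
            na = math.sqrt(sum(v * v for v in a.values()))
            nb = math.sqrt(sum(v * v for v in b.values()))
            return dot / (na * nb)

        def predict(trainee, case):
            # Similarity-weighted average of other trainees' scores for the case.
            num = den = 0.0
            for other, scores in difficulty.items():
                if other == trainee or case not in scores:
                    continue
                s = cosine(difficulty[trainee], scores)
                num += s * scores[case]
                den += abs(s)
            return num / den if den else None

        print(predict("t2", "case3"))   # predicted difficulty of an unseen case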

  19. The design of distributed database system for HIRFL

    International Nuclear Information System (INIS)

    Wang Hong; Huang Xinmin

    2004-01-01

    This paper focuses on the distributed database system used in the HIRFL distributed control system. The database of this distributed database system is established with SQL Server 2000, and its application system adopts the Client/Server model. Visual C++ is used to develop the applications, and the applications use ODBC to access the database. (authors)

  20. Generable PEARL-realtime-database system

    International Nuclear Information System (INIS)

    Plessmann, K.W.; Duif, V.; Angenendt, F.

    1983-06-01

    This database system has been designed with special consideration of the requirements of process-control applications. For that purpose the attribute ''time'' is treated as the essential dimension of processes, affecting data treatment. In accordance with the varied requirements of process-control applications, the database system is generable, i.e. its size and collection of functions can be adapted to each implementation. The system is not tied to a single data model, so several models can be implemented. Using PEARL for the implementation gives the system a high degree of portability. (orig.) [de]

  1. Development of a database system for the calculation of indicators of environmental pressure caused by transport

    DEFF Research Database (Denmark)

    Giannouli, Myrsini; Samaras, Zissis; Keller, Mario

    2006-01-01

    The scope of this paper is to summarise a methodology developed for TRENDS (TRansport and ENvironment Database System). The main objective of TRENDS was the calculation of environmental pressure indicators caused by transport. The environmental pressures considered are associated with air emissions from the four main transport modes, i.e. road, rail, ships and air. In order to determine these indicators, a system for calculating a range of environmental pressures due to transport was developed within a PC-based MS Access environment. Emphasis is given to the latest features incorporated ... the production of collective results for all transport modes as well as a comparative assessment of air emissions produced by the various modes. Traffic activity and emission data obtained according to a basic (reference) scenario are displayed for the time period 1970-2020. In addition, a detailed assessment ...
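
    At its core, an air-emission pressure indicator of this kind combines traffic activity with mode- and pollutant-specific emission factors. A minimal Python sketch with invented activity figures and factors (not the TRENDS data or its MS Access implementation):

        # Hypothetical activity data (vehicle-km or tonne-km per year) and
        # NOx emission factors (g per unit of activity) for the four modes.
        activity = {"road": 4.5e11, "rail": 2.0e10, "ship": 6.0e10, "air": 3.0e10}
        ef_nox = {"road": 0.45, "rail": 0.08, "ship": 1.10, "air": 0.60}

        # Pressure indicator: annual NOx emissions per mode, in tonnes.
        nox_tonnes = {mode: activity[mode] * ef_nox[mode] / 1e6 for mode in activity}

        total = sum(nox_tonnes.values())
        for mode, t in sorted(nox_tonnes.items(), key=lambda kv: -kv[1]):
            print(f"{mode:5s} {t:12.0f} t NOx  ({100 * t / total:4.1f} %)")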

  2. Discovering new information in bibliographic databases

    Directory of Open Access Journals (Sweden)

    Emil Hudomalj

    2005-01-01

    Databases contain information that usually cannot be revealed by standard query systems. For that purpose, methods for knowledge discovery in databases can be applied, which enable the user to browse aggregated data, discover trends, produce online reports, explore possible new associations within the data, etc. Such methods are successfully employed in various fields, such as banking, insurance and telecommunications, while they are seldom used in libraries. The article reviews the development of query systems for bibliographic databases, including some early attempts to apply modern knowledge discovery methods. Analytical databases are described in more detail, since they usually serve as the basis for knowledge discovery. Data mining approaches are presented, since they are a central step in the knowledge discovery process. The key role of librarians, who can take part in developing systems for finding new information in existing bibliographic databases, is stressed.

  3. A Sandia telephone database system

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, S.D.; Tolendino, L.F.

    1991-08-01

    Sandia National Laboratories, Albuquerque, may soon have more responsibility for the operation of its own telephone system. The processes that constitute providing telephone service can all be improved through the use of a central data information system. We studied these processes, determined the requirements for a database system, then designed the first stages of a system that meets our needs for work order handling, trouble reporting, and ISDN hardware assignments. The design was based on an extensive set of applications that have been used for five years to manage the Sandia secure data network. The system utilizes an Ingres database management system and is programmed using the Application-By-Forms tools.

  4. 17th East European Conference on Advances in Databases and Information Systems and Associated Satellite Events

    CERN Document Server

    Cerquitelli, Tania; Chiusano, Silvia; Guerrini, Giovanna; Kämpf, Mirko; Kemper, Alfons; Novikov, Boris; Palpanas, Themis; Pokorný, Jaroslav; Vakali, Athena

    2014-01-01

    This book reports on state-of-the-art research and applications in the field of databases and information systems. It includes both fourteen selected short contributions, presented at the East-European Conference on Advances in Databases and Information Systems (ADBIS 2013, September 1-4, Genova, Italy), and twenty-six papers from ADBIS 2013 satellite events. The short contributions from the main conference are collected in the first part of the book, which covers a wide range of topics, like data management, similarity searches, spatio-temporal and social network data, data mining, data warehousing, and data management on novel architectures, such as graphics processing units, parallel database management systems, cloud and MapReduce environments. In contrast, the contributions from the satellite events are organized in five different parts, according to their respective ADBIS satellite event: BiDaTA 2013 (Special Session on Big Data: New Trends and Applications); GID 2013 – The Second International Workshop ...

  5. Reexamining Operating System Support for Database Management

    OpenAIRE

    Vasil, Tim

    2003-01-01

    In 1981, Michael Stonebraker [21] observed that database management systems written for commodity operating systems could not effectively take advantage of key operating system services, such as buffer pool management and process scheduling, due to expensive overhead and lack of customizability. The “not quite right” fit between these kernel services and the demands of database systems forced database designers to work around such limitations or re-implement some kernel functionality in user ...

  6. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. The data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 9 figs

  7. The magnet components database system

    International Nuclear Information System (INIS)

    Baggett, M.J.; Leedy, R.; Saltmarsh, C.; Tompkins, J.C.

    1990-01-01

    The philosophy, structure, and usage of MagCom, the SSC magnet components database, are described. The database has been implemented in Sybase (a powerful relational database management system) on a UNIX-based workstation at the Superconducting Super Collider Laboratory (SSCL); magnet project collaborators can access the database via network connections. The database was designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will facilitate the tracking and control of the production process as well as the correlation of magnet performance with the properties of its constituents. 3 refs., 10 figs

  8. Nuclear technology databases and information network systems

    International Nuclear Information System (INIS)

    Iwata, Shuichi; Kikuchi, Yasuyuki; Minakuchi, Satoshi

    1993-01-01

    This paper describes databases related to nuclear (science) technology and information network systems. The following contents are collected in this paper: the databases developed by JAERI, ENERGY NET, ATOM NET, the NUCLEN nuclear information database, INIS, the NUclear Code Information Service (NUCLIS), the Social Application of Nuclear Technology Accumulation project (SANTA), the Nuclear Information Database/Communication System (NICS), the reactor materials database, the radiation effects database, the NucNet European nuclear information database, and the reactor dismantling database. (J.P.N.)

  9. Database/Operating System Co-Design

    OpenAIRE

    Giceva, Jana

    2016-01-01

    We want to investigate how to improve the information flow between a database and an operating system, aiming for better scheduling and smarter resource management. We are interested in identifying the potential optimizations that can be achieved with a better interaction between a database engine and the underlying operating system, especially by allowing the application to get more control over scheduling and memory management decisions. Therefore, we explored some of the issues that arise ...

  10. Column-Oriented Database Systems (Tutorial)

    NARCIS (Netherlands)

    D. Abadi; P.A. Boncz (Peter); S. Harizopoulos

    2009-01-01

    Column-oriented database systems (column-stores) have attracted a lot of attention in the past few years. Column-stores, in a nutshell, store each database table column separately, with attribute values belonging to the same column stored contiguously, compressed, and densely packed, as

  11. Design of database management system for 60Co container inspection system

    International Nuclear Information System (INIS)

    Liu Jinhui; Wu Zhifang

    2007-01-01

    The functions of the database management system have been designed according to the features of the cobalt-60 container inspection system, and the related software has been constructed. Database querying and searching are included in the software. The database operation program is built on Microsoft SQL Server and Visual C++ under Windows 2000. The software realizes database querying, image and graph display, statistics, report forms and their printing, interface design, etc. The software is powerful and flexible for operation and information querying, and it has been successfully used in the real database management system of the cobalt-60 container inspection system. (authors)

  12. A Support Database System for Integrated System Health Management (ISHM)

    Science.gov (United States)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between

  13. Microcomputer Database Management Systems for Bibliographic Data.

    Science.gov (United States)

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  14. Database specification for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB)

    Energy Technology Data Exchange (ETDEWEB)

    Faby, E.Z.; Fluker, J.; Hancock, B.R.; Grubb, J.W.; Russell, D.L. [Univ. of Tennessee, Knoxville, TN (United States); Loftis, J.P.; Shipe, P.C.; Truett, L.F. [Oak Ridge National Lab., TN (United States)

    1994-03-01

    This Database Specification for the Worldwide Port System (WPS) Regional Integrated Cargo Database (ICDB) describes the database organization and storage allocation, provides the detailed data model of the logical and physical designs, and provides information for the construction of parts of the database such as tables, data elements, and associated dictionaries and diagrams.

  15. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    Directory of Open Access Journals (Sweden)

    Errol A. Blake

    2007-12-01

    Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that Traditional Database Security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper, a compilation of different journals, articles and classroom discussions, focuses on unifying the process of securing data or information whether it is in use, in storage or being transmitted. Promoting a change in Database Curriculum Development trends may also play a role in helping secure databases. This paper takes the approach that making a conscientious effort to unify the Database Security process, which includes the Database Management System (DBMS) selection process, following regulatory compliance, analyzing and learning from the mistakes of others, implementing networking security technologies, and securing the database, may prevent database breaches.

  16. Concurrency control in distributed database systems

    CERN Document Server

    Cellary, W; Gelenbe, E

    1989-01-01

    Distributed Database Systems (DDBS) may be defined as integrated database systems composed of autonomous local databases, geographically distributed and interconnected by a computer network.The purpose of this monograph is to present DDBS concurrency control algorithms and their related performance issues. The most recent results have been taken into consideration. A detailed analysis and selection of these results has been made so as to include those which will promote applications and progress in the field. The application of the methods and algorithms presented is not limited to DDBSs but a
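
    One family of the concurrency control algorithms such a monograph covers is locking-based. The minimal strict two-phase-locking sketch below (a single-site Python toy, not a distributed protocol) shows the basic discipline: locks are acquired while the transaction runs and released together at commit.

        class TwoPhaseLockingTxn:
            """Toy strict two-phase locking: acquire locks during execution,
            release everything only at commit (the shrinking phase)."""
            _lock_table = {}            # item -> transaction holding the lock

            def __init__(self, name):
                self.name = name
                self.held = set()

            def lock(self, item):
                owner = self._lock_table.get(item)
                if owner is not None and owner is not self:
                    raise RuntimeError(f"{self.name} must wait: {item!r} is locked")
                self._lock_table[item] = self
                self.held.add(item)

            def commit(self):
                for item in self.held:          # release all locks at once
                    del self._lock_table[item]
                self.held.clear()

        t1, t2 = TwoPhaseLockingTxn("T1"), TwoPhaseLockingTxn("T2")
        t1.lock("x")
        try:
            t2.lock("x")                        # conflicts until T1 commits
        except RuntimeError as err:
            print(err)
        t1.commit()
        t2.lock("x")                            # now succeeds
        print("T2 holds x")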

  17. [The future of clinical laboratory database management system].

    Science.gov (United States)

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of clinical laboratory database management systems, this study explains the difference between the Clinical Laboratory Information System and the Clinical Laboratory System. Although three kinds of database management systems (DBMS) were considered, including the relational model, the tree model and the network model, the relational model was found to be the best DBMS for the clinical laboratory database, based on our experience and the development of several clinical laboratory expert systems. As a future clinical laboratory database management system, an IC card system connected to an automatic chemical analyzer is proposed for personal health data management, and a microscope/video system is proposed for dynamic data management of leukocytes or bacteria.

  18. Issues in Big-Data Database Systems

    Science.gov (United States)

    2014-06-01

    that big data will not be manageable using conventional relational database technology, and it is true that alternative paradigms, such as NoSQL systems and search engines, have much to offer...scale well, and because integration with external data sources is so difficult. NoSQL systems are more open to this integration, and provide excellent

  19. Design of Database System of HIRFL-CSR Beam Line

    International Nuclear Information System (INIS)

    Li Peng; Li Ke; Yin Dayu; Yuan Youjin; Gou Shizhe

    2009-01-01

    This paper introduces the database design and optimization for the power supply system of the Lanzhou Heavy Ion Accelerator CSR (HIRFL-CSR) beam line. Based on the HIRFL-CSR main Oracle database system, an interface was designed to read parameters of the power supplies while achieving real-time monitoring. A new database system to store the history data of the power supplies was established at the same time, realizing data exchange between the Oracle database system and an Access database system. Meanwhile, the interface was designed for convenient printing and querying of parameters. (authors)

  20. Experience in running relational databases on clustered storage

    CERN Document Server

    Aparicio, Ruben Gaspar

    2015-01-01

    For the past eight years, the CERN IT Database group has based its backend storage on a NAS (Network-Attached Storage) architecture, providing database access via the NFS (Network File System) protocol. In the last two and a half years, our storage has evolved from a scale-up architecture to a scale-out one. This paper describes our setup and a set of functionalities providing key features to other services like Database on Demand [1] or the CERN Oracle backup and recovery service. It also outlines a possible trend of evolution that storage for databases could follow.

  1. Database system selection for marketing strategies support in information systems

    Directory of Open Access Journals (Sweden)

    František Dařena

    2007-01-01

    Full Text Available In today’s dynamically changing environment, marketing has a significant role. Creating successful marketing strategies requires a large amount of high-quality information of various kinds and data types. A powerful database management system is a necessary condition for supporting the creation of marketing strategies. The paper briefly describes the field of marketing strategies and specifies the features that database systems should provide in connection with supporting these strategies. Major commercial (Oracle, DB2, MS SQL, Sybase) and open-source (PostgreSQL, MySQL, Firebird) databases are then examined from the point of view of accordance with these characteristics, and their comparison is made. The results are useful for making the decision before acquiring a database system during the specification of an information system’s hardware architecture.

  2. Report of the SRC working party on databases and database management systems

    International Nuclear Information System (INIS)

    Crennell, K.M.

    1980-10-01

    An SRC working party, set up to consider the subject of support for databases within the SRC, was asked to identify interested individuals and user communities, establish which features of database management systems they felt were desirable, arrange demonstrations of possible systems, and then make recommendations for systems, funding, and likely manpower requirements. This report describes the activities and lists the recommendations of the working party, and contains a list of databases maintained or proposed by those who replied to a questionnaire. (author)

  3. Management system of instrument database

    International Nuclear Information System (INIS)

    Zhang Xin

    1997-01-01

    The author introduces a management system for an instrument database. The system has been developed with FoxPro on a network. It features a clear structure, easy operation, and flexible and convenient querying, as well as data safety and reliability.

  4. Characterization analysis database system (CADS). A system overview

    International Nuclear Information System (INIS)

    1997-12-01

    The CADS database is a standardized, quality-assured, and configuration-controlled data management system developed to assist in the task of characterizing the DOE surplus HEU material. Characterization of the surplus HEU inventory includes identifying the specific material; gathering existing data about the inventory; defining the processing steps that may be necessary to prepare the material for transfer to a blending site; and, ultimately, developing a range of the preliminary cost estimates for those processing steps. Characterization focuses on producing commercial reactor fuel as the final step in material disposition. Based on the project analysis results, the final determination will be made as to the viability of the disposition path for each particular item of HEU. The purpose of this document is to provide an informational overview of the CADS database, its evolution, and its current capabilities. This document describes the purpose of CADS, the system requirements it fulfills, the database structure, and the operational guidelines of the system

  5. Distributed Database Management Systems A Practical Approach

    CERN Document Server

    Rahimi, Saeed K

    2010-01-01

    This book addresses issues related to managing data across a distributed database system. It is unique because it covers traditional database theory and current research, explaining the difficulties in providing a unified user interface and global data dictionary. The book gives implementers guidance on hiding discrepancies across systems and creating the illusion of a single repository for users. It also includes three sample frameworks (implemented using J2SE with JMS, J2EE, and Microsoft .Net) that readers can use to learn how to implement a distributed database management system. IT and

  6. A dedicated database system for handling multi-level data in systems biology.

    Science.gov (United States)

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management and thereby facilitate data integration, modeling and analysis in systems biology within a sole database. In addition, a yeast data repository was implemented as an integrated database environment which is operated by the database system. Two applications were implemented to demonstrate extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific for two sample cases: 1) Detecting the pheromone pathway in protein interaction networks; and 2) Finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a sole database environment for systems biology research.

  7. Advanced approaches to intelligent information and database systems

    CERN Document Server

    Boonjing, Veera; Chittayasothorn, Suphamit

    2014-01-01

    This book consists of 35 chapters presenting different theoretical and practical aspects of Intelligent Information and Database Systems. Nowadays both Intelligent and Database Systems are applied in most areas of human activity, which necessitates further research in these areas. In this book various interesting issues related to intelligent information models and methods as well as their advanced applications, database systems applications, data models and their analysis, and digital multimedia methods and applications are presented and discussed, both from the practical and theoretical points of view. The book is organized in four parts devoted to intelligent systems models and methods, intelligent systems advanced applications, database systems methods and applications, and multimedia systems methods and applications. The book will be interesting for both practitioners and researchers, especially graduate and PhD students of information technology and computer science, as well as more experienced ...

  8. ALARA database value in future outage work planning and dose management

    Energy Technology Data Exchange (ETDEWEB)

    Miller, D.W.; Green, W.H. [Clinton Power Station Illinois Power Co., IL (United States)

    1995-03-01

    An ALARA database encompassing job-specific duration and man-rem plant-specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends emerging from the analysis of three refueling outages at Clinton Power Station. The conclusion reached, based on the hard data available, is that a relational database dose-tracking system is a valuable tool for planning future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more outage comparative data becomes available. Trends over a three-outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of the outage person-rem projection. The value of the database in projecting 1- and 5-year station person-rem estimates is discussed.

  9. An Introduction to the DB Relational Database Management System

    OpenAIRE

    Ward, J.R.

    1982-01-01

    This paper is an introductory guide to using the Db programs to maintain and query a relational database on the UNIX operating system. In the past decade, increasing interest has been shown in the development of relational database management systems. Db is an attempt to incorporate a flexible and powerful relational database system within the user environment presented by the UNIX operating system. The family of Db programs is useful for maintaining a database of information that i...

  10. A database prototype has been developed to help understand costs in photovoltaic systems

    International Nuclear Information System (INIS)

    Moorw, Larry M.

    2000-01-01

    High photovoltaic (PV) system costs hinder market growth. An approach to studying these costs has been developed using a database containing system, component and maintenance information. This data, which is both technical and non-technical in nature, is to be used to identify trends related to costs. A pilot database exists at this time and work is continuing. The results of this work may be used by the data owners to improve their operations, with the goal of sharing non-attributable information with the public and industry at large. The published objectives of the DOE PV program are to accelerate the development of PV as a national and global energy option, as well as ensure US technology and global market leadership. The approach to supporting these objectives is to understand what drives costs in PV applications. This paper and poster session describe work-in-progress in the form of a database that will help identify costs in PV systems. In an effort to address DOE's Five-Year PV Milestones, a program was established in the summer of 1999 to study system costs in three PV applications--solar home lighting, water pumping, and grid-tied systems. This work began with an RFQ requesting data from these types of systems. Creating a partnership with industry and other system organizations such as Non-Government Organizations (NGOs) was the approach chosen to maintain a close tie to the systems in the field. Nine participants were selected as partners, who provided data on their systems. Two activities are emphasized in this work. For the first, an iterative approach of developing baseline reliability and cost information with the participants was taken. This effort led to identifying typical components in these systems as well as the specific data (metrics) that would be needed in any analysis used to understand total system costs

  11. Performance Enhancements for Advanced Database Management Systems

    OpenAIRE

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  12. Performance assessment of EMR systems based on post-relational database.

    Science.gov (United States)

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all the system users to access data-with a fast response time-anywhere and at anytime. Performance tests of databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the identical database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed perfect performance in the real-time EMR system.

  13. Depiction of Trends in Administrative Healthcare Data from Hospital Information System.

    Science.gov (United States)

    Kalankesh, Leila R; Pourasghar, Faramarz; Jafarabadi, Mohammad Asghari; Khanehdan, Negar

    2015-06-01

    Administrative healthcare data are among the main components of a hospital information system. Such data can be analyzed and deployed for a variety of purposes. The principal aim of this research was to depict trends of administrative healthcare data from the HIS in a general hospital from March 2011 to March 2014. The data set used for this research was extracted from the SQL database of the hospital information system in Razi general hospital located in Marand. The data were saved as CSV (Comma Separated Values) in order to facilitate data cleaning and analysis. The variables of the data set included the patient's age, gender, final diagnosis, final diagnosis code based on the ICD-10 classification system, date of hospitalization, date of discharge, LOS (Length of Stay), ward, and survival status of the patient. Data were analyzed and visualized after applying appropriate cleansing and preparing techniques. Morbidity showed a constant trend over the three years. Pregnancy, childbirth and the puerperium were the leading category of final diagnosis (about 32.8%). The diseases of the circulatory system were the second class, accounting for 13 percent of the hospitalization cases. The diseases of the digestive system had the third rank (10%). Patients aged between 14 and 44 constituted a higher proportion of total cases. Diseases of the circulatory system were the most common class of diseases among elderly patients (age≥65). The highest rate of mortality was observed among patients with a final diagnosis of circulatory system diseases, followed by those with diseases of the respiratory system, and neoplasms. Mortality rates for ICU and CCU patients were 62% and 33%, respectively. The longest average LOS (7.3 days) was observed among patients hospitalized in the ICU, while patients in the Obstetrics and Gynecology ward had the shortest average LOS (2.4 days). Multiple regression analysis revealed that LOS was correlated with the variables of surgery, gender, and type of payment, ward, the
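
    The kind of summary described above (exporting HIS records to CSV and aggregating length of stay by ward) can be sketched with standard-library Python. This is only an illustration of the workflow, not the study's actual code; the file name and the column labels "ward" and "los" are assumptions.

        # Sketch: average length of stay per ward from a CSV export of HIS data.
        # The file name and column labels are illustrative assumptions.
        import csv
        from collections import defaultdict

        def mean_los_by_ward(path):
            totals, counts = defaultdict(float), defaultdict(int)
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    totals[row["ward"]] += float(row["los"])
                    counts[row["ward"]] += 1
            return {ward: totals[ward] / counts[ward] for ward in totals}

        if __name__ == "__main__":
            for ward, avg in sorted(mean_los_by_ward("his_export.csv").items()):
                print(f"{ward}: average LOS {avg:.1f} days")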

  14. JT-60 database system, 1

    International Nuclear Information System (INIS)

    Kurihara, Kenichi; Kimura, Toyoaki; Itoh, Yasuhiro.

    1987-07-01

    A sufficient software environment makes it possible to analyse discharge result data effectively. JT-60 discharge result data, collected by the supervisor, are transferred to the general-purpose computer through the new linkage channel and are converted into a database. The datafile in the database was designed to be surrounded by various interfaces. This structure preserves the reliability of the datafile and does not require the user to know the datafile structure. In addition, a support system for graphic processing was developed so that the user may easily obtain figures with some calculations. This paper reports on the basic concept and system design. (author)

  15. Online Databases for Health Professionals

    OpenAIRE

    Marshall, Joanne Gard

    1987-01-01

    Recent trends in the marketing of electronic information technology have increased interest among health professionals in obtaining direct access to online biomedical databases such as Medline. During 1985, the Canadian Medical Association (CMA) and Telecom Canada conducted an eight-month trial of the use made of online information retrieval systems by 23 practising physicians and one pharmacist. The results of this project demonstrated both the value and the limitations of these systems in p...

  16. Computer Application Of Object Oriented Database Management ...

    African Journals Online (AJOL)

    Object Oriented Systems (OOS) have been widely adopted in software engineering because of their superiority with respect to data extensibility. The present trend in the software engineering process (SEP) towards concurrent computing raises novel concerns for the facilities and technology available in database ...

  17. Quality assurance tracking and trending system (QATTS)

    International Nuclear Information System (INIS)

    Anderson, W.J.

    1987-01-01

    In 1984, the Philadelphia Electric Company (PECo) Quality Assurance (QA) Division recognized a need to modify the existing quality-finding tracking program to create a nuclear trending program that could detect trends in PECo-initiated findings that were not detectable to a day-to-day observer. Before 1984, each quality organization in PECo had a separate tracking system. An adequate quality trending program demanded that all findings be tracked in a common database. The Quality Assurance Tracking and Trending System (QATTS) is divided into two parts: an on-line subsystem that provides access to QATTS data via corporate computer data screens, and a reports and graphics subsystem that connects commercially available reporting and graphics software packages to the QATTS database. The QATTS can be accessed from any terminal connected to the mainframe computer at PECo headquarters. The paper discusses the tracking system, report generation, the responsible organization commitment tracking system (ROCT), and the trending program.

  18. Function and organization of CPC database system

    International Nuclear Information System (INIS)

    Yoshida, Tohru; Tomiyama, Mineyoshi.

    1986-02-01

    Developing computer programs is very time-consuming and expensive work. Therefore, it is desirable to make effective use of existing programs. For this purpose, researchers and technical staff need to obtain the relevant information easily. CPC (Computer Physics Communications) is a journal published to facilitate the exchange of physics programs and of relevant information about the use of computers in the physics community. There are about 1300 CPC programs in the JAERI computing center, and the number of programs is increasing. A new database system (the CPC database) has been developed to manage the CPC programs and their information. Users obtain information about all the programs stored in the CPC database. Users can also find and copy a necessary program by inputting the program name, the catalogue number, and the volume number. In this system, each operation is done by menu selection. Every CPC program is compressed and stored in the database; the required storage size is one third of that of the non-compressed format. Programs unused for a long time are moved to magnetic tape. The present report describes the CPC database system and the procedures for its use. (author)

  19. The ATLAS Distributed Data Management System & Databases

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...

  20. Offshore Blowouts, Causes and Trends

    Energy Technology Data Exchange (ETDEWEB)

    Holand, P

    1996-02-01

    The main objective of this doctoral thesis was to establish an improved design basis for offshore installations with respect to blowout risk analyses. The following sub-objectives are defined: (1) Establish an offshore blowout database suitable for risk analyses, (2) Compare the blowout risk related to loss of lives with the total offshore risk and the risk in other industries, (3) Analyse blowouts with respect to parameters that are important for describing and quantifying the blowout risk experienced, in order to answer questions such as during which operations blowouts have occurred, their direct causes, their frequency of occurrence, etc., (4) Analyse blowouts with respect to trends. The research strategy applied includes elements from both a survey strategy and a case study strategy. The data are systematized in the form of a new database developed from the MARINTEK database. Most blowouts in the analysed period occurred during drilling operations. Shallow gas blowouts were more frequent than deep blowouts, and workover blowouts occurred more often than deep development drilling blowouts. Relatively few blowouts occurred during completion, wireline and normal production activities. No significant trend in blowout occurrences as a function of time could be observed, except for completion blowouts, which showed a significantly decreasing trend. But there were trends regarding some important parameters for risk analyses, e.g. the ignition probability has decreased and diverter systems have improved. Only 3.5% of the fatalities occurred because of blowouts. 106 refs., 51 figs., 55 tabs.

  1. An Integrated Enterprise Accelerator Database for the SLC Control System

    International Nuclear Information System (INIS)

    2002-01-01

    Since its inception in the early 1980's, the SLC Control System has been driven by a highly structured memory-resident real-time database. While efficient, its rigid structure and file-based sources make it difficult to maintain and extract relevant information. The goal of transforming the sources for this database into a relational form is to enable it to be part of a Control System Enterprise Database that is an integrated central repository for SLC accelerator device and Control System data, with links to other associated databases. We have taken the concepts developed for the NLC Enterprise Database and used them to create and load a relational model of the online SLC Control System database. This database contains data and structure to allow querying and reporting on beamline devices, their associations and parameters. In the future this will be extended to allow generation of EPICS and SLC database files, setup of applications, and links to other databases such as accelerator maintenance, archive data, financial and personnel records, cabling information, documentation, etc. The database is implemented using Oracle 8i. In the short term it will be updated daily in batch from the online SLC database. In the longer term, it will serve as the primary source for Control System static data, an R and D platform for the NLC, and contribute to SLC Control System operations.
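
    A minimal sketch of the relational modelling described above (beamline devices with their associated parameters, queryable for reports) is shown below. The table, column, and device names are hypothetical, not the actual SLC schema, and SQLite stands in for Oracle 8i to keep the example self-contained.

        # Hypothetical relational sketch of devices and parameters, in the
        # spirit of the Control System Enterprise Database described above.
        # SQLite is used instead of Oracle 8i to keep the example runnable.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE device (
                device_id INTEGER PRIMARY KEY,
                name      TEXT UNIQUE NOT NULL,
                beamline  TEXT NOT NULL);
            CREATE TABLE parameter (
                device_id INTEGER REFERENCES device(device_id),
                name      TEXT NOT NULL,
                value     REAL,
                units     TEXT,
                PRIMARY KEY (device_id, name));
        """)
        conn.execute("INSERT INTO device VALUES (1, 'QUAD:LI21:201', 'LI21')")
        conn.execute("INSERT INTO parameter VALUES (1, 'BDES', 12.5, 'kG')")

        # Report-style query: all parameters of devices on one beamline.
        for row in conn.execute("""
                SELECT d.name, p.name, p.value, p.units
                FROM device d JOIN parameter p USING (device_id)
                WHERE d.beamline = 'LI21'"""):
            print(row)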

  2. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  6. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  7. Safety system function trends

    International Nuclear Information System (INIS)

    Johnson, C.

    1989-01-01

    This paper describes research to develop risk-based indicators of plant safety performance. One measure of the safety performance of operating nuclear power plants is the unavailability of important safety systems. Brookhaven National Laboratory and Science Applications International Corporation are evaluating ways to aggregate train-level or component-level data to provide such an indicator. This type of indicator would respond to changes in plant safety margins faster than the currently used indicator of safety system unavailability (i.e., safety system failures reported in licensee event reports). Trends in the proposed indicator would be one indication of trends in plant safety performance and maintenance effectiveness. This paper summarizes the basis for such an indicator, identifies technical issues to be resolved, and illustrates the potential usefulness of such indicators by means of computer simulations and case studies.

  8. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Document Server

    Dykstra, David

    2012-01-01

    One of the main attractions of non-relational "NoSQL" databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also has high scalability and wide-area distributability for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  9. Extended functions of the database machine FREND for interactive systems

    International Nuclear Information System (INIS)

    Hikita, S.; Kawakami, S.; Sano, K.

    1984-01-01

    Well-designed visual interfaces encourage non-expert users to use relational database systems. In systems such as office automation systems or engineering database systems, non-expert users interactively access the database from visual terminals. Depending on the situation, some users may want exclusive use of the database while other users share it. Because those jobs need a lot of time to complete, concurrency control must be well designed to enhance concurrency. The extended method of concurrency control of FREND is presented in this paper. The authors assume that systems are composed of workstations, a local area network, and the database machine FREND. This paper also stresses that those workstations and FREND must cooperate to complete concurrency control for interactive applications.

  10. Active in-database processing to support ambient assisted living systems.

    Science.gov (United States)

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
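
    The trigger mechanism described above can be illustrated with a small, self-contained sketch. It is not the paper's implementation (which runs in a full DBMS with stored procedures); SQLite triggers and hypothetical table names are used here only to show the idea of reacting to events inside the database.

        # Active-database sketch: a trigger reacts inside the database when a
        # sensor row indicating a bed-exit arrives, with no application polling.
        # Table and column names are hypothetical.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE bed_sensor (ts TEXT, occupied INTEGER);
            CREATE TABLE event_log  (ts TEXT, event TEXT);

            CREATE TRIGGER bed_exit AFTER INSERT ON bed_sensor
            WHEN NEW.occupied = 0
            BEGIN
                INSERT INTO event_log VALUES (NEW.ts, 'bed-exit');
            END;
        """)
        conn.execute("INSERT INTO bed_sensor VALUES ('2014-08-12T02:13', 0)")
        print(conn.execute("SELECT * FROM event_log").fetchall())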

  11. Developing of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, J.; Ryu, W. S.

    2002-01-01

    Constructing a database from the data produced in tensile experiments can increase the application of the test results. Basic data can also be retrieved easily from the database when a new experiment is being prepared, and high-quality results can be produced by comparison with the previous data. The development work requires more specific analysis and design to construct the database; after that, the best quality can be offered for customers' various requirements. In this thesis, the tensile database system was developed as an internet application using the JSP (Java Server Pages) tool.

  12. Top Four Trends in Student Information Systems

    Science.gov (United States)

    Weathers, Robert

    2013-01-01

    The modern student information systems (SIS) is a powerful administrative tool with robust functionality. As such, it is essential that school and district administrators consider the top trends in modern student information systems before going forward with system upgrades or new purchases. These trends, described herein, are: (1) Support for…

  13. Similarity joins in relational database systems

    CERN Document Server

    Augsten, Nikolaus

    2013-01-01

    State-of-the-art database systems manage and process a variety of complex objects, including strings and trees. For such objects equality comparisons are often not meaningful and must be replaced by similarity comparisons. This book describes the concepts and techniques to incorporate similarity into database systems. We start out by discussing the properties of strings and trees, and identify the edit distance as the de facto standard for comparing complex objects. Since the edit distance is computationally expensive, token-based distances have been introduced to speed up edit distance comput
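
    The edit distance mentioned above is the standard dynamic-programming measure of how many single-character insertions, deletions, and substitutions turn one string into another; a minimal sketch (not taken from the book) is:

        # Minimal dynamic-programming sketch of the (Levenshtein) edit distance.
        def edit_distance(a: str, b: str) -> int:
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                curr = [i]
                for j, cb in enumerate(b, 1):
                    curr.append(min(prev[j] + 1,                 # delete ca
                                    curr[j - 1] + 1,             # insert cb
                                    prev[j - 1] + (ca != cb)))   # substitute
                prev = curr
            return prev[-1]

        print(edit_distance("similarity", "similarly"))  # -> 2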

  14. Nuclear data processing using a database management system

    International Nuclear Information System (INIS)

    Castilla, V.; Gonzalez, L.

    1991-01-01

    A database management system that permits the design of relational models was used to create an integrated database with experimental and evaluated nuclear data. A system that reduces the time and cost of processing was created for EC-type computers or compatibles. A set of programs for the conversion from the nuclear calculated data output format to the EXFOR format was developed. A dictionary to perform retrospective searches in the ENDF database was also created.

  15. ALARA database value in future outage work planning and dose management

    International Nuclear Information System (INIS)

    Miller, D.W.; Green, W.H.

    1995-01-01

    An ALARA database encompassing job-specific duration and man-rem plant-specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends emerging from the analysis of three refueling outages at Clinton Power Station. The conclusion reached, based on the hard data available, is that a relational database dose-tracking system is a valuable tool for planning future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more outage comparative data becomes available. Trends over a three-outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of the outage person-rem projection. The value of the database in projecting 1- and 5-year station person-rem estimates is discussed

  16. Building a columnar database on shared main memory-based storage

    OpenAIRE

    Tinnefeld, Christian

    2014-01-01

    In the field of disk-based parallel database management systems, there exists a great variety of solutions based on a shared-storage or a shared-nothing architecture. In contrast, main memory-based parallel database management systems are dominated solely by the shared-nothing approach, as it preserves the in-memory performance advantage by processing data locally on each server. We argue that this unilateral development is going to cease due to the combination of the following three trends: a) Nowad...

  17. Comparing the Global Charcoal Database with Burned Area Trends from an Offline Fire Model Driven by the NCAR Last Millennium Ensemble

    Science.gov (United States)

    Schaefer, A.; Magi, B. I.; Marlon, J. R.; Bartlein, P. J.

    2017-12-01

    This study uses an offline fire model driven by output from the NCAR Community Earth System Model Last Millennium Ensemble (LME) to evaluate how climate, ecological, and human factors contributed to burned area over the past millennium, and uses the Global Charcoal Database (GCD) record of fire activity as a constraint. The offline fire model is similar to the fire module within the NCAR Community Land Model. The LME experiment includes 13 simulations of the Earth system from 850 CE through 2005 CE, and the fire model simulates burned area using LME climate and vegetation with imposed land use and land cover change. The fire model trends are compared to GCD records of charcoal accumulation rates derived from sediment cores. The comparisons are a way to assess the skill of the fire model, but also set up a methodology to directly test hypotheses of the main drivers of fire patterns over the past millennium. The focus is on regions selected from the GCD with high data density, and that have lake sediment cores that best capture the last millennium. Preliminary results are based on a fire model which excludes burning cropland and pasture land cover types, but this allows some assessment of how climate variability is captured by the fire model. Generally, there is good agreement between modeled burned area trends and fire trends from GCD for many regions of interest, suggesting the strength of climate variability as a control. At the global scale, trends and features are similar from 850 to 1700, which includes the Medieval Climate Anomaly and the Little Ice Age. After 1700, the trends significantly deviate, which may be due to non-cultivated land being converted to cultivated. In key regions of high data density in the GCD such as the Western USA, the trends agree from 850 to 1200 but diverge from 1200 to 1300. From 1300 to 1800, the trends show good agreement again. Implementing processes to include burning cultivated land within the fire model is anticipated to

  18. Developing of corrosion and creep property test database system

    International Nuclear Information System (INIS)

    Park, S. J.; Jun, I.; Kim, J. S.; Ryu, W. S.

    2004-01-01

    The corrosion and creep characteristics database systems were constructed using the data produced from corrosion and creep tests, and were designed to share the data and programs of the tensile, impact, and fatigue characteristics databases constructed since 2001, as well as of other characteristics databases that will be constructed in the future. Basic data can easily be retrieved from the corrosion and creep characteristics database systems when a new experiment is being prepared, and high-quality results can be produced by comparison with previous test results. The development work requires more specific analysis and design to construct the database; after that, the best quality can be offered for customers' various requirements. In this thesis, we describe the procedure for the analysis, design, and development of the impact and fatigue characteristics database systems, developed as an internet application using the JSP (Java Server Pages) tool.

  19. Developing of impact and fatigue property test database system

    International Nuclear Information System (INIS)

    Park, S. J.; Jun, I.; Kim, D. H.; Ryu, W. S.

    2003-01-01

    The impact and fatigue characteristics database systems were constructed using the data produced from impact and fatigue tests, and were designed to share the data and programs of the tensile characteristics database constructed in 2001, as well as of other characteristics databases that will be constructed in the future. Basic data can easily be retrieved from the impact and fatigue characteristics database systems when a new experiment is being prepared, and high-quality results can be produced by comparison with the previous data. The development work requires more specific analysis and design to construct the database; after that, the best quality can be offered for customers' various requirements. In this thesis, we describe the procedure for the analysis, design, and development of the impact and fatigue characteristics database systems, developed as an internet application using the JSP (Java Server Pages) tool.

  20. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    Science.gov (United States)

    Dykstra, Dave

    2012-12-01

    One of the main attractions of non-relational “NoSQL” databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  1. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    International Nuclear Information System (INIS)

    Dykstra, Dave

    2012-01-01

    One of the main attractions of non-relational “NoSQL” databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  2. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    Energy Technology Data Exchange (ETDEWEB)

    Dykstra, Dave [Fermilab

    2012-07-20

    One of the main attractions of non-relational NoSQL databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  3. Active In-Database Processing to Support Ambient Assisted Living Systems

    Directory of Open Access Journals (Sweden)

    Wagner O. de Morais

    2014-08-01

    Full Text Available As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.

  4. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  5. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Non-relational "NoSQL" databases such as Cassandra and CouchDB are best known for their ability to scale to large numbers of clients spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects, is based on traditional SQL databases but also has the same high scalability and wide-area distributability for an important subset of applications. This paper compares the architectures, behavior, performance, and maintainability of the two different approaches and identifies the criteria for choosing which approach to prefer over the other.

  6. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  7. An Analysis of Trends in U.S. Stormater Utility and Fee Systems

    OpenAIRE

    Kea, Kandace

    2015-01-01

    Many municipalities have established stormwater user fees (SUFs), commonly known as stormwater utilities, to raise revenue for stormwater management programs; however, little is known about the trends among the fees currently in existence. This research observes trends in the establishment, type, and magnitude of user fees by analyzing location, population density, home value, and establishment for a comprehensive national stormwater user fee database with data for 1,490 user fees. The Equivale...

  8. Database application research in real-time data access of accelerator control system

    International Nuclear Information System (INIS)

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing the various dedicated data structures with a mature, standardized database system is the future development direction of accelerator control systems. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the feasibility of applying database systems in accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)

  9. Design and implementation of typical target image database system

    International Nuclear Information System (INIS)

    Qin Kai; Zhao Yingjun

    2010-01-01

    It is necessary to provide essential background data and thematic data in a timely manner in image processing and application. In fact, an application is a procedure that integrates and analyzes different kinds of data. In this paper, the authors describe an image database system which classifies, stores, manages, and analyzes databases of different types, such as image, vector, spatial, and spatial target characteristics databases, and present its design and structure. (authors)

  10. A comparison of database systems for XML-type data.

    Science.gov (United States)

    Risse, Judith E; Leunissen, Jack A M

    2010-01-01

    In the field of bioinformatics interchangeable data formats based on XML are widely used. XML-type data is also at the core of most web services. With the increasing amount of data stored in XML comes the need for storing and accessing the data. In this paper we analyse the suitability of different database systems for storing and querying large datasets in general and Medline in particular. All reviewed database systems perform well when tested with small to medium sized datasets, however when the full Medline dataset is queried a large variation in query times is observed. There is not one system that is vastly superior to the others in this comparison and, depending on the database size and the query requirements, different systems are most suitable. The best all-round solution is the Oracle 11g database system using the new binary storage option. Alias-i's Lingpipe is a more lightweight, customizable and sufficiently fast solution. It does however require more initial configuration steps. For data with a changing XML structure Sedna and BaseX as native XML database systems or MySQL with an XML-type column are suitable.
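
    The task benchmarked above (storing XML records and querying fields inside them) can be illustrated with a small standard-library sketch. This is not one of the systems compared in the paper; it only shows the common pattern of keeping the raw document alongside an extracted, indexable key, and the Medline-like record layout is a simplified assumption.

        # Sketch: store an XML record with an extracted key column, then look it
        # up by key and parse the stored document. Record layout is assumed.
        import sqlite3
        import xml.etree.ElementTree as ET

        doc = "<article><pmid>12345</pmid><title>XML storage benchmarks</title></article>"

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE article (pmid TEXT PRIMARY KEY, xml TEXT)")
        pmid = ET.fromstring(doc).findtext("pmid")
        conn.execute("INSERT INTO article VALUES (?, ?)", (pmid, doc))

        # Fast lookup on the extracted key, then parse the stored document.
        row = conn.execute("SELECT xml FROM article WHERE pmid = '12345'").fetchone()
        print(ET.fromstring(row[0]).findtext("title"))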

  11. An inductive database system based on virtual mining views

    NARCIS (Netherlands)

    Blockeel, H.; Calders, T.G.K.; Fromont, É.; Goethals, B.; Prado, A.; Robardet, C.

    2012-01-01

    Inductive databases integrate database querying with database mining. In this article, we present an inductive database system that does not rely on a new data mining query language, but on plain SQL. We propose an intuitive and elegant framework based on virtual mining views, which are relational
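
    The central idea, exposing mining results as ordinary relational tables that are queried with plain SQL, can be sketched as follows. The table name and columns are illustrative assumptions, not the framework's actual virtual mining views, which are populated on demand by the mining algorithms rather than filled by hand.

        # Sketch: a data-mining question phrased as an ordinary SQL query over a
        # table of frequent itemsets. Table name, columns, and data are assumed.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE frequent_itemsets (itemset TEXT, support REAL)")
        conn.executemany("INSERT INTO frequent_itemsets VALUES (?, ?)",
                         [("beer", 0.40), ("beer,chips", 0.25), ("milk", 0.35)])

        for itemset, support in conn.execute(
                "SELECT itemset, support FROM frequent_itemsets "
                "WHERE support >= 0.3 ORDER BY support DESC"):
            print(itemset, support)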

  12. The Network Configuration of an Object Relational Database Management System

    Science.gov (United States)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  13. The magnet database system

    International Nuclear Information System (INIS)

    Ball, M.J.; Delagi, N.; Horton, B.; Ivey, J.C.; Leedy, R.; Li, X.; Marshall, B.; Robinson, S.L.; Tompkins, J.C.

    1992-01-01

    The Test Department of the Magnet Systems Division of the Superconducting Super Collider Laboratory (SSCL) is developing a central database of SSC magnet information that will be available to all magnet scientists at the SSCL or elsewhere, via network connections. The database contains information on the magnets' major components, configuration information (specifying which individual items were used in each cable, coil, and magnet), measurements made at major fabrication stages, and the test results on completed magnets. These data will facilitate the correlation of magnet performance with the properties of its constituents. Recent efforts have focused on the development of procedures for user-friendly access to the data, including displays in the format of the production "traveler" data sheets, standard summary reports, and a graphical interface for ad hoc queries and plots

  14. Database management in the new GANIL control system

    International Nuclear Information System (INIS)

    Lecorche, E.; Lermine, P.

    1993-01-01

    At the start of the new control system design, a decision was made to manage the huge amount of data by means of a database management system. The first implementations, built on the INGRES relational database, are described. Real-time and data management domains are shown, and problems induced by Ada/SQL interfacing are briefly discussed. Database management concerns the whole hardware and software configuration of the GANIL pieces of equipment and the alarm system, covering both the alarm configuration and the alarm logs. Another field of application encompasses the archiving of beam parameters as a function of the various kinds of beams accelerated at GANIL (ion species, energies, charge states). (author) 3 refs., 4 figs.

  15. LHCb Conditions database operation assistance systems

    International Nuclear Information System (INIS)

    Clemencic, M; Shapoval, I; Cattaneo, M; Degaudenzi, H; Santinelli, R

    2012-01-01

    The Conditions Database (CondDB) of the LHCb experiment provides versioned, time-dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger (HLT), reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues. The first system is a CondDB state tracking extension to the Oracle 3D Streams replication technology, to trap cases when the CondDB replication was corrupted. Second, an automated distribution system for the SQLite-based CondDB, also providing smart backup and checkout mechanisms for the CondDB managers and LHCb users respectively. And, finally, a system to verify and monitor the internal (CondDB self-consistency) and external (LHCb physics software vs. CondDB) compatibility. The former two systems are used in production in the LHCb experiment and have achieved the desired goal of higher flexibility and robustness for the management and operation of the CondDB. The latter one has been fully designed and is currently passing to the implementation stage.
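
    The core lookup that a conditions database provides, returning the value whose interval of validity contains a given event time, can be sketched as follows. The schema is a hypothetical simplification; SQLite is used directly rather than the CondDB software stack, and the tag name and payloads are invented for illustration.

        # Sketch of an interval-of-validity lookup for time-dependent conditions.
        # Schema, tag, and payload values are illustrative assumptions.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE conditions (
                            tag TEXT, since INTEGER, until INTEGER, payload TEXT)""")
        conn.executemany("INSERT INTO conditions VALUES (?, ?, ?, ?)", [
            ("Alignment", 0,    1000, "v1"),
            ("Alignment", 1000, 2000, "v2"),
        ])

        def condition_at(tag, event_time):
            # Return the payload whose validity interval contains event_time.
            row = conn.execute(
                "SELECT payload FROM conditions "
                "WHERE tag = ? AND since <= ? AND ? < until",
                (tag, event_time, event_time)).fetchone()
            return row[0] if row else None

        print(condition_at("Alignment", 1500))  # -> v2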

  16. The ALADDIN atomic physics database system

    International Nuclear Information System (INIS)

    Hulse, R.A.

    1990-01-01

    ALADDIN is an atomic physics database system which has been developed in order to provide a broadly-based standard medium for the exchange and management of atomic data. ALADDIN consists of a data format definition together with supporting software for interactive searches as well as for access to the data by plasma modeling and other codes. The ALADDIN system is designed to offer maximum flexibility in the choice of data representations and labeling schemes, so as to support a wide range of atomic physics data types and allow natural evolution and modification of the database as needs change. Associated dictionary files are included in the ALADDIN system for data documentation. The importance of supporting the widest possible user community was also central to the ALADDIN design, leading to the use of straightforward text files with concatenated data entries for the file structure, and the adoption of strict FORTRAN 77 code for the supporting software. This allows ready access to the ALADDIN system on the widest range of scientific computers, and easy interfacing with FORTRAN modeling codes, user-developed atomic physics codes and databases, etc. The supporting software consists of the ALADDIN interactive searching and data display code, together with the ALPACK subroutine package, which provides ALADDIN datafile searching and data retrieval capabilities to users' codes.

  17. Design of SMART alarm system using main memory database

    International Nuclear Information System (INIS)

    Jang, Kue Sook; Seo, Yong Seok; Park, Keun Oak; Lee, Jong Bok; Kim, Dong Hoon

    2001-01-01

    To achieve the design goals of the SMART alarm system, we first have to decide how to handle and manage alarm information and how to use the database. This paper therefore analyses the concepts and deficiencies of main memory databases applied in real-time systems. It then sets out the structure and processing principles of a main memory database using nonvolatile memory such as flash memory, and develops a recovery strategy and process board structures based on these. The paper shows that the resulting SMART alarm system design satisfies the required functions and requirements.
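    The combination described above, an in-memory table backed by nonvolatile storage for recovery, can be sketched generically as a write-ahead log that is replayed on restart. This is not the SMART design itself; the class name, log format and alarm identifier below are assumptions used only to illustrate the recovery idea.

```python
import json, os, tempfile

class MainMemoryAlarmDB:
    """Generic sketch: main-memory alarm table plus an append-only log on nonvolatile storage."""
    def __init__(self, log_path):
        self.log_path = log_path
        self.alarms = {}                      # alarm_id -> state, kept entirely in memory
        self._recover()

    def _recover(self):
        if os.path.exists(self.log_path):
            with open(self.log_path) as f:
                for line in f:                # replay log records written before the restart
                    rec = json.loads(line)
                    self.alarms[rec["id"]] = rec["state"]

    def set_alarm(self, alarm_id, state):
        with open(self.log_path, "a") as f:   # persist to the nonvolatile log before updating memory
            f.write(json.dumps({"id": alarm_id, "state": state}) + "\n")
            f.flush()
            os.fsync(f.fileno())
        self.alarms[alarm_id] = state

log = os.path.join(tempfile.gettempdir(), "smart_alarm.log")
db = MainMemoryAlarmDB(log)
db.set_alarm("RCS-PRESSURE-HI", "ACTIVE")
print(MainMemoryAlarmDB(log).alarms)          # state survives a simulated restart
```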

  18. SPIRE Data-Base Management System

    Science.gov (United States)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  19. Conceptual design of nuclear power plants database system

    International Nuclear Information System (INIS)

    Ishikawa, Masaaki; Izumi, Fumio; Sudoh, Takashi.

    1984-03-01

    This report is the result of the joint study on the developments of the nuclear power plants database system. The present conceptual design of the database system, which includes Japanese character processing and image processing, has been made on the data of safety design parameters mainly found in the application documents for reactor construction permit made available to the public. (author)

  20. Information retrieval system of nuclear power plant database (PPD) user's guide

    International Nuclear Information System (INIS)

    Izumi, Fumio; Horikami, Kunihiko; Kobayashi, Kensuke.

    1990-12-01

    A nuclear power plant database (PPD) and its retrieval system have been developed. The database involves a large number of safety design data of nuclear power plants, operating and planned in Japan. The information stored in the database can be retrieved at high speed, whenever they are needed, by use of the retrieval system. The report is a user's manual of the system to access the database utilizing a display unit of the JAERI computer network system. (author)

  1. Trend of research and development on clearance system for CBRNE agents

    International Nuclear Information System (INIS)

    Ishihara, Masayuki; Fujita, Masanori; Ishida, Natsuko; Hattori, Hidemi; Tachibana, Shoich; Nakamura, Shingo; Kanatani, Yasuhiro

    2013-01-01

    In disasters involving the wide variety of Chemical, Biological, Radiological, Nuclear and Explosive (CBRNE) agents, which are harmful and deadly poisonous, it is crucial to clear those agents effectively in order to prevent the damage from spreading and to recover from it. Clearance technologies for CBRNE disasters must be safe for the human body and friendly to the environment. In addition, they need scientific, data-based evidence of their effectiveness and safety. The aim of this article is to review the trend of research on the clearance of N, B and C agents in terms of rapidity, simplicity and economic rationality, and to consider an effective clearance system using adequate equipment and materials for the detoxification, decomposition and removal of each contaminant. (author)

  2. Airports and Navigation Aids Database System -

    Data.gov (United States)

    Department of Transportation — Airport and Navigation Aids Database System is the repository of aeronautical data related to airports, runways, lighting, NAVAID and their components, obstacles, no...

  3. Nuclear Criticality Information System. Database examples

    Energy Technology Data Exchange (ETDEWEB)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer.

  4. Nuclear Criticality Information System. Database examples

    International Nuclear Information System (INIS)

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer

  5. Thermodynamic database for the Co-Pr system

    Directory of Open Access Journals (Sweden)

    S.H. Zhou

    2016-03-01

    Full Text Available In this article, we describe (1) compositions for both as-cast and heat treated specimens, summarized in Table 1; (2) the determined enthalpy of mixing of the liquid phase, listed in Table 2; and (3) a thermodynamic database of the Co-Pr system in TDB format for the research article entitled "Chemical partitioning for the Co-Pr system: First-principles, experiments and energetic calculations to investigate the hard magnetic phase W". Keywords: Thermodynamic database of Co-Pr, Solution calorimeter measurement, Phase diagram Co-Pr

  6. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories of those related to (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for the category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open type combined database systems, which have been in progress generally as planned. Those for the category (2) include the techniques for color document inputting and information retrieval, meaning compiling, understanding highly overlapping data, and controlling data centered by drawings, which have been in progress generally as planned. Those for the category (3) include the techniques for improving resistance of the networks to obstruction, and security of the data in the networks, which have been in progress generally as planned. Those for the category (4) include the techniques for rule processing for development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives are finally achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  7. Overview of AEOD's program for trending reactor operational events

    International Nuclear Information System (INIS)

    Baranowsky, P.W.; O'Reilly, P.D.; Rasmuson, D.M.; Houghton, J.R.

    1994-01-01

    This paper presents an overview of the trending program being performed by AEOD. The major elements of the program include: (1) system and component reliability trending and analysis, (2) special data collection and analysis (e.g., IPE and PRA component failure data, common cause failure event data), (3) risk assessment of safety issues based on actual operating experience, (4) Accident Sequence Precursor (ASP) Program, and (5) trending US industry risk. AEOD plans to maintain up-to-date safety data trends for selected high risk or high regulatory profile components, systems, accident initiators, accident sequences, and regulatory issues. AEOD will also make greater use of PRA insights and perform limited probabilistic safety assessments to evaluate the safety significance of qualitative results. Examples of a system study and an issue evaluation are presented, as well as a summary of the common cause failure event database

  8. Centralized database for interconnection system design. [for spacecraft

    Science.gov (United States)

    Billitti, Joseph W.

    1989-01-01

    A database application called DFACS (Database, Forms and Applications for Cabling and Systems) is described. The objective of DFACS is to improve the speed and accuracy of interconnection system information flow during the design and fabrication stages of a project, while simultaneously supporting both the horizontal (end-to-end wiring) and the vertical (wiring by connector) design stratagems used by the Jet Propulsion Laboratory (JPL) project engineering community. The DFACS architecture is centered around a centralized database and program methodology which emulates the manual design process hitherto used at JPL. DFACS has been tested and successfully applied to existing JPL hardware tasks with a resulting reduction in schedule time and costs.

  9. Portable database driven control system for SPEAR

    International Nuclear Information System (INIS)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-04-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system. 10 refs., 1 fig.

  10. Portable database driven control system for SPEAR

    Energy Technology Data Exchange (ETDEWEB)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-04-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man-years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system. 10 refs., 1 fig.
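    The central idea in the two records above, a single human-readable design file that a program reads to generate the database structure, can be illustrated with a short sketch. The file format, element names and table layout below are invented for illustration and are not SPEAR's actual design-file syntax.

```python
import sqlite3

# Hypothetical design file: readable by people (comments) and by the generator program.
DESIGN_FILE = """
# element  type   position_m  length_m
B1         BEND   0.00        2.50
Q1         QUAD   3.10        0.60
Q2         QUAD   5.40        0.60
"""

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE element (name TEXT PRIMARY KEY, type TEXT, position_m REAL, length_m REAL)")

for line in DESIGN_FILE.splitlines():
    line = line.strip()
    if not line or line.startswith("#"):      # comment lines document the file for people
        continue
    name, etype, pos, length = line.split()
    con.execute("INSERT INTO element VALUES (?, ?, ?, ?)", (name, etype, float(pos), float(length)))

# Application code refers only to the database, and only in generic terms.
print(con.execute("SELECT name, position_m FROM element WHERE type = 'QUAD'").fetchall())
```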

  11. Kingfisher: a system for remote sensing image database management

    Science.gov (United States)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing amount of imagery collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many systems exist for photographic images, for example QBIC and IKONA, but they are not able to properly extract and describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the DBMS (Database Management System) to process a low amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). Local and global content descriptors are compared with the query image during the retrieval phase, and the results seem to be very encouraging.
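    The retrieval scheme described above reduces each image to content descriptors and ranks stored images by similarity to the query. The sketch below is a generic content-based retrieval illustration: the texture-like statistics, archive names and distance measure are assumptions and are not the descriptors actually used for the X-SAR archive.

```python
import math, random

def features(image):
    """image: 2-D list of grey levels. Returns (mean, std, mean absolute horizontal gradient)."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    std = math.sqrt(sum((p - mean) ** 2 for p in pixels) / len(pixels))
    grad = sum(abs(row[i + 1] - row[i]) for row in image for i in range(len(row) - 1))
    grad /= sum(len(row) - 1 for row in image)
    return (mean, std, grad)

def retrieve(query, archive):
    """Return the name of the archived image whose feature vector is closest to the query's."""
    qf = features(query)
    return min(archive, key=lambda name: math.dist(features(archive[name]), qf))

random.seed(0)
smooth = [[random.randint(90, 110) for _ in range(16)] for _ in range(16)]   # low-texture scene
rough  = [[random.choice((0, 255)) for _ in range(16)] for _ in range(16)]   # high-texture scene
archive = {"smooth_scene": smooth, "rough_scene": rough}
query = [[random.choice((0, 255)) for _ in range(16)] for _ in range(16)]
print(retrieve(query, archive))   # the high-texture query matches 'rough_scene'
```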

  12. Resource Survey Relational Database Management System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Mississippi Laboratories employ both enterprise and localized data collection systems for recording data. The databases utilized by these applications range from...

  13. Trends of Concurrent Ankle Arthroscopy at the Time of Operative Treatment of Ankle Fracture: A National Database Review.

    Science.gov (United States)

    Ackermann, Jakob; Fraser, Ethan J; Murawski, Christopher D; Desai, Payal; Vig, Khushdeep; Kennedy, John G

    2016-04-01

    The purpose of this study was to report trends associated with concurrent ankle arthroscopy at the time of operative treatment of ankle fracture. Current procedural terminology (CPT) billing codes were used to search the PearlDiver Patient Record Database and identify all patients who were treated for acute ankle fracture in the United States. The Medicare Standard Analytic Files were searchable between 2005 and 2011 and the United Healthcare Orthopedic Dataset from 2007 to 2011. Annual trends were expressed only between 2007 and 2011, as this was the common time period between the two databases. Demographic factors were identified for all procedures, as was the cost aspect using the Medicare data set. In total, 32 307 patients underwent open reduction internal fixation (ORIF) of an ankle fracture, of whom 313 (1.0%) had an ankle arthroscopy performed simultaneously. Of those 313 cases, 70 (22.4%) patients received microfracture treatment. Between 2005 and 2011, 85 203 patients were treated for an ankle fracture, whether via ORIF or closed treatment. Of these, a total of 566 patients underwent arthroscopic treatment within 7 years. The prevalence of arthroscopy after ankle fracture decreased significantly by 45% from 2007 to 2011. Although arthroscopy can be performed concurrently with ankle fracture treatment, it appears that only a small proportion of surgeons in the United States perform these procedures concurrently. Therapeutic, Level IV: Retrospective. © 2015 The Author(s).

  14. Distributed Database Control and Allocation. Volume 3. Distributed Database System Designer’s Handbook.

    Science.gov (United States)

    1983-10-01

    Contents excerpt: 2.7 Multiversion Data; 2.7.1 Multiversion Timestamping; 2.7.2 Multiversion Locking; 2.8 Combining the Techniques; 3. Database Recovery Algorithms. See [THEM79, GIFF79] for details. From Section 2.7, Multiversion Data: Let us return to a database system model where each logical data item is stored at one DM. In a multiversion database, each Write wi[x] produces a new copy (or version) of x, denoted xi. Thus, the value of x is a set of versions. For each
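    The multiversion idea in the excerpt above, each write producing a new version rather than overwriting, can be shown with a minimal sketch of multiversion timestamping: a read at timestamp ts sees the newest version written at or before ts. This is a generic illustration, not the specific algorithms of the cited report.

```python
class MultiversionStore:
    """Toy multiversion store: writes append versions, reads select by timestamp."""
    def __init__(self):
        self.versions = {}          # item -> list of (write_ts, value), kept sorted by ts

    def write(self, item, value, ts):
        self.versions.setdefault(item, []).append((ts, value))
        self.versions[item].sort()

    def read(self, item, ts):
        visible = [(wts, v) for wts, v in self.versions.get(item, []) if wts <= ts]
        if not visible:
            raise KeyError(f"{item} has no version at or before ts={ts}")
        return max(visible)[1]      # newest version visible to this timestamp

db = MultiversionStore()
db.write("x", 10, ts=1)
db.write("x", 20, ts=5)
print(db.read("x", ts=3))   # -> 10 : an older transaction still sees the earlier version
print(db.read("x", ts=9))   # -> 20
```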

  15. Development of database system on MOX fuel for water reactors (I)

    International Nuclear Information System (INIS)

    Kikuchi, Keiichi; Nakazawa, Hiroaki; Abe, Tomoyuki; Shirai, Takao

    2000-04-01

    JNC has conducted a great number of irradiation tests to develop MOX fuels for the Advanced Thermal Reactor and Light Water Reactors. In order to manage irradiation data consistently and to effectively utilize the valuable data obtained from the irradiation tests, we commenced construction of a database system on MOX fuel for water reactors in JFY 1998. Collection and selection of irradiation data and the relevant fuel fabrication data, design of the database system and preparation of assisting programs have been finished, and data registration onto the system is currently under way according to priority. The database system can be operated through a menu screen on a PC. About 94,000 records of data on 11 fuel assemblies in total have been registered onto the database up to the present. By completing registration of the remaining data and some modification of the system, if necessary, the database system is expected to be completed in JFY 2000. The completed database system is to be distributed to the relevant sections in JNC on CD-R as the medium. This report is an interim report covering JFY 1998 and 1999, which gives an explanation of the structure and a user's manual for the database prepared up to the present. (author)

  16. Failure and Maintenance Analysis Using Web-Based Reliability Database System

    International Nuclear Information System (INIS)

    Hwang, Seok Won; Kim, Myoung Su; Seong, Ki Yeoul; Na, Jang Hwan; Jerng, Dong Wook

    2007-01-01

    Korea Hydro and Nuclear Power Company (KHNP) has launched the development of a database system for PSA and Maintenance Rule implementation. It focuses on the easy processing of raw data into a credible and useful database for the risk-informed environment of nuclear power plant operation and maintenance. Even though KHNP had recently completed the PSA for all domestic NPPs as a requirement of the severe accident mitigation strategy, the component failure data were gathered only for the quantification purposes of that project, so the data were not efficient enough for the Living PSA or other generic purposes. Another reason to build a real-time database is the newly adopted Maintenance Rule, which requires the utility to continuously monitor the plant risk based on its operation and maintenance performance. Furthermore, as one of the preconditions for Risk Informed Regulation and Application, the nuclear regulatory agency of Korea requests the development and management of a domestic database system. KHNP has been accumulating operation and maintenance data on the Enterprise Resource Planning (ERP) system since it first went into operation in July 2003, but so far a systematic review has not been performed to apply the component failure and maintenance history to PSA and other reliability analyses. The data stored in PUMAS before the ERP system was introduced also need to be converted and managed into the new database structure and methodology. This reliability database system is a web-based interface on a UNIX server with an Oracle relational database. It is designed to be applicable to all domestic NPPs with a common database structure and web interfaces, so additional program development should not be necessary for data acquisition and processing in the near future. Categorization standards for systems and components have been implemented to analyze all domestic NPPs. For example, SysCode (for a system code) and CpCode (for a component code) were newly

  17. A59 Drum Activity database (DRUMAC): system documentation

    International Nuclear Information System (INIS)

    Keel, Alan.

    1993-01-01

    This paper sets out the requirements, database design, software module designs and test plans for DRUMAC (the Active handling Building Drum Activity Database) - a computer-based system to record the radiological inventory for LLW/ILW drums dispatched from the Active Handling Building. (author)

  18. Developing of database on nuclear power engineering and purchase of ORACLE system

    International Nuclear Information System (INIS)

    Liu Renkang

    1996-01-01

    This paper presents a point of view on the development of a database for nuclear power engineering and on the performance of the ORACLE database management system, concluding that ORACLE is a practical database system to purchase.

  19. A Bayesian model for anomaly detection in SQL databases for security systems

    NARCIS (Netherlands)

    Drugan, M.M.

    2017-01-01

    We focus on automatic anomaly detection in SQL databases for security systems. Many logs of database systems, here the Townhall database, contain detailed information about users, like the SQL queries and the response of the database. A database is a list of log instances, where each log instance is

  20. FY 1993 annual report. Survey and study on establishment of databases for body functions; 1993 nendo shintai kino database no kochiku ni kansuru chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    As part of the health/welfare-related information collection, analysis and information service project, the establishment of databases for human life technology and the body functions of the aged in an aging society is surveyed and studied. The survey/study on the establishment of human life technology for the aged covers the concept of human life technology, the systems of the databases for human life technology, and techniques for the database systems. The case study on the human life technology databases for the aged takes up the everyday life behaviors of the aged as models, and analyzes the human and life characteristics in everyday life to clarify the human characteristic, human performance and human life technology design data to be stored in the databases. The validity of the method developed by this project is tested for behaviors such as bathing and going out. For the establishment of databases for the body functions of the aged, literature surveys and interviews are conducted on technological trends. (NEDO)

  1. Database design for Physical Access Control System for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sathishkumar, T., E-mail: satishkumart@igcar.gov.in; Rao, G. Prabhakara, E-mail: prg@igcar.gov.in; Arumugam, P., E-mail: aarmu@igcar.gov.in

    2016-08-15

    Highlights: • Database design needs to be optimized and highly efficient for real time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: A Radio Frequency IDentification (RFID) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). The design also captures the mapping between the Employee Groups (EG) and the AZs. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.

  2. Database design for Physical Access Control System for nuclear facilities

    International Nuclear Information System (INIS)

    Sathishkumar, T.; Rao, G. Prabhakara; Arumugam, P.

    2016-01-01

    Highlights: • Database design needs to be optimized and highly efficient for real time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: A Radio Frequency IDentification (RFID) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). The design also captures the mapping between the Employee Groups (EG) and the AZs. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.
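    The grouping idea in the two records above, mapping employees to Employee Groups and doors to Access Zones so that a small EG-to-AZ table replaces a large employee-to-door mapping, can be sketched as follows. The table and column names are assumptions for illustration, not the paper's actual schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, name TEXT, eg_id INTEGER);
CREATE TABLE door     (door_id INTEGER PRIMARY KEY, label TEXT, az_id INTEGER);
CREATE TABLE eg_az    (eg_id INTEGER, az_id INTEGER, PRIMARY KEY (eg_id, az_id));
""")
con.executemany("INSERT INTO employee VALUES (?, ?, ?)", [(1, "A. Kumar", 10), (2, "B. Rao", 20)])
con.executemany("INSERT INTO door VALUES (?, ?, ?)", [(100, "Control Room", 1), (200, "Store", 2)])
con.executemany("INSERT INTO eg_az VALUES (?, ?)", [(10, 1), (10, 2), (20, 2)])

def has_access(emp_id, door_id):
    """Access is granted only if the employee's group is mapped to the door's access zone."""
    row = con.execute("""
        SELECT 1 FROM employee e
        JOIN door d   ON d.door_id = ?
        JOIN eg_az ga ON ga.eg_id = e.eg_id AND ga.az_id = d.az_id
        WHERE e.emp_id = ?""", (door_id, emp_id)).fetchone()
    return row is not None

print(has_access(1, 100), has_access(2, 100))   # True False
```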

  3. Scientometric trends and knowledge maps of global health systems research.

    Science.gov (United States)

    Yao, Qiang; Chen, Kai; Yao, Lan; Lyu, Peng-hui; Yang, Tian-an; Luo, Fei; Chen, Shan-quan; He, Lu-yang; Liu, Zhi-yong

    2014-06-05

    In the last few decades, health systems research (HSR) has garnered much attention, with a rapid increase in the related literature. This study aims to review and evaluate the global progress in HSR and assess the current quantitative trends. Based on data from the Web of Science database, scientometric methods and knowledge visualization techniques were applied to evaluate global scientific production and map the development trends of HSR from 1900 to 2012. HSR has increased rapidly over the past 20 years. Currently, there are 28,787 research articles published in 3,674 journals that are listed in 140 Web of Science subject categories. The research in this field has mainly focused on public, environmental and occupational health (6,178, 21.46%), health care sciences and services (5,840, 20.29%), and general and internal medicine (3,783, 13.14%). The top 10 journals had published 2,969 (10.31%) articles and received 5,229 local citations and 40,271 global citations. The top 20 authors together contributed 628 papers, which accounted for a 2.18% share of the cumulative worldwide publications. The most productive author was McKee, from the London School of Hygiene & Tropical Medicine, with 48 articles. In addition, the USA and American institutions ranked first in health systems research productivity, with high citation counts, followed by the UK and Canada. HSR is an interdisciplinary area. Organization for Economic Co-operation and Development countries are the leading nations in HSR. Meanwhile, American and Canadian institutions and the World Health Organization play a dominant role in the production, collaboration, and citation of high quality articles. Moreover, health policy and analysis research, health systems and sub-systems research, healthcare and services research, health, epidemiology and economics of communicable and non-communicable diseases, primary care research, health economics and health costs, and hospital pharmacy have been identified as the

  4. Formalization of Database Systems -- and a Formal Definition of {IMS}

    DEFF Research Database (Denmark)

    Bjørner, Dines; Løvengreen, Hans Henrik

    1982-01-01

    Drawing upon an analogy between programming language systems and database systems, we outline the requirements that architectural specifications of database systems must fulfil, and argue that only formal, mathematical definitions may satisfy these. Then we illustrate some aspects of, and touch upon some uses of, formal definitions of data models and database management systems. A formal model of IMS will carry this discussion. Finally we survey some of the existing literature on formal definitions of database systems. The emphasis will be on constructive definitions in the denotational semantics style of the VDM: Vienna Development Method. The role of formal definitions in international standardisation efforts is briefly mentioned.

  5. Interim evaluation report of the mutually operable database systems by different computers; Denshi keisanki sogo un'yo database system chukan hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

    This is the interim report on evaluation of the mutually operable database systems by different computers. The techniques for these systems fall into four categories of those related to (1) dispersed data systems, (2) multimedia, (3) high reliability, and (4) elementary techniques for mutually operable network systems. The techniques for the category (1) include those for vertically dispersed databases, database systems for multiple addresses in a wide area, and open type combined database systems, which have been in progress generally as planned. Those for the category (2) include the techniques for color document inputting and information retrieval, meaning compiling, understanding highly overlapping data, and controlling data centered by drawings, which have been in progress generally as planned. Those for the category (3) include the techniques for improving resistance of the networks to obstruction, and security of the data in the networks, which have been in progress generally as planned. Those for the category (4) include the techniques for rule processing for development of protocols, protocols for mutually connecting the systems, and high-speed, high-function networks, which have been in progress generally as planned. It is expected that the original objectives are finally achieved, because the development programs for these categories have been in progress generally as planned. (NEDO)

  6. A Transactional Asynchronous Replication Scheme for Mobile Database Systems

    Institute of Scientific and Technical Information of China (English)

    丁治明; 孟小峰; 王珊

    2002-01-01

    In mobile database systems, mobility of users has a significant impact on data replication. As a result, the various replica control protocols that exist today in traditional distributed and multidatabase environments are no longer suitable. To solve this problem, a new mobile database replication scheme, the Transaction-Level Result-Set Propagation (TLRSP)model, is put forward in this paper. The conflict detection and resolution strategy based on TLRSP is discussed in detail, and the implementation algorithm is proposed. In order to compare the performance of the TLRSP model with that of other mobile replication schemes, we have developed a detailed simulation model. Experimental results show that the TLRSP model provides an efficient support for replicated mobile database systems by reducing reprocessing overhead and maintaining database consistency.
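    The reconciliation problem described above, a mobile client working on a disconnected replica and later propagating its results to the fixed host, can be illustrated with a generic optimistic sketch: the client records what its transaction read and wrote, and the server accepts the propagated writes only if the read items are unchanged. This is an assumption-laden illustration in the spirit of transaction-level propagation; it does not reproduce the TLRSP conflict detection and resolution rules.

```python
server = {"x": (1, 100), "y": (1, 200)}        # item -> (version, value) on the fixed host

def run_mobile_txn(snapshot):
    """A disconnected transaction: record which versions were read and which values it wrote."""
    read_set = {"x": snapshot["x"][0]}          # item -> version observed at disconnection
    write_set = {"x": snapshot["x"][1] + 5}     # item -> new value produced locally
    return read_set, write_set

def propagate(read_set, write_set):
    """On reconnection, apply the propagated result set only if no read item changed meanwhile."""
    conflicts = [item for item, ver in read_set.items() if server[item][0] != ver]
    if conflicts:
        return False, conflicts                 # conflict detected: transaction must be reconciled
    for item, value in write_set.items():
        ver, _ = server[item]
        server[item] = (ver + 1, value)
    return True, []

snapshot = dict(server)                         # replica copied before disconnection
rs, ws = run_mobile_txn(snapshot)
server["x"] = (2, 111)                          # a concurrent update happens at the server
print(propagate(rs, ws))                        # -> (False, ['x'])
```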

  7. Nuclear integrated database and design advancement system

    International Nuclear Information System (INIS)

    Ha, Jae Joo; Jeong, Kwang Sub; Kim, Seung Hwan; Choi, Sun Young.

    1997-01-01

    The objective of NuIDEAS is to computerize design processes through an integrated database, eliminating the current work style of delivering hardcopy documents and drawings. The major research contents of NuIDEAS are the advancement of design processes by computerization, the establishment of a design database, and 3-dimensional visualization of design data. KSNP (Korea Standard Nuclear Power Plant) is the target of the legacy database and the 3-dimensional model, so that they can be utilized in the next plant design. In the first year, the blueprint of NuIDEAS was proposed and its prototype developed by applying rapidly evolving computer technology. The major results of the first-year research were to establish the architecture of the integrated database ensuring data consistency, and to build a design database of the reactor coolant system and heavy components. Various software tools were also developed to search, share and utilize the data through networks, detailed 3-dimensional CAD models of nuclear fuel and heavy components were constructed, and a walk-through simulation using the models was developed. This report contains the major additions and modifications to the object-oriented database and the associated program, using methods and JavaScript. (author). 36 refs., 1 tab., 32 figs

  8. The RMS program system and database

    International Nuclear Information System (INIS)

    Fisher, S.M.; Peach, K.J.

    1982-08-01

    This report describes the program system developed for the data reduction and analysis of data obtained with the Rutherford Multiparticle Spectrometer (RMS), with particular emphasis on the utility of a well structured central data-base. (author)

  9. JAERI Material Performance Database (JMPD); outline of the system

    International Nuclear Information System (INIS)

    Yokoyama, Norio; Tsukada, Takashi; Nakajima, Hajime.

    1991-01-01

    The JAERI Material Performance Database (JMPD) has been developed since 1986 at JAERI with a view to utilizing the various kinds of characteristic data of nuclear materials efficiently. The relational database management system PLANNER was employed, and supporting systems for data retrieval and output were expanded. JMPD currently serves the following data: (1) data yielded from the research activities of JAERI, including fatigue crack growth data of LWR pressure vessel materials as well as creep and fatigue data of the alloy developed for the High Temperature Gas-cooled Reactor (HTGR), Hastelloy XR; (2) data on environmentally assisted cracking of LWR materials arranged by the Electric Power Research Institute (EPRI), including fatigue crack growth data (3000 tests), stress corrosion data (500 tests) and Slow Strain Rate Technique (SSRT) data (1000 tests). In order to improve the user-friendliness of the retrieval system, menu-selection procedures have been developed in which knowledge of the system and data structures is not required of end-users. In addition, retrieval via database commands, Structured Query Language (SQL), is supported by the relational database management system. In JMPD the retrieved data can be processed readily through supporting systems for graphical and statistical analyses. The present report outlines JMPD and describes procedures for data retrieval and analyses utilizing JMPD. (author)

  10. Experience using a distributed object oriented database for a DAQ system

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    To configure the RD13 data acquisition system, we need many parameters which describe the various hardware and software components. Such information has been defined using an entity-relation model and stored in a commercial memory-resident database. During the last year, Itasca, an object oriented database management system (OODB), was chosen as a replacement database system. We have ported the existing databases (hardware and software configurations, run parameters, etc.) to Itasca and integrated it with the run control system. We believe that it is possible to use an OODB in real-time environments such as DAQ systems. In this paper, we present our experience and impressions: why we wanted to change from an entity-relational approach, some useful features of Itasca, the issues we met during this project, including the integration of the database into an existing distributed environment, and the factors which influence performance. (author)

  11. Asynchronous data change notification between database server and accelerator controls system

    International Nuclear Information System (INIS)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-01-01

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMS's which support DCN (such as Oracle and MS SQL server), some server side and/or client side programming may be required to make the DCN system work. This makes the setup of DCN between database server and interested clients tedious and time consuming. In accelerator control systems, there are many well established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
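    The mechanism described above combines a database trigger with a server process that forwards changes to clients. The sketch below illustrates the trigger half with sqlite3 standing in for the production DBMS, and a simple polling loop standing in for the reflection server; the table names are invented, and the real systems push updates through EPICS/CDEV/ADO servers rather than polling.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE setpoint   (name TEXT PRIMARY KEY, value REAL);
CREATE TABLE change_log (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT, value REAL);
CREATE TRIGGER notify_update AFTER UPDATE ON setpoint
BEGIN
    INSERT INTO change_log (name, value) VALUES (NEW.name, NEW.value);
END;
""")
con.execute("INSERT INTO setpoint VALUES ('magnet_current', 120.0)")

last_seen = 0
def poll_and_notify():
    """Stand-in for the reflection server: pick up unseen change-log rows and notify clients."""
    global last_seen
    rows = con.execute("SELECT id, name, value FROM change_log WHERE id > ?", (last_seen,)).fetchall()
    for rid, name, value in rows:
        last_seen = rid
        print(f"notify clients: {name} -> {value}")

con.execute("UPDATE setpoint SET value = 125.5 WHERE name = 'magnet_current'")
poll_and_notify()     # -> notify clients: magnet_current -> 125.5
```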

  12. Performance analysis of different database in new internet mapping system

    Science.gov (United States)

    Yao, Xing; Su, Wei; Gao, Shuai

    2017-03-01

    In the Mapping System of the New Internet, massive numbers of mapping entries between AID and RID need to be stored, added, updated, and deleted. In order to better handle large numbers of mapping entry update and query requests, the Mapping System of the New Internet must use a high-performance database. In this paper, we focus on the performance of three typical databases, Redis, SQLite, and MySQL, and the results show that a Mapping System based on different databases can be adapted to different needs according to the actual situation.

  13. The IAEA stopping power database, following the trends in stopping power of ions in matter

    Science.gov (United States)

    Montanari, C. C.; Dimitriou, P.

    2017-10-01

    The aim of this work is to present an overview of the state of art of the energy loss of ions in matter, based on the new developments in the stopping power database of the International Atomic Energy Agency (IAEA). This exhaustive collection of experimental data, graphs, programs and comparisons, is the legacy of Helmut Paul, who made it accessible to the global scientific community, and has been extensively employed in theoretical and experimental research during the last 25 years. The field of stopping power in matter is evolving, with new trends in materials of interest, including oxides, nitrides, polymers, and biological targets. Our goal is to identify areas of interest and emerging data needs to meet the requirements of a continuously developing user community.

  14. Development of the severe accident risk information database management system SARD

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies

  15. Development of the severe accident risk information database management system SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies.

  16. A Data Analysis Expert System For Large Established Distributed Databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

    The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments which allow cross-vendor compatibility. The goal of the DANMDS system is commensurate with the central dilemma confronting most large companies and institutions in America, the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  17. A Review of Stellar Abundance Databases and the Hypatia Catalog Database

    Science.gov (United States)

    Hinkel, Natalie Rose

    2018-01-01

    The astronomical community is interested in elements from lithium to thorium, from solar twins to peculiarities of stellar evolution, because they give insight into different regimes of star formation and evolution. However, while some trends between elements and other stellar or planetary properties are well known, many other trends are not as obvious and are a point of conflict. For example, stars that host giant planets are found to be consistently enriched in iron, but the same cannot be definitively said for any other element. Therefore, it is time to take advantage of large stellar abundance databases in order to better understand not only the large-scale patterns, but also the more subtle, small-scale trends within the data.In this overview to the special session, I will present a review of large stellar abundance databases that are both currently available (i.e. RAVE, APOGEE) and those that will soon be online (i.e. Gaia-ESO, GALAH). Additionally, I will discuss the Hypatia Catalog Database (www.hypatiacatalog.com) -- which includes abundances from individual literature sources that observed stars within 150pc. The Hypatia Catalog currently contains 72 elements as measured within ~6000 stars, with a total of ~240,000 unique abundance determinations. The online database offers a variety of solar normalizations, stellar properties, and planetary properties (where applicable) that can all be viewed through multiple interactive plotting interfaces as well as in a tabular format. By analyzing stellar abundances for large populations of stars and from a variety of different perspectives, a wealth of information can be revealed on both large and small scales.

  18. Functional integration of automated system databases by means of artificial intelligence

    Science.gov (United States)

    Dubovoi, Volodymyr M.; Nikitenko, Olena D.; Kalimoldayev, Maksat; Kotyra, Andrzej; Gromaszek, Konrad; Iskakova, Aigul

    2017-08-01

    The paper presents approaches for the functional integration of automated system databases by means of artificial intelligence. The peculiarities of exploiting databases in systems that use a fuzzy implementation of functions are analyzed, and requirements for the normalization of such databases are defined. The question of data equivalence under uncertainty, and of collisions arising in the functional integration of databases, is considered, and a model to reveal their possible occurrence is devised. The paper also presents an evaluation method for the normalization of integrated databases.

  19. Development of a database system for the calculation of indicators of environmental pressure caused by transport

    Energy Technology Data Exchange (ETDEWEB)

    Giannouli, Myrsini; Samaras, Zissis [Aristotle University of Thessaloniki, Laboratory of Applied Thermodynamics, Mechanical Engineering Department, GR 54124, Thessaloniki, P.O. Box 458 (Greece); Keller, Mario; De Haan, Peter [INFRAS, Muhlemattstrasse 45 CH-3007, Bern (Switzerland); Kallivoda, Manfred [psiA-Consult, Environmental Research and Engineering GmbH, Lastenstrasse 38/1, 1230 Wien (Austria); Sorenson, Spencer; Georgakaki, Aliki [DTU: Technical University of Denmark, Nils Koppels Alle, Building 403, DK 2800 Kgs. Lyngby (Denmark)

    2006-03-15

    The scope of this paper is to summarise a methodology developed for TRENDS (TRansport and ENvironment Database System). The main objective of TRENDS was the calculation of environmental pressure indicators caused by transport. The environmental pressures considered are associated with air emissions from the four main transport modes, i.e. road, rail, ships and air. In order to determine these indicators, a system for calculating a range of environmental pressures due to transport was developed within a PC-based MS Access environment. Emphasis is given to the latest features incorporated in the model and their applications. One of the recently developed features of the software provides an option for simple scenario analysis including vehicle dynamics (such as turnover and evolution) for all EU15 member states. This feature is called the Transport Activity Balance module (TAB) and enables the production of collective results for all transport modes as well as a comparative assessment of the air emissions produced by the various modes. Traffic activity and emission data obtained according to a basic (reference) scenario are displayed for the time period 1970-2020. In addition, a detailed assessment of the results produced by TRENDS was conducted by means of comparison with data found in the literature. Finally, the vehicle emissions produced by the model for the EU15 member states were spatially disaggregated for the base year, 1995, and GIS maps were generated. Examples of these maps are displayed in this document for the various modes of transport considered in the study. (author)
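    The bookkeeping behind such indicators, combining transport activity with mode- and pollutant-specific emission factors, can be shown with a toy calculation. The modes, activity figures and emission factors below are placeholders, not TRENDS data.

```python
activity = {            # annual transport activity per mode (illustrative units and values)
    "road": 4.0e9,
    "rail": 1.2e9,
    "ship": 0.8e9,
}
emission_factor = {     # grams of NOx per unit of activity (illustrative values)
    "road": 0.45,
    "rail": 0.02,
    "ship": 0.60,
}

def nox_emissions_tonnes(activity, emission_factor):
    """Multiply activity by the emission factor for each mode and convert grams to tonnes."""
    return {mode: activity[mode] * emission_factor[mode] / 1e6 for mode in activity}

totals = nox_emissions_tonnes(activity, emission_factor)
print(totals)
print("all modes:", sum(totals.values()), "t NOx/year")
```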

  20. Computerized database management system for breast cancer patients.

    Science.gov (United States)

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL (My Structured Query Language) is selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in this system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that can control the MySQL database is developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is from 50 to 59 years old. The results suggest that the chance of developing breast cancer increases in older women and is reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
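    The kind of aggregation such an embedded calculation tool performs, counting incidence by race and by age band, can be sketched as below, with sqlite3 standing in for MySQL so the example is self-contained; the patient rows are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, race TEXT, age INTEGER, breastfed INTEGER)")
con.executemany("INSERT INTO patient (race, age, breastfed) VALUES (?, ?, ?)", [
    ("Malay", 55, 0), ("Malay", 58, 1), ("Chinese", 52, 0), ("Indian", 61, 1), ("Malay", 47, 0),
])

# Incidence count by race, highest first.
for race, n in con.execute("SELECT race, COUNT(*) FROM patient GROUP BY race ORDER BY COUNT(*) DESC"):
    print(race, n)

# Incidence count by 10-year age band.
for band, n in con.execute(
        "SELECT (age / 10) * 10 AS band, COUNT(*) FROM patient GROUP BY band ORDER BY band"):
    print(f"{band}-{band + 9} years:", n)
```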

  1. LHCb Conditions Database Operation Assistance Systems

    CERN Multimedia

    Shapoval, Illya

    2012-01-01

    The Conditions Database of the LHCb experiment (CondDB) provides versioned, time dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger, reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues: - an extension to the automatic content validation done by the “Oracle Streams” replication technology, to trap cases when the replication was unsuccessful; - an automated distribution process for the S...

  2. How the choice of Operating System can affect databases on a Virtual Machine

    OpenAIRE

    Karlsson, Jan; Eriksson, Patrik

    2014-01-01

    As databases grow in size, the need for optimizing databases is becoming a necessity. Choosing the right operating system to support your database becomes paramount to ensure that the database is fully utilized. Furthermore with the virtualization of operating systems becoming more commonplace, we find ourselves with more choices than we ever faced before. This paper demonstrates why the choice of operating system plays an integral part in deciding the right database for your system in a virt...

  3. Experimental database retrieval system 'DARTS'

    International Nuclear Information System (INIS)

    Aoyagi, Tetsuo; Tani, Keiji; Haginoya, Hirobumi; Naito, Shinjiro.

    1989-02-01

    In JT-60, a large tokamak device of the Japan Atomic Energy Research Institute (JAERI), a plasma is fired for 5 ∼ 10 seconds at intervals of about 10 minutes. Each firing is called a shot. Plasma diagnostic data are edited into the JT-60 experimental database at every shot cycle and are stored in a large-scale computer (FACOM-M780). Experimentalists look up the data for the specific shots which they want to analyze and consider. As the total number of shots increases, this look-up work becomes difficult. So that they can easily access their objective shot data or shot-group data from a computer terminal, 'DARTS' (DAtabase ReTrieval System) has been developed. This report provides the information users need to handle DARTS. (author)

  4. Towards a Component Based Model for Database Systems

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2004-02-01

    Full Text Available Due to their effectiveness in the design and development of software applications and due to their recognized advantages in terms of reusability, Component-Based Software Engineering (CBSE) concepts have been arousing a great deal of interest in recent years. This paper presents and extends a component-based approach to object-oriented database systems (OODB) introduced by us in [1] and [2]. Components are proposed as a new abstraction level for database systems, logical partitions of the schema. In this context, the scope is introduced as an escalated property for transactions. Components are studied from the integrity, consistency, and concurrency control perspective. The main benefits of our proposed component model for OODB are the reusability of the database design, including the access statistics required for a proper query optimization, and a smooth information exchange. The integration of crosscutting concerns into the component database model using aspect-oriented techniques is also discussed. One of the main goals is to define a method for the assessment of component composition capabilities. These capabilities are restricted by the component's interface and measured in terms of adaptability, degree of compose-ability and acceptability level. The above-mentioned metrics are extended from database components to generic software components. This paper extends and consolidates into one common view the ideas previously presented by us in [1, 2, 3]. [1] Octavian Paul Rotaru, Marian Dobre, Component Aspects in Object Oriented Databases, Proceedings of the International Conference on Software Engineering Research and Practice (SERP'04), Volume II, ISBN 1-932415-29-7, pages 719-725, Las Vegas, NV, USA, June 2004. [2] Octavian Paul Rotaru, Marian Dobre, Mircea Petrescu, Integrity and Consistency Aspects in Component-Oriented Databases, Proceedings of the International Symposium on Innovation in Information and Communication Technology (ISIICT

  5. An anomaly analysis framework for database systems

    NARCIS (Netherlands)

    Vavilis, S.; Egner, A.I.; Petkovic, M.; Zannone, N.

    2015-01-01

    Anomaly detection systems are usually employed to monitor database activities in order to detect security incidents. These systems raise an alert when anomalous activities are detected. The raised alerts have to be analyzed to timely respond to the security incidents. Their analysis, however, is

  6. Plant operation data collection and database management using NIC system

    International Nuclear Information System (INIS)

    Inase, S.

    1990-01-01

    The Nuclear Information Center (NIC), a division of the Central Research Institute of Electric Power Industry, collects nuclear power plant operation and maintenance information both in Japan and abroad and transmits the information to all domestic utilities so that it can be effectively utilized for safe plant operation and reliability enhancement. The collected information is entered into the database system after being key-worded by NIC. The database system, Nuclear Information database/Communication System (NICS), has been developed by NIC for storage and management of the collected information. The keywords are used for retrieval and for classification by keyword category

  7. A user's manual for managing database system of tensile property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kim, D. H.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the tensile database system, which manages tensile property test data. A database built from tensile test data increases the usefulness of the test results: basic data can easily be retrieved when a new experiment is prepared, and better results can be produced by comparison with previous data. Developing such a database requires careful analysis and design of the application; only then can the varied requirements of its users be met with good quality. The tensile database system was developed as an internet application using Java, PL/SQL and JSP (Java Server Pages) tools

  8. Stress Testing of Transactional Database Systems

    OpenAIRE

    Meira , Jorge Augusto; Cunha De Almeida , Eduardo; Sunyé , Gerson; Le Traon , Yves; Valduriez , Patrick

    2013-01-01

    Transactional database management systems (DBMS) have been successful at supporting traditional transaction processing workloads. However, web-based applications that tend to generate huge numbers of concurrent business operations are pushing DBMS performance to its limits, thus threatening overall system availability. A crucial question, then, is how to test DBMS performance under heavy workload conditions. Answering this question requires a testing methodology to ...
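
    As a rough illustration of the kind of heavy concurrent workload the abstract refers to, the sketch below opens one connection per worker thread and submits many short write transactions, counting commits and lock conflicts. It uses SQLite purely to stay self-contained and is not the testing methodology proposed in the paper:

      import os
      import sqlite3
      import tempfile
      import threading

      db_path = os.path.join(tempfile.mkdtemp(), "stress.db")
      init = sqlite3.connect(db_path)
      init.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY AUTOINCREMENT, amount REAL)")
      init.commit()
      init.close()

      results = {"committed": 0, "conflicts": 0}
      counter_lock = threading.Lock()

      def worker(n_txn):
          # Each worker gets its own connection; a short busy timeout makes
          # lock contention visible instead of silently waiting.
          con = sqlite3.connect(db_path, timeout=0.05)
          for i in range(n_txn):
              try:
                  with con:  # one short transaction per iteration
                      con.execute("INSERT INTO orders (amount) VALUES (?)", (float(i),))
                  with counter_lock:
                      results["committed"] += 1
              except sqlite3.OperationalError:  # e.g. "database is locked"
                  with counter_lock:
                      results["conflicts"] += 1
          con.close()

      threads = [threading.Thread(target=worker, args=(200,)) for _ in range(8)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()
      print(results)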

  9. Software Classifications: Trends in Literacy Software Publication and Marketing.

    Science.gov (United States)

    Balajthy, Ernest

    First in a continuing series of reports on trends in marketing and publication of software for literacy education, a study explored the development of a database to track the trends and reported on trends seen in 1995. The final version of the 1995 database consisted of 1011 software titles, 165 of which had been published in 1995 and 846…

  10. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the Oracle Database 10g Express Edition database system, and a demonstration of administration tasks in this database system. The design was verified by means of a purpose-built access application.

  11. A portable database-driven control system for SPEAR

    International Nuclear Information System (INIS)

    Howry, S.; Gromme, T.; King, A.; Sullenberger, M.

    1985-01-01

    The new computer control system software for SPEAR is presented as a transfer from the PEP system. Features of the target ring (SPEAR) such as symmetries, magnet groupings, etc., are all contained in a design file which is read by both people and computer. People use it as documentation; a program reads it to generate the database structure, which becomes the center of communication for all the software. Geometric information, such as element positions and lengths, and CAMAC I/O routing information is entered into the database as it is developed. Since application processes refer only to the database and since they do so only in generic terms, almost all of this software (representing more than fifteen man years) is transferred with few changes. Operator console menus (touchpanels) are also transferred with only superficial changes for the same reasons. The system is modular: the CAMAC I/O software is all in one process; the menu control software is a process; the ring optics model and the orbit model are separate processes, each of which runs concurrently with about 15 others in the multiprogramming environment of the VAX/VMS operating system

  12. Expert database system for quality control

    Science.gov (United States)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today. Markets are not homogeneous; they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs and, above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating the knowledge of experts in operations, quality management and computer systems, uses all information relevant to quality management, facts as well as rules, to determine whether a product meets quality standards. Keywords: expert system, quality control, database

  13. Multi-dimensional database design and implementation of dam safety monitoring system

    Directory of Open Access Journals (Sweden)

    Zhao Erfeng

    2008-09-01

    Full Text Available To improve the effectiveness of dam safety monitoring database systems, the development process of a multi-dimensional conceptual data model was analyzed and a logic design was achieved in multi-dimensional database mode. The optimal data model was confirmed by identifying data objects, defining relations and reviewing entities. The conversion of relations among entities to external keys and entities and physical attributes to tables and fields was interpreted completely. On this basis, a multi-dimensional database that reflects the management and analysis of a dam safety monitoring system on monitoring data information has been established, for which factual tables and dimensional tables have been designed. Finally, based on service design and user interface design, the dam safety monitoring system has been developed with Delphi as the development tool. This development project shows that the multi-dimensional database can simplify the development process and minimize hidden dangers in the database structure design. It is superior to other dam safety monitoring system development models and can provide a new research direction for system developers.
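
    The fact-table/dimension-table layout mentioned above can be sketched minimally as a star schema; the table and column names below are hypothetical and are not the schema of the dam safety monitoring system itself:

      import sqlite3

      con = sqlite3.connect(":memory:")
      # Dimension tables: what was measured, where, and when.
      con.execute("CREATE TABLE dim_instrument (instrument_id INTEGER PRIMARY KEY, kind TEXT, location TEXT)")
      con.execute("CREATE TABLE dim_time (time_id INTEGER PRIMARY KEY, day TEXT, month TEXT)")
      # Fact table: one row per reading, keyed by the dimensions.
      con.execute("""CREATE TABLE fact_reading (
                         instrument_id INTEGER REFERENCES dim_instrument(instrument_id),
                         time_id INTEGER REFERENCES dim_time(time_id),
                         value REAL)""")

      con.execute("INSERT INTO dim_instrument VALUES (1, 'piezometer', 'left abutment')")
      con.execute("INSERT INTO dim_time VALUES (10, '2008-09-01', '2008-09'), (11, '2008-09-02', '2008-09')")
      con.execute("INSERT INTO fact_reading VALUES (1, 10, 12.4), (1, 11, 12.9)")

      # A typical roll-up query: monthly average per instrument kind.
      rollup = """SELECT i.kind, t.month, AVG(f.value)
                  FROM fact_reading f
                  JOIN dim_instrument i USING (instrument_id)
                  JOIN dim_time t USING (time_id)
                  GROUP BY i.kind, t.month"""
      for row in con.execute(rollup):
          print(row)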

  14. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  15. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  16. A database system for enhancing fuel records management capabilities

    International Nuclear Information System (INIS)

    Rieke, Phil; Razvi, Junaid

    1994-01-01

    The need to modernize the system of managing a large variety of fuel related data at the TRIGA Reactors Facility at General Atomics, as well as the need to improve NRC nuclear material reporting, prompted the development of a database to cover all aspects of fuel records management. The TRIGA Fuel Database replaces (a) an index card system used for recording fuel movements, (b) hand calculations for uranium burnup, and (c) a somewhat aged and cumbersome system of recording fuel inspection results. It was developed using Microsoft Access, a relational database system for Windows. Instead of relying on various sources for element information, users may now review individual element statistics, record inspection results, calculate element burnup and more, all from within a single application. Taking full advantage of the ease-of-use features designed into Windows and Access, the user can enter and extract information easily through a number of customized on-screen forms, with a wide variety of reporting options available. All forms are accessed through a main 'Options' screen, with the options broken down by categories, including 'Elements', 'Special Elements/Devices', 'Control Rods' and 'Areas'. Relational integrity and data validation rules are enforced to assist in ensuring accurate and meaningful data is entered. Among other items, the database lets the user define: element types (such as FLIP or standard) and subtypes (such as fuel follower, instrumented, etc.), various inspection codes for standardizing inspection results, areas within the facility where elements are located, and the power factors associated with element positions within a reactor. Using fuel moves, power history, power factors and element types, the database tracks uranium burnup and plutonium buildup on a quarterly basis. The Fuel Database was designed with end-users in mind and does not force an operations-oriented user to learn any programming or relational database theory in
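
    The quarterly burnup bookkeeping can be pictured with a deliberately simplified sketch: each element is credited, for every interval it spends in a core position, with reactor energy weighted by that position's power factor. The formula, field names and numbers below are illustrative assumptions, not the calculation actually used in the TRIGA Fuel Database:

      from dataclasses import dataclass
      from datetime import date

      @dataclass
      class CoreInterval:
          element_id: str
          position: str
          start: date
          end: date

      # Hypothetical inputs: average reactor power (MW) per interval and
      # relative power factors per core position.
      reactor_power_mw = {("2024-01-01", "2024-03-31"): 1.0}
      power_factor = {"B1": 1.15, "C4": 0.92}

      def quarterly_burnup_mwd(intervals, power_mw, factors):
          # Toy model: burnup (MW-days) = average power * days * position power factor.
          burnup = {}
          for iv in intervals:
              days = (iv.end - iv.start).days
              p = power_mw[(iv.start.isoformat(), iv.end.isoformat())]
              burnup[iv.element_id] = burnup.get(iv.element_id, 0.0) + p * days * factors[iv.position]
          return burnup

      moves = [CoreInterval("E-1234", "B1", date(2024, 1, 1), date(2024, 3, 31)),
               CoreInterval("E-5678", "C4", date(2024, 1, 1), date(2024, 3, 31))]
      print(quarterly_burnup_mwd(moves, reactor_power_mw, power_factor))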

  17. Expert system for quality control in the INIS database

    International Nuclear Information System (INIS)

    Todeschini, C.; Tolstenkov, A.

    1990-05-01

    An expert system developed to identify input items to INIS database with a high probability of containing errors is described. The system employs a Knowledge Base constructed by the interpretation of a large number of intellectual choices or expert decisions made by human indexers and incorporated in the INIS database. On the basis of the descriptor indexing, the system checks the correctness of the categorization. A notable feature of the system is its capability of self improvement by the continuous updating of the Knowledge Base. The expert system has also been found to be extremely useful in identifying documents with poor indexing. 3 refs, 9 figs

  18. Expert system for quality control in the INIS database

    Energy Technology Data Exchange (ETDEWEB)

    Todeschini, C; Tolstenkov, A [International Atomic Energy Agency, Vienna (Austria)

    1990-05-01

    An expert system developed to identify input items to INIS database with a high probability of containing errors is described. The system employs a Knowledge Base constructed by the interpretation of a large number of intellectual choices or expert decisions made by human indexers and incorporated in the INIS database. On the basis of the descriptor indexing, the system checks the correctness of the categorization. A notable feature of the system is its capability of self improvement by the continuous updating of the Knowledge Base. The expert system has also been found to be extremely useful in identifying documents with poor indexing. 3 refs, 9 figs.

  19. Research and Implementation of Distributed Database HBase Monitoring System

    Directory of Open Access Journals (Sweden)

    Guo Lisi

    2017-01-01

    Full Text Available With the arrival of the big data age, the distributed database HBase has become an important tool for storing massive data. The normal operation of the HBase database is an important guarantee of the security of data storage, so designing a sound HBase monitoring system is of great practical significance. In this article, we introduce a solution, containing performance monitoring and fault alarm modules, that meets an operator's demand for monitoring the HBase database in actual production projects. We designed a monitoring system which consists of a flexible and extensible monitoring agent, a monitoring server based on the SSM architecture, and a concise monitoring display layer. Moreover, to deal with pages rendering too slowly in actual operation, we present a solution: reducing the number of SQL queries. It has been proved that reducing SQL queries can effectively improve system performance and user experience. The system works well in monitoring the status of the HBase database, flexibly extending the monitoring indices, and issuing a warning when a fault occurs, so that it improves the working efficiency of the administrator and ensures the smooth operation of the project.
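
    The query-reduction optimization described above amounts to replacing one query per monitored node with a single query whose result is grouped in application memory. A generic sketch, with hypothetical table and metric names and SQLite standing in for the monitoring server's real store:

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE metrics (region_server TEXT, metric TEXT, value REAL)")
      con.executemany("INSERT INTO metrics VALUES (?, ?, ?)", [
          ("rs-01", "readRequestCount", 1200), ("rs-01", "writeRequestCount", 300),
          ("rs-02", "readRequestCount", 950), ("rs-02", "writeRequestCount", 410),
      ])

      # Slow pattern: one round trip per region server (N queries for N servers).
      servers = [r[0] for r in con.execute("SELECT DISTINCT region_server FROM metrics")]
      slow = {s: dict(con.execute("SELECT metric, value FROM metrics WHERE region_server = ?", (s,)))
              for s in servers}

      # Faster pattern: a single query, grouped in application memory.
      fast = {}
      for server, metric, value in con.execute("SELECT region_server, metric, value FROM metrics"):
          fast.setdefault(server, {})[metric] = value

      assert slow == fast
      print(fast)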

  20. System maintenance test plan for the TWRS controlled baseline database system

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    TWRS [Tank Waste Remediation System] Controlled Baseline Database, formerly known as the Performance Measurement Control System, is used to track and monitor TWRS project management baseline information. This document contains the maintenance testing approach for software testing of the TCBD system once SCR/PRs are implemented

  1. ADVICE--Educational System for Teaching Database Courses

    Science.gov (United States)

    Cvetanovic, M.; Radivojevic, Z.; Blagojevic, V.; Bojovic, M.

    2011-01-01

    This paper presents a Web-based educational system, ADVICE, that helps students to bridge the gap between database management system (DBMS) theory and practice. The usage of ADVICE is presented through a set of laboratory exercises developed to teach students conceptual and logical modeling, SQL, formal query languages, and normalization. While…

  2. Insertion algorithms for network model database management systems

    Science.gov (United States)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for the managing algorithms is to minimize the number of query comparisons. We consider the update operation for network model database management systems and develop a new sequential algorithm for it. We also suggest a distributed version of the algorithm.
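
    One way to picture the partial-order constraint mentioned above: if the schema graph must stay acyclic, each insertion of a relationship type can be guarded by a reachability test. The sketch below is a generic DAG insertion check, not the sequential or distributed algorithm proposed by the authors:

      from collections import defaultdict

      class SchemaGraph:
          # Object types as nodes, relationship types as directed arcs; kept acyclic.

          def __init__(self):
              self.arcs = defaultdict(set)

          def _reachable(self, src, dst):
              stack, seen = [src], set()
              while stack:
                  node = stack.pop()
                  if node == dst:
                      return True
                  if node not in seen:
                      seen.add(node)
                      stack.extend(self.arcs[node])
              return False

          def insert_arc(self, owner, member):
              # Refuse the insertion if member already reaches owner:
              # adding owner -> member would then close a cycle.
              if self._reachable(member, owner):
                  raise ValueError(f"{owner} -> {member} would break the partial order")
              self.arcs[owner].add(member)

      g = SchemaGraph()
      g.insert_arc("Department", "Employee")
      g.insert_arc("Employee", "Timesheet")
      try:
          g.insert_arc("Timesheet", "Department")  # would create a cycle
      except ValueError as err:
          print(err)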

  3. Fire auto alarm system intelligent trend

    International Nuclear Information System (INIS)

    Du Chengbao

    1997-01-01

    The author reviews the course and trend of fire alarm systems as they become more computerized and more intelligent, and argues that only a system applying artificial intelligence and fuzzy control is a truly intelligent fire alarm system. A detailed analysis is given of artificial-intelligence signal processing applied to analogue fire alarm systems, as well as of alarm systems controlled by fuzzy techniques and artificial neural networks

  4. PFTijah: text search in an XML database system

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Rode, H.; van Os, R.; Flokstra, Jan

    2006-01-01

    This paper introduces the PFTijah system, a text search system that is integrated with an XML/XQuery database management system. We present examples of its use, we explain some of the system internals, and discuss plans for future work. PFTijah is part of the open source release of MonetDB/XQuery.

  5. Database Description - Trypanosomes Database | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Trypanosomes Database: general information. Database maintainer: Institute of Genetics, Research Organization of Information and Systems, Yata 1111, Mishima, Shizuoka 411-8540, Japan (contact by e-mail). Taxonomy covered: Trypanosoma (Taxonomy ID 5690) and Homo sapiens (Taxonomy ID 9606). External links: PDB (Protein Data Bank), KEGG PATHWAY Database, DrugPort. An entry list and query search are available as web services

  6. HATCHES - a thermodynamic database and management system

    International Nuclear Information System (INIS)

    Cross, J.E.; Ewart, F.T.

    1990-03-01

    The Nirex Safety Assessment Research Programme has been compiling the thermodynamic data necessary to allow simulations of the aqueous behaviour of the elements important to radioactive waste disposal to be made. These data have been obtained from the literature, when available, and validated for the conditions of interest by experiment. In order to maintain these data in an accessible form and to satisfy quality assurance on all data used for assessments, a database has been constructed which resides on a personal computer operating under MS-DOS using the Ashton-Tate dBase III program. This database contains all the input data fields required by the PHREEQE program and, in addition, a body of text which describes the source of the data and the derivation of the PHREEQE input parameters from the source data. The HATCHES system consists of this database, a suite of programs to facilitate the searching and listing of data and a further suite of programs to convert the dBase III files to PHREEQE database format. (Author)

  7. Construction of database server system for fuel thermo-physical properties

    International Nuclear Information System (INIS)

    Park, Chang Je; Kang, Kwon Ho; Song, Kee Chan

    2003-12-01

    To evaluate the various fuels used in nuclear reactors, thermo-physical properties as well as mechanical properties are required as some of the most important inputs to a fuel performance code system. The main objective of this study is to build a database system for fuel thermo-physical properties; a PC-based hardware system has been constructed for easy public use, with visualization through a web-based server system. This report deals with the hardware and software used in the database server system for nuclear fuel thermo-physical properties. Opening the database of fuel properties to the public is expected to make nuclear fuel data much easier to obtain and to be helpful to research and development on the various fuels of the nuclear industry. Furthermore, the proposed models of nuclear fuel thermo-physical properties can be fully utilized in fuel performance code systems

  8. Data collection for improved follow-up of operating experiences. SKI damage database. Contents and aims with database

    International Nuclear Information System (INIS)

    Gott, Karen

    1997-01-01

    The Stryk database is presented and discussed in conjunction with the Swedish regulations concerning structural components in nuclear installations. The database acts as a reference library for reported cracks and degradation and can be used to retrieve information about individual events or for compiling statistics and performing trend analyses

  9. INTEGRATED HSEQ MANAGEMENT SYSTEMS: DEVELOPMENTS AND TRENDS

    Directory of Open Access Journals (Sweden)

    Osmo Kauppila

    2015-06-01

    Full Text Available The integration of health and safety, environmental and quality (HSEQ) management systems has become a current topic in the 21st century, as the need for systems thinking has grown along with the number of management system standards. This study aims to map current developments and trends in integrated HSEQ management. Three viewpoints are taken: the current state of the main HSEQ management standards, research literature on integrated management systems (IMS), and a case study of an industry-led HSEQ cluster in Northern Finland. The results demonstrate that some of the most prominent current trends are the harmonization of the high level structure of management systems by ISO, the evaluation of IMS, accounting for the supply chain in HSEQ issues, and sustainability and risk management. The results of the study can be used by practitioners to get a view of the current state of HSEQ management systems and their integration, and by researchers to seek out potential directions for HSEQ and IMS related research.

  10. A human friendly reporting and database system for brain PET analysis

    International Nuclear Information System (INIS)

    Jamzad, M.; Ishii, Kenji; Toyama, Hinako; Senda, Michio

    1996-01-01

    We have developed a human friendly reporting and database system for clinical brain PET (Positron Emission Tomography) scans, which enables statistical data analysis on qualitative information obtained from image interpretation. Our system consists of a Brain PET Data (Input) Tool and Report Writing Tool. In the Brain PET Data Tool, findings and interpretations are input by selecting menu icons in a window panel instead of writing a free text. This method of input enables on-line data entry into and update of the database by means of pre-defined consistent words, which facilitates statistical data analysis. The Report Writing Tool generates a one page report of natural English sentences semi-automatically by using the above input information and the patient information obtained from our PET center's main database. It also has a keyword selection function from the report text so that we can save a set of keywords on the database for further analysis. By means of this system, we can store the data related to patient information and visual interpretation of the PET examination while writing clinical reports in daily work. The database files in our system can be accessed by means of commercially available databases. We have used the 4th Dimension database that runs on a Macintosh computer and analyzed 95 cases of 18 F-FDG brain PET studies. The results showed high specificity of parietal hypometabolism for Alzheimer's patients. (author)

  11. Online-Expert: An Expert System for Online Database Selection.

    Science.gov (United States)

    Zahir, Sajjad; Chang, Chew Lik

    1992-01-01

    Describes the design and development of a prototype expert system called ONLINE-EXPERT that helps users select online databases and vendors that meet users' needs. Search strategies are discussed; knowledge acquisition and knowledge bases are described; and the Analytic Hierarchy Process (AHP), a decision analysis technique that ranks databases,…

  12. NoSQL databases

    OpenAIRE

    Mrozek, Jakub

    2012-01-01

    This thesis deals with database systems referred to as NoSQL databases. In the second chapter, I explain basic terms and the theory of database systems. A short explanation is dedicated to database systems based on the relational data model and the SQL standardized query language. Chapter Three explains the concept and history of the NoSQL databases, and also presents database models, major features and the use of NoSQL databases in comparison with traditional database systems. In the fourth ...

  13. Environmental Scanning in Educational Planning: Establishing a Strategic Trend Information System.

    Science.gov (United States)

    Morrison, James L.

    The systematic evaluation of the macroenvironment is sometimes referred to as a strategic trend information system. Strategic trend intelligence systems are highly developed, systematic intelligence programs that focus on trends and events in the external environment and provide institutions with knowledge to reduce areas of uncertainty and with…

  14. Trends in Wind Turbine Generator Systems

    DEFF Research Database (Denmark)

    Polinder, Henk; Ferreira, Jan Abraham; Jensen, Bogi Bech

    2013-01-01

    This paper reviews the trends in wind turbine generator systems. After discussing some important requirements and basic relations, it describes the currently used systems: the constant speed system with squirrel-cage induction generator, and the three variable speed systems with doubly fed...... induction generator (DFIG), with gearbox and fully rated converter, and direct drive (DD). Then, possible future generator systems are reviewed. Hydraulic transmissions are significantly lighter than gearboxes and enable continuously variable transmission, but their efficiency is lower. A brushless DFIG...

  15. DAD - Distributed Adamo Database system at Hermes

    International Nuclear Information System (INIS)

    Wander, W.; Dueren, M.; Ferstl, M.; Green, P.; Potterveld, D.; Welch, P.

    1996-01-01

    Software development for the HERMES experiment faces the challenges of many other experiments in modern High Energy Physics: complex data structures and relationships have to be processed at high I/O rates. Experimental control and data analysis are done in a distributed environment of CPUs with various operating systems and require access to different time-dependent databases such as calibration and geometry. Slow control and experiment control need flexible inter-process communication. Program development is done in different programming languages, where interfaces to the libraries should not restrict the capabilities of the language. The need to handle complex data structures is fulfilled by the ADAMO entity-relationship model. Mixed-language programming can be provided using the CFORTRAN package. DAD, the Distributed ADAMO Database library, was developed to provide the required I/O and database functionality. (author)

  16. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor design technology development using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results produced during phase II of the mid- and long-term nuclear R and D programme for Liquid Metal Reactor design technology development. IOC is a linkage control system between sub-projects, used to share and integrate the research results for KALIMER. The 3D CAD database gives a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents was developed to manage the data and documents collected since the project was accomplished. This report describes the features of the hardware and software and the database design methodology for KALIMER

  17. Data Mining on Distributed Medical Databases: Recent Trends and Future Directions

    Science.gov (United States)

    Atilgan, Yasemin; Dogan, Firat

    As computerization in healthcare services increase, the amount of available digital data is growing at an unprecedented rate and as a result healthcare organizations are much more able to store data than to extract knowledge from it. Today the major challenge is to transform these data into useful information and knowledge. It is important for healthcare organizations to use stored data to improve quality while reducing cost. This paper first investigates the data mining applications on centralized medical databases, and how they are used for diagnostic and population health, then introduces distributed databases. The integration needs and issues of distributed medical databases are described. Finally the paper focuses on data mining studies on distributed medical databases.

  18. New trends on mobile learning area: The review of published articles on mobile learning in science direct database

    Directory of Open Access Journals (Sweden)

    Emrah Soykan

    2015-03-01

    Full Text Available Articles published in Science Direct between 2009 and 2014 (May) were screened in this research, because of the database's respected position in the field of technology and its peer-reviewed structure. Of the 161 articles returned by the screening, 156 were included in the study. Through this research, the new trends in mobile learning activities in recent years are determined, pointing a way forward for researchers. The keyword "mobile learning" was used during the search, and all articles with that keyword were included in the study. The research shows that most studies in the field of mobile learning were published in 2013, in Malaysia, the UK and Taiwan, and that undergraduate students were most often selected as the sample group. Experimental research was the most frequently used research model, and quantitative data collection tools were the most common means of data collection. Foreign language education emerged as the most widely studied field of mobile learning, smart phones were the most commonly used mobile learning devices, and IOS was the most commonly used operating system.

  19. A user's manual for the database management system of impact property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kong, W. S.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the impact database system, which manages impact property test data. A database built from impact test data increases the usefulness of the test results: basic data can easily be retrieved when a new experiment is prepared, and better results can be produced by comparison with previous data. Developing such a database requires careful analysis and design of the application; only then can the varied requirements of its users be met with good quality. The impact database system was developed as an internet application using the JSP (Java Server Pages) tool

  20. Designing the database for a reliability aware Model-Based System Engineering process

    International Nuclear Information System (INIS)

    Cressent, Robin; David, Pierre; Idasiak, Vincent; Kratz, Frederic

    2013-01-01

    This article outlines the need for a reliability database to implement model-based description of components failure modes and dysfunctional behaviors. We detail the requirements such a database should honor and describe our own solution: the Dysfunctional Behavior Database (DBD). Through the description of its meta-model, the benefits of integrating the DBD in the system design process is highlighted. The main advantages depicted are the possibility to manage feedback knowledge at various granularity and semantic levels and to ease drastically the interactions between system engineering activities and reliability studies. The compliance of the DBD with other reliability database such as FIDES is presented and illustrated. - Highlights: ► Model-Based System Engineering is more and more used in the industry. ► It results in a need for a reliability database able to deal with model-based description of dysfunctional behavior. ► The Dysfunctional Behavior Database aims to fulfill that need. ► It helps dealing with feedback management thanks to its structured meta-model. ► The DBD can profit from other reliability database such as FIDES.

  1. Distributed Access View Integrated Database (DAVID) system

    Science.gov (United States)

    Jacobs, Barry E.

    1991-01-01

    The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project with the Small Business Innovative Research (SBIR) program is explained.

  2. Asynchronous data change notification between database server and accelerator control systems

    International Nuclear Information System (INIS)

    Wenge Fu; Seth Nemesure; Morris, J.

    2012-01-01

    Database data change notification (DCN) is a commonly used feature: it allows a client to be informed when data has been changed on the server side by another client. Not all database management systems (DBMS) provide an explicit DCN mechanism, and even for those which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well-established client/server software architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. The method works for any DBMS that provides database trigger functionality. (authors)
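
    The trigger-plus-reflection-server pattern can be sketched generically: a trigger appends each change to a notification table, and a small server polls that table and pushes new rows to subscribed clients through a callback. The sketch below uses SQLite and in-process callbacks purely for illustration; the actual ADCN setup targets control-system servers such as CDEV, EPICS or ADO:

      import sqlite3
      import threading
      import time

      con = sqlite3.connect(":memory:", check_same_thread=False)
      con.execute("CREATE TABLE settings (name TEXT PRIMARY KEY, value REAL)")
      con.execute("CREATE TABLE change_log (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT, value REAL)")
      # Trigger: every data change is mirrored into the notification table.
      con.execute("""CREATE TRIGGER notify_update AFTER UPDATE ON settings
                     BEGIN INSERT INTO change_log (name, value) VALUES (NEW.name, NEW.value); END""")
      con.execute("INSERT INTO settings VALUES ('magnet_current', 1.0)")
      con.commit()

      subscribers = [lambda name, value: print(f"client callback: {name} -> {value}")]

      def reflection_server(stop, poll_s=0.1):
          # Poll the change log and push any new rows to subscribed clients.
          last_seen = 0
          while not stop.is_set():
              for row_id, name, value in con.execute(
                      "SELECT id, name, value FROM change_log WHERE id > ?", (last_seen,)):
                  last_seen = row_id
                  for callback in subscribers:
                      callback(name, value)
              time.sleep(poll_s)

      stop = threading.Event()
      threading.Thread(target=reflection_server, args=(stop,), daemon=True).start()
      con.execute("UPDATE settings SET value = 2.5 WHERE name = 'magnet_current'")  # fires the trigger
      con.commit()
      time.sleep(0.3)
      stop.set()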

  3. Refactoring databases evolutionary database design

    CERN Document Server

    Ambler, Scott W

    2006-01-01

    Refactoring has proven its value in a wide range of development projects–helping software professionals improve system designs, maintainability, extensibility, and performance. Now, for the first time, leading agile methodologist Scott Ambler and renowned consultant Pramodkumar Sadalage introduce powerful refactoring techniques specifically designed for database systems. Ambler and Sadalage demonstrate how small changes to table structures, data, stored procedures, and triggers can significantly enhance virtually any database design–without changing semantics. You’ll learn how to evolve database schemas in step with source code–and become far more effective in projects relying on iterative, agile methodologies. This comprehensive guide and reference helps you overcome the practical obstacles to refactoring real-world databases by covering every fundamental concept underlying database refactoring. Using start-to-finish examples, the authors walk you through refactoring simple standalone databas...

  4. Development of a Multidisciplinary and Telemedicine Focused System Database.

    Science.gov (United States)

    Paštěka, Richard; Forjan, Mathias; Sauermann, Stefan

    2017-01-01

    Tele-rehabilitation at home is one of the promising approaches in increasing rehabilitative success and simultaneously decreasing the financial burden on the healthcare system. Novel and mostly mobile devices are already in use, but shall be used in the future to a higher extent for allowing at home rehabilitation processes at a high quality level. The combination of exercises, assessments and available equipment is the basic objective of the presented database. The database has been structured in order to allow easy-to-use and fast access for the three main user groups. Therapists - looking for exercise and equipment combinations - patients - rechecking their tasks for home exercises - and manufacturers - entering their equipment for specific use cases. The database has been evaluated by a proof of concept study and shows a high degree of applicability for the field of rehabilitative medicine. Currently it contains 110 exercises/assessments and 111 equipment/systems. Foundations of presented database are already established in the rehabilitative field of application, but can and will be enhanced in its functionality to be usable for a higher variety of medical fields and specifications.

  5. The Eruption Forecasting Information System (EFIS) database project

    Science.gov (United States)

    Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather

    2016-04-01

    The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) Move away from relying on the collective memory to probability estimation using databases (2) Create databases useful for pattern recognition and for answering common VDAP questions; e.g. how commonly does unrest lead to eruption? how commonly do phreatic eruptions portend magmatic eruptions and what is the range of antecedence times? (3) Create generic probabilistic event trees using global data for different volcano 'types' (4) Create background, volcano-specific, probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis (5) Quantify and communicate uncertainty in probabilities A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc. and will be initially populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.

  6. Trend Monitoring System (TMS) graphics software

    Science.gov (United States)

    Brown, J. S.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) and to evaluate the bus concept, is considered. A set of FORTRAN-callable graphics subroutines for the host MODCOMP computer, and an approach to splitting graphics work between the host and the system's intelligent graphics terminals, are described. The graphics software in the MODCOMP and the operating software package written for the graphics terminals are included.

  7. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    Science.gov (United States)

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.

  8. Operational experience running the HERA-B database system

    International Nuclear Information System (INIS)

    Amaral, V.; Amorim, A.; Batista, J.

    2001-01-01

    The HERA-B database system has been used in the commissioning period of the experiment. The authors present the expertise gathered during this period, covering the improvements introduced and describing the different classes of problems faced in giving persistency to all non-event information. The aim is to give a global overview of the database group's activities, the techniques developed, and results based on the running experiment, dealing with large data volumes during and after the production phase

  9. Virus Database and Online Inquiry System Based on Natural Vectors.

    Science.gov (United States)

    Dong, Rui; Zheng, Hui; Tian, Kun; Yau, Shek-Chung; Mao, Weiguang; Yu, Wenping; Yin, Changchuan; Yu, Chenglong; He, Rong Lucy; Yang, Jie; Yau, Stephen St

    2017-01-01

    We construct a virus database called VirusDB (http://yaulab.math.tsinghua.edu.cn/VirusDB/) and an online inquiry system to serve people who are interested in viral classification and prediction. The database stores all viral genomes, their corresponding natural vectors, and the classification information of the single/multiple-segmented viral reference sequences downloaded from the National Center for Biotechnology Information. The online inquiry system serves the purpose of computing natural vectors and their distances based on submitted genomes, providing an online interface for accessing and using the database for viral classification and prediction, and back-end processes for automatic and manual updating of database content to synchronize with GenBank. Submitted genome data in FASTA format are processed, and the prediction results, with the 5 closest neighbors and their classifications, are returned by email. Considering the one-to-one correspondence between sequence and natural vector, time efficiency, and high accuracy, the natural vector is a significant advance compared with alignment methods, which makes VirusDB a useful database in further research.
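
    For readers unfamiliar with natural vectors, the sketch below computes one common 12-dimensional formulation for a DNA sequence (per-nucleotide counts, mean positions and normalized second central moments) and a Euclidean distance between two vectors. The exact definition used by VirusDB may differ; this is only an illustration:

      import math

      def natural_vector(seq):
          # One common 12-dimensional natural vector for a DNA sequence:
          # for each nucleotide, its count, the mean of its positions, and a
          # normalized second central moment of those positions.
          seq = seq.upper()
          length = len(seq)
          vec = []
          for base in "ACGT":
              positions = [i + 1 for i, ch in enumerate(seq) if ch == base]
              n = len(positions)
              mu = sum(positions) / n if n else 0.0
              d2 = sum((p - mu) ** 2 for p in positions) / (n * length) if n else 0.0
              vec.extend([n, mu, d2])
          return vec

      def euclidean(u, v):
          return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

      v1 = natural_vector("ATGCGTACGTTAGC")
      v2 = natural_vector("ATGCGTACGTTAGA")
      print(euclidean(v1, v2))  # small distance for these near-identical toy sequences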

  10. A database application for wilderness character monitoring

    Science.gov (United States)

    Ashley Adams; Peter Landres; Simon Kingston

    2012-01-01

    The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...

  11. Databases and information systems: Applications in biogeography

    International Nuclear Information System (INIS)

    Escalante E, Tania; Llorente B, Jorge; Espinoza O, David N; Soberon M, Jorge

    2000-01-01

    Some aspects of the new instrumentalization and methodological elements that make up information systems in biodiversity (ISB) are described. The use of accurate geographically referenced data allows a broad range of available sources: natural history collections and scientific literature require the use of databases and geographic information systems (GIS). The conceptualization of ISB and GIS, based in the use of extensive data bases, has implied detailed modeling and the construction of authoritative archives: exhaustive catalogues of nomenclature and synonymies, complete bibliographic lists, list of names proposed, historical-geographic gazetteers with localities and their synonyms united under a global positioning system which produces a geospheric conception of the earth and its biota. Certain difficulties in the development of the system and the construction of the biological databases are explained: quality control of data, for example. The use of such systems is basic in order to respond to many questions at the frontier of current studies of biodiversity and conservation. In particular, some applications in biogeography and their importance for modeling distributions, to identify and contrast areas of endemism and biological richness for conservation, and their use as tools in what we identify as predictive and experimental faunistics are detailed. Lastly, the process as well as its relevance is emphasized at national and regional levels

  12. Establishment of database system for management of KAERI wastes

    International Nuclear Information System (INIS)

    Shon, J. S.; Kim, K. J.; Ahn, S. J.

    2004-07-01

    Radioactive wastes generated by KAERI have various types, nuclides and characteristics. Managing and controlling these kinds of radioactive wastes requires systematic record management, efficient searching and quick statistics. Information about the radioactive waste generated and stored by KAERI is also the basic input for the national cooperative radioactive waste management information system. In this study, the Radioactive Waste Management Integration System (RAWMIS) was developed. It is aimed at managing records of radioactive wastes, improving the efficiency of management, and supporting WACID (Waste Comprehensive Integration Database System), the national radioactive waste integrated safety management system of Korea. The major information of RAWMIS, derived from user requirements, covers generation, gathering, transfer, treatment and storage information for solid waste, liquid waste, gas waste and waste related to spent fuel. RAWMIS is composed of a database, software (the interface between user and database) and software for a manager, and it was designed with a client/server structure. RAWMIS will be a useful tool for analyzing radioactive waste management and radiation safety management. The system is also developed to share information with associated companies and can be expected to support research and development on radioactive waste treatment

  13. A data analysis expert system for large established distributed databases

    Science.gov (United States)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural easy-to-use error-free database query language; user ability to alter query language vocabulary and data analysis heuristic; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  14. Interconnecting heterogeneous database management systems

    Science.gov (United States)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  15. Improving Timeliness in Real-Time Secure Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H; David, Rasikan; Thuraisingham, Bhavani

    2006-01-01

    .... In addition to real-time requirements, security is usually required in many applications. Multilevel security requirements introduce a new dimension to transaction processing in real-time database systems...

  16. ARACHNID: A prototype object-oriented database tool for distributed systems

    Science.gov (United States)

    Younger, Herbert; Oreilly, John; Frogner, Bjorn

    1994-01-01

    This paper discusses the results of a Phase 2 SBIR project sponsored by NASA and performed by MIMD Systems, Inc. A major objective of this project was to develop specific concepts for improved performance in accessing large databases. An object-oriented and distributed approach was used for the general design, while a geographical decomposition was used as a specific solution. The resulting software framework is called ARACHNID. The Faint Source Catalog developed by NASA was the initial database testbed. This is a database of many giga-bytes, where an order of magnitude improvement in query speed is being sought. This database contains faint infrared point sources obtained from telescope measurements of the sky. A geographical decomposition of this database is an attractive approach to dividing it into pieces. Each piece can then be searched on individual processors with only a weak data linkage between the processors being required. As a further demonstration of the concepts implemented in ARACHNID, a tourist information system is discussed. This version of ARACHNID is the commercial result of the project. It is a distributed, networked, database application where speed, maintenance, and reliability are important considerations. This paper focuses on the design concepts and technologies that form the basis for ARACHNID.

  17. Switching the Fermilab Accelerator Control System to a relational database

    International Nuclear Information System (INIS)

    Shtirbu, S.

    1993-01-01

    The accelerator control system ('ACNET') at Fermilab uses a made-in-house, Assembly-language database. The database holds device information, which is mostly used for finding out how to read/set devices and how to interpret alarms. This is a very efficient implementation, but it lacks the needed flexibility and forces applications to store data in private/shared files. This database is being replaced by an off-the-shelf relational database (Sybase). The major constraints on switching are the necessity to maintain/improve response time and to minimize changes to existing applications. Innovative methods are used to help achieve the required performance, and a layer-seven gateway simulates the old database for existing programs. The new database runs on a DEC ALPHA/VMS platform and provides better performance. The switch is also exposing problems with the data currently stored in the database and is helping in cleaning up erroneous data. The flexibility of the new relational database is going to facilitate many new applications in the future (e.g. a 3D presentation of device location). The new database is expected to fully replace the old database during this summer's shutdown

  18. Diffusivity database (DDB) system for major rocks (Version of 2006/specification and CD-ROM)

    International Nuclear Information System (INIS)

    Tochigi, Yoshikatsu; Sasamoto, Hirosi; Shibata, Masahiro; Sato, Haruo; Yui, Mikazu

    2006-03-01

    The development of a database system has been started to manage the commonly used diffusivity data for major rocks. The database system has been constructed based on datasheets of the effective diffusion coefficients of nuclides in the rock matrix, so that it can be applied to the 'H12: Project to Establish the Scientific and Technical Basis for HLW Disposal in Japan'. This document describes the examination and expansion of the datasheet structure, the process of constructing the database system, and the conversion of all data existing on datasheets. As the first step of the development, this database system and its data will continue to be updated and the interface will be revised to improve availability. The developed database system is provided on the attached CD-ROM in Microsoft Access file format. (author)

  19. Data-based control tuning in master-slave systems

    NARCIS (Netherlands)

    Heertjes, M.F.; Temizer, B.

    2012-01-01

    For improved output synchronization in master-slave systems, a data-based control tuning is presented. Herein the coefficients of two finite-duration impulse response (FIR) filters are found through machine-in-the-loop optimization. One filter is used to shape the input to the slave system while the

  20. Electromagnetic Systems Effects Database (EMSED). AERO 90, Phase II User's Manual

    National Research Council Canada - National Science Library

    Sawires, Kalim

    1998-01-01

    The Electromagnetic Systems Effects Database (EMSED), also called AIRBASE, is a training guide for users not familiar with the AIRBASE database and its operating platform, the Macintosh computer (Mac...

  1. Future trends in commercial and military systems

    Science.gov (United States)

    Bond, F. E.

    Commercial and military satellite communication systems are addressed, with a review of current applications and typical communication characteristics of the space and earth segments. Drivers for the development of future commercial systems include: the pervasion of digital techniques and services, growing orbit and frequency congestion, demand for more entertainment, and the large potential market for commercial 'roof-top' service. For military systems, survivability, improved flexibility, and the need for service to small mobile terminals are the principal factors involved. Technical trends include the use of higher frequency bands, multibeam antennas and a significant increase in the application of onboard processing. Military systems will employ a variety of techniques to counter both physical and electronic threats. The use of redundant transmission paths is a particularly effective approach. Successful implementation requires transmission standards to achieve the required interoperability among the pertinent networks. For both the military and commercial sectors, the trend toward larger numbers of terminals and more complex spacecraft persists.

  2. Coordinate Systems Integration for Craniofacial Database from Multimodal Devices

    Directory of Open Access Journals (Sweden)

    Deni Suwardhi

    2005-05-01

    This study presents a data registration method for craniofacial spatial data of different modalities. The data consist of three-dimensional (3D) vector and raster data models and are stored in an object-relational database. The data capture devices are a laser scanner, CT (Computed Tomography) scanning and CR (Close Range Photogrammetry). The objective of the registration is to transform the data from the various coordinate systems into a single 3D Cartesian coordinate system. The standard error of the registration obtained from the multimodal imaging devices using a 3D affine transformation is in the range of 1-2 mm. This study is a step forward for storing the craniofacial spatial data in one reference system in the database.
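    As a rough illustration of the registration step, the sketch below fits a 3D affine transformation between two point sets by linear least squares. The landmark coordinates are synthetic and the snippet does not reproduce the paper's craniofacial workflow; it only shows the kind of fit whose residuals give a standard error like the quoted 1-2 mm.

```python
import numpy as np


def fit_affine_3d(src, dst):
    """Least-squares 3D affine fit so that dst is approximately src @ A.T + t."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Augment source points with a column of ones so the translation
    # is estimated together with the 3x3 linear part.
    ones = np.ones((src.shape[0], 1))
    X = np.hstack([src, ones])                        # shape (n, 4)
    params, *_ = np.linalg.lstsq(X, dst, rcond=None)  # shape (4, 3)
    A, t = params[:3].T, params[3]
    return A, t


# Hypothetical homologous landmarks from two devices (e.g. laser scan vs. CT).
src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10], [10, 10, 10]])
dst = src * 1.02 + np.array([1.5, -0.7, 2.0])  # synthetic "other" coordinate system

A, t = fit_affine_3d(src, dst)
residuals = src @ A.T + t - dst
print("RMS registration error [mm]:", np.sqrt((residuals ** 2).mean()))
```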

  3. Comparison of open source database systems(characteristics, limits of usage)

    OpenAIRE

    Husárik, Braňko

    2008-01-01

    The goal of this work is to compare selected open source database systems (Ingres, PostgreSQL, Firebird, MySQL). The first part of the work focuses on the history and present situation of the companies developing these products. The second part contains a comparison of a certain group of specific features and limits. A benchmark of selected operations forms its own part. The possibilities of using the mentioned database systems are summarized at the end of the work.

  4. Integrated spent nuclear fuel database system

    International Nuclear Information System (INIS)

    Henline, S.P.; Klingler, K.G.; Schierman, B.H.

    1994-01-01

    The Distributed Information Systems software Unit at the Idaho National Engineering Laboratory has designed and developed an Integrated Spent Nuclear Fuel Database System (ISNFDS), which maintains a computerized inventory of all US Department of Energy (DOE) spent nuclear fuel (SNF). Commercial SNF is not included in the ISNFDS unless it is owned or stored by DOE. The ISNFDS is an integrated, single data source containing accurate, traceable, and consistent data and provides extensive data for each fuel, extensive facility data for every facility, and numerous data reports and queries

  5. 16th East-European Conference on Advances in Databases and Information Systems (ADBIS 2012)

    CERN Document Server

    Härder, Theo; Wrembel, Robert; Advances in Databases and Information Systems

    2013-01-01

    This volume is the second one of the 16th East-European Conference on Advances in Databases and Information Systems (ADBIS 2012), held on September 18-21, 2012, in Poznań, Poland. The first one has been published in the LNCS series.   This volume includes 27 research contributions, selected out of 90. The contributions cover a wide spectrum of topics in the database and information systems field, including: database foundation and theory, data modeling and database design, business process modeling, query optimization in relational and object databases, materialized view selection algorithms, index data structures, distributed systems, system and data integration, semi-structured data and databases, semantic data management, information retrieval, data mining techniques, data stream processing, trust and reputation in the Internet, and social networks. Thus, the content of this volume covers the research areas from fundamentals of databases, through still hot topic research problems (e.g., data mining, XML ...

  6. Database for fusion devices and associated fuel systems

    International Nuclear Information System (INIS)

    Woolgar, P.W.

    1983-03-01

    A computerized database storage and retrieval system has been set up for fusion devices and the associated fusion fuel systems which should be a useful tool for the CFFTP program and other users. The features of the Wang 'Alliance' system are discussed for this application, as well as some of the limitations of the system. Recommendations are made on the operation, upkeep and further development that should take place to implement and maintain the system

  7. A role for relational databases in high energy physics software systems

    International Nuclear Information System (INIS)

    Lauer, R.; Slaughter, A.J.; Wolin, E.

    1987-01-01

    This paper presents the design and initial implementation of software which uses a relational database management system for storage and retrieval of real and Monte Carlo generated events from a charm and beauty spectrometer with a vertex detector. The purpose of the software is to graphically display and interactively manipulate the events, fit tracks and vertices and calculate physics quantities. The INGRES database forms the core of the system, while the DI3000 graphics package is used to plot the events. The paper introduces relational database concepts and their applicability to high energy physics data. It also evaluates the environment provided by INGRES, particularly its usefulness in code development and its Fortran interface. Specifics of the database design we have chosen are detailed as well. (orig.)

  8. BDVC (Bimodal Database of Violent Content): A database of violent audio and video

    Science.gov (United States)

    Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro

    2017-09-01

    Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization and retrieval applications of a single type of content like text, voice and images; bimodal databases, in contrast, allow two different types of content, such as audio-video or image-text, to be associated semantically. The generation of a bimodal audio-video database implies the creation of a connection between the multimedia content through the semantic relation that associates the actions of both types of information. This paper describes in detail the characteristics and methodology used for the creation of the bimodal database of violent content; the semantic relationship is established by the proposed concepts that describe the audiovisual information. The use of bimodal databases in applications related to audiovisual content processing allows an increase in semantic performance if and only if these applications process both types of content. This bimodal database contains 580 annotated audiovisual segments, with a duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool in the generation of applications for the semantic web.

  9. An Expert System Helps Students Learn Database Design

    Science.gov (United States)

    Post, Gerald V.; Whisenand, Thomas G.

    2005-01-01

    Teaching and learning database design is difficult for both instructors and students. Students need to solve many problems with feedback and corrections. A Web-based specialized expert system was created to enable students to create designs online and receive immediate feedback. An experiment testing the system shows that it significantly enhances…

  10. The PREDICTS database: a global database of how local terrestrial biodiversity responds to human impacts

    Science.gov (United States)

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Senior, Rebecca A; Bennett, Dominic J; Booth, Hollie; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; White, Hannah J; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Ancrenaz, Marc; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Báldi, András; Banks, John E; Barlow, Jos; Batáry, Péter; Bates, Adam J; Bayne, Erin M; Beja, Pedro; Berg, Åke; Berry, Nicholas J; Bicknell, Jake E; Bihn, Jochen H; Böhning-Gaese, Katrin; Boekhout, Teun; Boutin, Céline; Bouyer, Jérémy; Brearley, Francis Q; Brito, Isabel; Brunet, Jörg; Buczkowski, Grzegorz; Buscardo, Erika; Cabra-García, Jimmy; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Carrijo, Tiago F; Carvalho, Anelena L; Castro, Helena; Castro-Luna, Alejandro A; Cerda, Rolando; Cerezo, Alexis; Chauvat, Matthieu; Clarke, Frank M; Cleary, Daniel F R; Connop, Stuart P; D'Aniello, Biagio; da Silva, Pedro Giovâni; Darvill, Ben; Dauber, Jens; Dejean, Alain; Diekötter, Tim; Dominguez-Haydar, Yamileth; Dormann, Carsten F; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Elek, Zoltán; Entling, Martin H; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Ficetola, Gentile F; Filgueiras, Bruno K C; Fonte, Steven J; Fraser, Lauchlan H; Fukuda, Daisuke; Furlani, Dario; Ganzhorn, Jörg U; Garden, Jenni G; Gheler-Costa, Carla; Giordani, Paolo; Giordano, Simonetta; Gottschalk, Marco S; Goulson, Dave; Gove, Aaron D; Grogan, James; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hawes, Joseph E; Hébert, Christian; Helden, Alvin J; Henden, John-André; Hernández, Lionel; Herzog, Felix; Higuera-Diaz, Diego; Hilje, Branko; Horgan, Finbarr G; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Jonsell, Mats; Jung, Thomas S; Kapoor, Vena; Kati, Vassiliki; Katovai, Eric; Kessler, Michael; Knop, Eva; Kolb, Annette; Kőrösi, Ádám; Lachat, Thibault; Lantschner, Victoria; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Letcher, Susan G; Littlewood, Nick A; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Marin-Spiotta, Erika; Marshall, E J P; Martínez, Eliana; Mayfield, Margaret M; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Naidoo, Robin; Nakamura, Akihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario A; Neuschulz, Eike L; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Nöske, Nicole M; O'Dea, Niall; Oduro, William; Ofori-Boateng, Caleb; Oke, Chris O; Osgathorpe, Lynne M; Paritsis, Juan; Parra-H, Alejandro; Pelegrin, Nicolás; Peres, Carlos A; Persson, Anna S; Petanidou, Theodora; Phalan, Ben; Philips, T Keith; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Ribeiro, Danilo B; Richardson, Barbara A; Richardson, Michael J; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rosselli, Loreta; Rossiter, Stephen J; Roulston, T'ai H; Rousseau, Laurent; 
Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Samnegård, Ulrika; Schüepp, Christof; Schweiger, Oliver; Sedlock, Jodi L; Shahabuddin, Ghazala; Sheil, Douglas; Silva, Fernando A B; Slade, Eleanor M; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Stout, Jane C; Struebig, Matthew J; Sung, Yik-Hei; Threlfall, Caragh G; Tonietto, Rebecca; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Vanbergen, Adam J; Vassilev, Kiril; Verboven, Hans A F; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Walker, Tony R; Wang, Yanping; Watling, James I; Wells, Konstans; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Woodcock, Ben A; Yu, Douglas W; Zaitsev, Andrey S; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    2014-01-01

    Biodiversity continues to decline in the face of increasing anthropogenic pressures such as habitat destruction, exploitation, pollution and introduction of alien species. Existing global databases of species’ threat status or population time series are dominated by charismatic species. The collation of datasets with broad taxonomic and biogeographic extents, and that support computation of a range of biodiversity indicators, is necessary to enable better understanding of historical declines and to project – and avert – future declines. We describe and assess a new database of more than 1.6 million samples from 78 countries representing over 28,000 species, collated from existing spatial comparisons of local-scale biodiversity exposed to different intensities and types of anthropogenic pressures, from terrestrial sites around the world. The database contains measurements taken in 208 (of 814) ecoregions, 13 (of 14) biomes, 25 (of 35) biodiversity hotspots and 16 (of 17) megadiverse countries. The database contains more than 1% of the total number of all species described, and more than 1% of the described species within many taxonomic groups – including flowering plants, gymnosperms, birds, mammals, reptiles, amphibians, beetles, lepidopterans and hymenopterans. The dataset, which is still being added to, is therefore already considerably larger and more representative than those used by previous quantitative models of biodiversity trends and responses. The database is being assembled as part of the PREDICTS project (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems – http://www.predicts.org.uk). We make site-level summary data available alongside this article. The full database will be publicly available in 2015. PMID:25558364

  11. Development of subsurface drainage database system for use in environmental management issues

    International Nuclear Information System (INIS)

    Azhar, A.H.; Rafiq, M.; Alam, M.M.

    2007-01-01

    A simple, user-friendly, menu-driven system for database management pertinent to the Impact of Subsurface Drainage Systems on Land and Water Conditions (ISLaW) has been developed for use in environment-management issues of the drainage areas. This database has been developed by integrating four software packages, viz. Microsoft Excel, MS Word, Acrobat and MS Access. The information, in the form of tables and figures, with respect to various drainage projects has been presented in MS Word files. The major data-sets of the various subsurface drainage projects included in the ISLaW database are: i) technical aspects, ii) groundwater and soil-salinity aspects, iii) socio-technical aspects, iv) agro-economic aspects, and v) operation and maintenance aspects. The various ISLaW files can be accessed simply by clicking the menu buttons of the database system. This database not only gives feedback on the functioning of different subsurface drainage projects with respect to the above-mentioned aspects, but also serves as a resource document of these data for future studies on other drainage projects. The developed database system is useful for planners, designers and farmers' organisations for improved operation of existing drainage projects as well as development of future ones. (author)

  12. Large-scale Health Information Database and Privacy Protection*1

    OpenAIRE

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law...

  13. A Systematic Review of Coding Systems Used in Pharmacoepidemiology and Database Research.

    Science.gov (United States)

    Chen, Yong; Zivkovic, Marko; Wang, Tongtong; Su, Su; Lee, Jianyi; Bortnichak, Edward A

    2018-02-01

    Clinical coding systems have been developed to translate real-world healthcare information such as prescriptions, diagnoses and procedures into standardized codes appropriate for use in large healthcare datasets. Due to the lack of information on coding system characteristics and insufficient uniformity in coding practices, there is a growing need for better understanding of coding systems and their use in pharmacoepidemiology and observational real world data research. To determine: 1) the number of available coding systems and their characteristics, 2) which pharmacoepidemiology databases are they adopted in, 3) what outcomes and exposures can be identified from each coding system, and 4) how robust they are with respect to consistency and validity in pharmacoepidemiology and observational database studies. Electronic literature database and unpublished literature searches, as well as hand searching of relevant journals were conducted to identify eligible articles discussing characteristics and applications of coding systems in use and published in the English language between 1986 and 2016. Characteristics considered included type of information captured by codes, clinical setting(s) of use, adoption by a pharmacoepidemiology database, region, and available mappings. Applications articles describing the use and validity of specific codes, code lists, or algorithms were also included. Data extraction was performed independently by two reviewers and a narrative synthesis was performed. A total of 897 unique articles and 57 coding systems were identified, 17% of which included country-specific modifications or multiple versions. Procedures (55%), diagnoses (36%), drugs (38%), and site of disease (39%) were most commonly and directly captured by these coding systems. The systems were used to capture information from the following clinical settings: inpatient (63%), ambulatory (55%), emergency department (ED, 34%), and pharmacy (13%). More than half of all coding

  14. Development of a Relational Database for Learning Management Systems

    Science.gov (United States)

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-Based Distance Education Systems have a great importance. Web-based Distance Education Systems are usually known as Learning Management Systems (LMS). In this article, a database design, which was developed to create an educational institution as a Learning Management System, is described. In this sense, developed Learning…

  15. Relational databases

    CERN Document Server

    Bell, D A

    1986-01-01

    Relational Databases explores the major advances in relational databases and provides a balanced analysis of the state of the art in relational databases. Topics covered include capture and analysis of data placement requirements; distributed relational database systems; data dependency manipulation in database schemata; and relational database support for computer graphics and computer aided design. This book is divided into three sections and begins with an overview of the theory and practice of distributed systems, using the example of INGRES from Relational Technology as illustration. The

  16. Quality assurance database for the CBM silicon tracking system

    Energy Technology Data Exchange (ETDEWEB)

    Lymanets, Anton [Physikalisches Institut, Universitaet Tuebingen (Germany); Collaboration: CBM-Collaboration

    2015-07-01

    The Silicon Tracking System is the main tracking device of the CBM experiment at FAIR. Its construction includes the production, quality assurance and assembly of a large number of components, e.g., 106 carbon fiber support structures, 1300 silicon microstrip sensors, 16.6k readout chips, analog microcables, etc. Detector construction is distributed over several production and assembly sites and calls for a database that is extensible and allows tracing the components, integrating the test data, and monitoring the component statuses and data flow. A possible implementation of the above-mentioned requirements is being developed at GSI (Darmstadt) based on the FAIR DB Virtual Database Library, which provides connectivity to common SQL database engines (PostgreSQL, Oracle, etc.). The data structure, database architecture and status of implementation are discussed.

  17. Spent fuel composition database system on WWW. SFCOMPO on WWW Ver.2

    International Nuclear Information System (INIS)

    Mochizuki, Hiroki; Suyama, Kenya; Nomura, Yasushi; Okuno, Hiroshi

    2001-08-01

    'SFCOMPO on WWW Ver.2' is an advanced version of 'SFCOMPO on WWW' ('Spent Fuel Composition Database System on WWW') released in 1997. This new version has a function of database management by an introduced relational database software 'PostgreSQL' and has various searching methods. All of the data required for the calculation of isotopic composition is available from the web site of this system. This report describes the outline of this system and the searching method using Internet. In addition, the isotopic composition data and the reactor data of the 14 LWRs (7 PWR and 7 BWR) registered in this system are described. (author)
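    To illustrate the kind of relational search such a system enables, the following sketch builds a toy table of isotopic composition measurements and queries it with a parameterized SELECT. The schema and values are invented for the example, and SQLite is used so the snippet is self-contained (the actual system is reported to run on PostgreSQL).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE composition (
        reactor      TEXT,    -- e.g. a PWR or BWR name
        assembly     TEXT,
        nuclide      TEXT,    -- e.g. 'U-235', 'Pu-239'
        burnup_gwd_t REAL,    -- burnup in GWd/t
        value_mg_g   REAL     -- measured content, mg per g of fuel
    )
""")
rows = [
    ("PWR-A", "F01", "U-235", 30.5, 8.1),
    ("PWR-A", "F01", "Pu-239", 30.5, 5.6),
    ("BWR-B", "G12", "U-235", 22.0, 10.4),
]
conn.executemany("INSERT INTO composition VALUES (?, ?, ?, ?, ?)", rows)

# Parameterized search: all U-235 measurements above a burnup threshold.
cur = conn.execute(
    "SELECT reactor, assembly, burnup_gwd_t, value_mg_g "
    "FROM composition WHERE nuclide = ? AND burnup_gwd_t >= ? "
    "ORDER BY burnup_gwd_t",
    ("U-235", 25.0),
)
for row in cur:
    print(row)
```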

  18. Understanding, modeling, and improving main-memory database performance

    OpenAIRE

    Manegold, S.

    2002-01-01

    During the last two decades, computer hardware has experienced remarkable developments. Especially CPU (clock-)speed has been following Moore's Law, i.e., doubling every 18 months; and there is no indication that this trend will change in the foreseeable future. Recent research has revealed that database performance, even with main-memory based systems, can hardly benefit from the ever increasing CPU power. The reason for this is that the performance of other hardware components h...

  19. Integrated Controlling System and Unified Database for High Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yu.A.; Igarashi, N.; Hiraki, M.; Sasajima, K.; Matsugaki, N.; Suzuki, M.; Kosuge, T.; Wakatsuki, S.

    2004-01-01

    An integrated controlling system and a unified database for high throughput protein crystallography experiments have been developed. Main features of protein crystallography experiments (purification, crystallization, crystal harvesting, data collection, data processing) were integrated into the software under development. All information necessary to perform protein crystallography experiments is stored (except raw X-ray data that are stored in a central data server) in a MySQL relational database. The database contains four mutually linked hierarchical trees describing protein crystals, data collection of protein crystal and experimental data processing. A database editor was designed and developed. The editor supports basic database functions to view, create, modify and delete user records in the database. Two search engines were realized: direct search of necessary information in the database and object oriented search. The system is based on TCP/IP secure UNIX sockets with four predefined sending and receiving behaviors, which support communications between all connected servers and clients with remote control functions (creating and modifying data for experimental conditions, data acquisition, viewing experimental data, and performing data processing). Two secure login schemes were designed and developed: a direct method (using the developed Linux clients with secure connection) and an indirect method (using the secure SSL connection using secure X11 support from any operating system with X-terminal and SSH support). A part of the system has been implemented on a new MAD beam line, NW12, at the Photon Factory Advanced Ring for general user experiments
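    A minimal sketch of how mutually linked hierarchical records of this kind might be laid out relationally is shown below. The table and column names are hypothetical, and SQLite is used for self-containment (the system described above uses MySQL).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One hierarchy level per table, linked parent-to-child by foreign keys.
    CREATE TABLE crystal (
        crystal_id   INTEGER PRIMARY KEY,
        protein_name TEXT,
        space_group  TEXT
    );
    CREATE TABLE data_collection (
        collection_id INTEGER PRIMARY KEY,
        crystal_id    INTEGER REFERENCES crystal(crystal_id),
        beamline      TEXT,
        wavelength_a  REAL
    );
    CREATE TABLE processing_run (
        run_id        INTEGER PRIMARY KEY,
        collection_id INTEGER REFERENCES data_collection(collection_id),
        software      TEXT,
        resolution_a  REAL
    );
""")
conn.execute("INSERT INTO crystal VALUES (1, 'lysozyme', 'P43212')")
conn.execute("INSERT INTO data_collection VALUES (1, 1, 'NW12', 1.0)")
conn.execute("INSERT INTO processing_run VALUES (1, 1, 'hypothetical', 1.8)")

# Walk the hierarchy with joins: crystal -> data collection -> processing.
for row in conn.execute("""
        SELECT c.protein_name, d.beamline, p.resolution_a
        FROM crystal c
        JOIN data_collection d ON d.crystal_id = c.crystal_id
        JOIN processing_run p ON p.collection_id = d.collection_id"""):
    print(row)
```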

  20. Development of knowledge base system linked to material database

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Tsuji, Hirokazu; Mashiko, Shinichi; Miyakawa, Shunichi; Fujita, Mitsutane; Kinugawa, Junichi; Iwata, Shuichi

    2002-01-01

    The distributed material database system named 'Data-Free-Way' has been developed by four organizations (the National Institute for Materials Science, the Japan Atomic Energy Research Institute, the Japan Nuclear Cycle Development Institute, and the Japan Science and Technology Corporation) under a cooperative agreement in order to share fresh and stimulating information as well as accumulated information for the development of advanced nuclear materials, for the design of structural components, etc. In order to add value to the system, a knowledge base system, in which knowledge extracted from the material database is expressed, is planned to be developed for more effective utilization of Data-Free-Way. XML (eXtensible Markup Language) has been adopted as the method for describing the retrieved results and their meaning. One knowledge note described with XML is stored as one unit of knowledge composing the knowledge base. Since the knowledge note is described with XML, the user can easily convert the displayed tables and graphs into the data formats that the user usually works with. This paper describes the current status of Data-Free-Way, the method of describing knowledge extracted from the material database with XML, and the distributed material knowledge base system. (author)
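    The idea of storing one piece of extracted knowledge as an XML 'knowledge note' can be sketched as follows; the element names and the example content are invented for illustration and are not taken from Data-Free-Way itself.

```python
import xml.etree.ElementTree as ET

# Build one hypothetical knowledge note extracted from a material database.
note = ET.Element("knowledge_note", id="KN-0001")
ET.SubElement(note, "source_query").text = (
    "tensile strength of SUS316 vs. irradiation dose"
)
statement = ET.SubElement(note, "statement")
statement.text = "Yield strength increases with dose up to a saturation level."
data = ET.SubElement(note, "retrieved_data")
for dose, ys in [(0.0, 230.0), (1.0, 410.0), (5.0, 560.0)]:
    ET.SubElement(data, "point", dose_dpa=str(dose), yield_mpa=str(ys))

xml_text = ET.tostring(note, encoding="unicode")
print(xml_text)

# Because the note is plain XML, it can be re-parsed and converted into
# whatever table or graph format the user prefers.
parsed = ET.fromstring(xml_text)
rows = [(p.get("dose_dpa"), p.get("yield_mpa"))
        for p in parsed.find("retrieved_data")]
print(rows)
```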

  1. Database of Samples Irradiated at Reactor TRIGA PUSPATI (RTP)

    International Nuclear Information System (INIS)

    Muhd Husamuddin Abdul Khalil; Mohd Amin Sharifuldin Salleh; Julia Abdul Karim

    2011-01-01

    An evaluation has been made of the data on irradiated samples, by the type of sample requested for activation at RTP. The sample types are grouped, with the percentage of total throughputs used to work out the weight percent of each respective group. The database consists of a radionuclide inventory of short and long half-lives; high-activity radionuclides such as Br and Au have been identified, and the database has been constructed using user-friendly Microsoft Access. Through this, the trend of gamma exposure can easily be evaluated at the experimental facilities, helping to ensure that the radiological effects on safety and health stay within the limits of the Radiation Protection (Basic Safety Standard) Regulation 1988. This database provides an important means to improve the management system for acquiring information on the samples irradiated at RTP and will enhance the safety assurance and reliability of the experimental design basis. (author)

  2. State analysis requirements database for engineering complex embedded systems

    Science.gov (United States)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a tool for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.

  3. Evaluation report on research and development of a database system for mutual computer operation; Denshi keisanki sogo un'yo database system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    This paper describes an evaluation of the research and development of a database system for mutual computer operation, with respect to discrete database technology, multi-media technology, high-reliability technology, and mutual operation network system technology. A large number of forward-looking research results were derived, covering such issues as the distribution and utilization patterns of the discrete database, the structuring of data for multi-media information, retrieval systems, flexible and high-level utilization of the network, and database protection. These achievements are widely disclosed to the public. The most important feature of this project is its aim of forming a network system that can be operated mutually in a multi-vendor environment. Therefore, the research and development has been executed in the spirit of openness to the public and international cooperation. These efforts are represented by the organization of the rule establishment committee, the execution of mutual interconnection experiments (including demonstration evaluation), and the development of implementation rules based on the ISO 'open systems interconnection (OSI)' standard. The results are compiled in the JIS as the basic reference model for open systems interconnection, and the targets shown in the basic plan have been achieved sufficiently. (NEDO)

  4. Trends and progress in system identification

    CERN Document Server

    Eykhoff, Pieter

    1981-01-01

    Trends and Progress in System Identification is a three-part book that focuses on model considerations, identification methods, and experimental conditions involved in system identification. Organized into 10 chapters, this book begins with a discussion of the model method in system identification, citing four examples differing in the nature of the models involved, the nature of the fields, and their goals. Subsequent chapters describe the most important aspects of model theory; the 'classical' methods and time series estimation; application of least squares and related techniques for the e

  5. Thermodynamic database for the Co-Pr system.

    Science.gov (United States)

    Zhou, S H; Kramer, M J; Meng, F Q; McCallum, R W; Ott, R T

    2016-03-01

    In this article, we describe (1) the compositions of both as-cast and heat-treated specimens, summarized in Table 1; (2) the determined enthalpy of mixing of the liquid phase, listed in Table 2; and (3) the thermodynamic database of the Co-Pr system in TDB format for the research article entitled Chemical partitioning for the Co-Pr system: First-principles, experiments and energetic calculations to investigate the hard magnetic phase W.

  6. Musculoskeletal disorders as underlying cause of death in 58 countries, 1986-2011: trend analysis of WHO mortality database.

    Science.gov (United States)

    Kiadaliri, Aliasghar A; Woolf, Anthony D; Englund, Martin

    2017-02-02

    Due to the low mortality rate of musculoskeletal disorders (MSK), less attention has been paid to MSK as an underlying cause of death in the general population. The aim was to examine the trend in MSK as an underlying cause of death in 58 countries across the globe during 1986-2011. Data on mortality were collected from the WHO mortality database and population data were obtained from the United Nations. Annual sex-specific age-standardized mortality rates (ASMR) were calculated by means of direct standardization using the WHO world standard population. We applied joinpoint regression analysis for trend analysis. Between-country disparities were examined using between-country variance and the Gini coefficient. The changes in the number of MSK deaths between 1986 and 2011 were decomposed using two counterfactual scenarios. The number of MSK deaths increased by 67% between 1986 and 2011, mainly due to population aging. The mean ASMR changed from 17.2 and 26.6 per million in 1986 to 18.1 and 25.1 in 2011 among men and women, respectively (median: 7.3% increase in men and 9.0% reduction in women). Declines in ASMR of 25% or more were observed for men (women) in 13 (19) countries, while corresponding increases were seen for men (women) in 25 (14) countries. In both sexes, ASMR declined during 1986-1997, then increased during 1997-2001 and again declined over 2001-2011. Despite the decline over time, there were substantial between-country disparities in MSK mortality and its temporal trend. We found substantial variations in MSK mortality and its trends between countries and regions, and also between sex and age groups. Greater awareness and better management of MSK might partly explain the reduction in MSK mortality, but the variations across countries warrant further investigation.
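    For readers unfamiliar with direct standardization, the sketch below computes an age-standardized mortality rate from age-specific deaths and person-years using fixed standard-population weights. All numbers are illustrative, and the real WHO world standard population uses many more age bands than shown here.

```python
# Direct standardization: weight each age-specific rate by the share of
# that age band in a fixed standard population, then sum.

# (age band, deaths, person-years) -- illustrative values only.
observed = [
    ("0-29",  2,  400_000),
    ("30-59", 10, 350_000),
    ("60+",   40, 250_000),
]

# Standard-population weights for the same bands (must sum to 1).
std_weights = {"0-29": 0.50, "30-59": 0.35, "60+": 0.15}

asmr_per_million = sum(
    std_weights[band] * deaths / pop * 1_000_000
    for band, deaths, pop in observed
)
print(f"Age-standardized mortality rate: {asmr_per_million:.1f} per million")
```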

  7. A method for resetting the root-level password in the MySQL relational database management system (RDBMS)

    Directory of Open Access Journals (Sweden)

    Taqwa Hariguna

    2011-08-01

    A database is an essential means of storing data; with a database, an organization gains benefits in several respects, such as faster access and reduced paper use. However, once a database is implemented, it is not uncommon for the database administrator to forget the password in use, which complicates database maintenance. This study aims to explore how to reset the root-level password in the MySQL relational database management system.
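    A commonly documented way to recover a lost MySQL root password is to restart the server with privilege checks disabled and then set a new password. The sketch below wraps that procedure in Python; it assumes a local installation with mysqld_safe, the mysql client and mysqladmin on the PATH, sufficient operating-system privileges, and MySQL 5.7-or-later syntax, so it should be read as an outline rather than a drop-in script.

```python
import subprocess
import time

NEW_PASSWORD = "ChangeMe!2024"  # hypothetical replacement password

# 1. Start the server with privilege checks disabled (the normal server
#    instance must already be stopped); --skip-networking limits exposure.
server = subprocess.Popen(
    ["mysqld_safe", "--skip-grant-tables", "--skip-networking"]
)
time.sleep(10)  # crude wait for the server to come up

# 2. Reload the grant tables, then set the new root password.
sql = (
    "FLUSH PRIVILEGES; "
    f"ALTER USER 'root'@'localhost' IDENTIFIED BY '{NEW_PASSWORD}';"
)
subprocess.run(["mysql", "-u", "root", "-e", sql], check=True)

# 3. Shut the temporary instance down; restart MySQL normally afterwards.
subprocess.run(
    ["mysqladmin", "-u", "root", f"--password={NEW_PASSWORD}", "shutdown"],
    check=True,
)
server.wait()
```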

  8. Selecting a Relational Database Management System for Library Automation Systems.

    Science.gov (United States)

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  9. Spent fuel composition database system on WWW. SFCOMPO on WWW Ver.2

    Energy Technology Data Exchange (ETDEWEB)

    Mochizuki, Hiroki [Japan Research Institute, Ltd., Tokyo (Japan); Suyama, Kenya; Nomura, Yasushi; Okuno, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-08-01

    'SFCOMPO on WWW Ver.2' is an advanced version of 'SFCOMPO on WWW' ('Spent Fuel Composition Database System on WWW') released in 1997. This new version has a function of database management by an introduced relational database software 'PostgreSQL' and has various searching methods. All of the data required for the calculation of isotopic composition is available from the web site of this system. This report describes the outline of this system and the searching method using Internet. In addition, the isotopic composition data and the reactor data of the 14 LWRs (7 PWR and 7 BWR) registered in this system are described. (author)

  10. Development of environment radiation database management system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun [Daeduk College, Taejon (Korea, Republic of)

    1999-03-15

    In this development, we constructed a database for efficient processing and operation of radiation-environment related data. We developed a source document retrieval system and a current status reporting system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on the WWW service policies of the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  11. Development of environment radiation database management system

    International Nuclear Information System (INIS)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun

    1999-03-01

    In this development, we constructed a database for efficient processing and operation of radiation-environment related data. We developed a source document retrieval system and a current status reporting system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on the WWW service policies of the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation

  12. System factors influencing utilisation of Research4Life databases by ...

    African Journals Online (AJOL)

    This is a comprehensive investigation of the influence of system factors on utilisation of Research4Life databases. It is part of a doctoral dissertation. Research4Life databases are new innovative technologies being investigated in a new context – utilisation by NARIs scientists for research. The study adopted the descriptive ...

  13. Current status of system development to provide databases of nuclides migration

    International Nuclear Information System (INIS)

    Sasamoto, Hiroshi; Yoshida, Yasushi; Isogai, Takeshi; Suyama, Tadahiro; Shibata, Masahiro; Yui, Mikazu; Jintoku, Takashi

    2005-01-01

    JNC has developed databases of nuclide migration for the safety assessment of high-level radioactive waste (HLW) repositories, and they were used in the second progress report to present the technical reliability of the HLW geological disposal system in Japan. The technical level and applicability of the databases have been highly evaluated, even overseas. To make the databases broadly available worldwide and to promote their use, we have done the following: 1) developed tools to convert the database format from the geochemical code PHREEQE to PHREEQC, GWB and EQ3/6, and 2) set up a web site (http://migrationdb.jnc.go.jp) which enables the public to access the databases. As a result, the number of database users has significantly increased. Additionally, a number of useful comments from the users can be applied to the modification and/or update of the databases. (author)

  14. Databases in Cloud - Solutions for Developing Renewable Energy Informatics Systems

    Directory of Open Access Journals (Sweden)

    Adela BARA

    2017-08-01

    The paper presents the data model of a decision support prototype developed for generation monitoring, forecasting and advanced analysis in the renewable energy field. The solutions considered for developing this system include databases in the cloud, XML integration, spatial data representation and multidimensional modeling. This material shows the advantages of cloud databases and spatial data representation and their implementation in Oracle Database 12c. It also contains a data integration part and a multidimensional analysis. The presentation of output data is made using dashboards.

  15. Investigation on construction of the database system for research and development of the global environment industry technology; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system no kochiku ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    This paper studies a concrete plan to introduce a new database system at the Research Institute of Innovative Technology for the Earth (RITE), which is necessary to promote the industrial technology development contributing to the solution of global environmental problems. The specifications for system introduction cover vendor selection, the operation system, a detailed introduction schedule, etc. The RITE in-house database has problems with its operation system and maintenance cost, and its construction cost tends to be high in comparison with its utilization factor; its introduction therefore requires further study. The in-house database provides only information owned by the organization, while information from outside the organization is provided by external databases. Information is registered and selected by the registrant. The access network will initially be a personal computer network and is planned to move to the Internet in the future. For practical construction of the system, it is necessary to clarify users' detailed needs for the system design and to coordinate functions between hardware systems. 32 figs., 9 tabs.

  16. Database Capture of Natural Language Echocardiographic Reports: A Unified Medical Language System Approach

    OpenAIRE

    Canfield, K.; Bray, B.; Huff, S.; Warner, H.

    1989-01-01

    We describe a prototype system for semi-automatic database capture of free-text echocardiography reports. The system is very simple and uses a Unified Medical Language System compatible architecture. We use this system and a large body of texts to create a patient database and develop a comprehensive hierarchical dictionary for echocardiography.

  17. Documentation for the U.S. Geological Survey Public-Supply Database (PSDB): A database of permitted public-supply wells, surface-water intakes, and systems in the United States

    Science.gov (United States)

    Price, Curtis V.; Maupin, Molly A.

    2014-01-01

    The U.S. Geological Survey (USGS) has developed a database containing information about wells, surface-water intakes, and distribution systems that are part of public water systems across the United States, its territories, and possessions. Programs of the USGS such as the National Water Census, the National Water Use Information Program, and the National Water-Quality Assessment Program all require a complete and current inventory of public water systems, the sources of water used by those systems, and the size of populations served by the systems across the Nation. Although the U.S. Environmental Protection Agency’s Safe Drinking Water Information System (SDWIS) database already exists as the primary national Federal database for information on public water systems, the Public-Supply Database (PSDB) was developed to add value to SDWIS data with enhanced location and ancillary information, and to provide links to other databases, including the USGS’s National Water Information System (NWIS) database.

  18. System Study: Emergency Power System 1998-2014

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk Assessment and Management Services Dept.

    2015-12-01

    This report presents an unreliability evaluation of the emergency power system (EPS) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period while yearly estimates for system unreliability are provided for the entire active period. An extremely statistically significant increasing trend was observed for EPS system unreliability for an 8-hour mission. A statistically significant increasing trend was observed for EPS system start-only unreliability.
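    As a toy illustration of trending yearly unreliability estimates, the sketch below fits a straight line to synthetic yearly values and reports the slope and its p-value. The numbers are invented and this is not the statistical model used in the INL system studies.

```python
from scipy.stats import linregress

# Synthetic yearly EPS unreliability estimates (8-hour mission), 2005-2014.
years = list(range(2005, 2015))
unreliability = [0.0021, 0.0022, 0.0024, 0.0023, 0.0026,
                 0.0027, 0.0029, 0.0028, 0.0031, 0.0033]

fit = linregress(years, unreliability)
print(f"slope per year: {fit.slope:.2e}, p-value: {fit.pvalue:.3g}")

# A small p-value (e.g. < 0.05) with a positive slope would be read as a
# statistically significant increasing trend over the trending window.
if fit.pvalue < 0.05 and fit.slope > 0:
    print("Increasing trend is statistically significant.")
```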

  19. A guide to Internet atomic databases for hot plasmas

    International Nuclear Information System (INIS)

    Ralchenko, Yuri

    2006-01-01

    Internet atomic databases are nowadays considered to be the primary tool for dissemination of atomic data. We present here a review of numerical and bibliographic databases of importance for diagnostics of hot plasmas. Special attention is given to new and emerging trends, such as online calculation of various atomic parameters. The recently updated NIST databases are presented in detail

  20. A guide to Internet atomic databases for hot plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Ralchenko, Yuri [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)]. E-mail: yuri.ralchenko@nist.gov

    2006-05-15

    Internet atomic databases are nowadays considered to be the primary tool for dissemination of atomic data. We present here a review of numerical and bibliographic databases of importance for diagnostics of hot plasmas. Special attention is given to new and emerging trends, such as online calculation of various atomic parameters. The recently updated NIST databases are presented in detail.

  1. Long Duration Exposure Facility (LDEF) optical systems SIG summary and database

    Science.gov (United States)

    Bohnhoff-Hlavacek, Gail

    1992-01-01

    The main objectives of the Long Duration Exposure Facility (LDEF) Optical Systems Special Investigative Group (SIG) Discipline are to develop a database of experimental findings on LDEF optical systems and elements hardware, and provide an optical system overview. Unlike the electrical and mechanical disciplines, the optics effort relies primarily on the testing of hardware at the various principal investigator's laboratories, since minimal testing of optical hardware was done at Boeing. This is because all space-exposed optics hardware are part of other individual experiments. At this time, all optical systems and elements testing by experiment investigator teams is not complete, and in some cases has hardly begun. Most experiment results to date, document observations and measurements that 'show what happened'. Still to come from many principal investigators is a critical analysis to explain 'why it happened' and future design implications. The original optical system related concerns and the lessons learned at a preliminary stage in the Optical Systems Investigations are summarized. The design of the Optical Experiments Database and how to acquire and use the database to review the LDEF results are described.

  2. Long Duration Exposure Facility (LDEF) optical systems SIG summary and database

    Science.gov (United States)

    Bohnhoff-Hlavacek, Gail

    1992-09-01

    The main objectives of the Long Duration Exposure Facility (LDEF) Optical Systems Special Investigative Group (SIG) Discipline are to develop a database of experimental findings on LDEF optical systems and elements hardware, and provide an optical system overview. Unlike the electrical and mechanical disciplines, the optics effort relies primarily on the testing of hardware at the various principal investigator's laboratories, since minimal testing of optical hardware was done at Boeing. This is because all space-exposed optics hardware are part of other individual experiments. At this time, all optical systems and elements testing by experiment investigator teams is not complete, and in some cases has hardly begun. Most experiment results to date, document observations and measurements that 'show what happened'. Still to come from many principal investigators is a critical analysis to explain 'why it happened' and future design implications. The original optical system related concerns and the lessons learned at a preliminary stage in the Optical Systems Investigations are summarized. The design of the Optical Experiments Database and how to acquire and use the database to review the LDEF results are described.

  3. INTEGRATED HSEQ MANAGEMENT SYSTEMS: DEVELOPMENTS AND TRENDS

    OpenAIRE

    Osmo Kauppila; Janne Härkönen; Seppo Väyrynen

    2015-01-01

    The integration of health and safety, environmental and quality (HSEQ) management systems has become a current topic in the 21st century, as the need for systems thinking has grown along with the number of management system standards. This study aims to map current developments and trends in integrated HSEQ management. Three viewpoints are taken: the current state of the main HSEQ management standards, research literature on integrated management systems (IMS), and a case study of an industry...

  4. Establishing the user requirements for the research reactor decommissioning database system

    International Nuclear Information System (INIS)

    Park, S. K.; Park, H. S.; Lee, G. W.; Park, J. H.

    2002-01-01

    In general, a great deal of information and data is generated during decommissioning activities, and a systematic electronic system is needed to manage it. A database system for managing the decommissioning information and data from the KRR-1 and 2 decommissioning project is now being developed. All information and data will be entered into this database system and can be retrieved from it. For developing the DB system, the basic concept and user requirements were established, and a scheme for categorizing the information and data was then set up. The entities of the tables for data input were identified, categorized and then converted into codes. An ERD (Entity Relationship Diagram) was also prepared to show their relations. To develop the user interface system for data retrieval, the relation between the input and output data should be analyzed. As a result of this study, the items of the output tables were established and categorized according to the requirements of the user interface system for the decommissioning information and data. These tables will be used for designing the prototype and will be refined through several rounds of feedback to establish the decommissioning database system

  5. Military space power systems technology trends and issues

    International Nuclear Information System (INIS)

    Barthelemy, R.R.; Massie, L.D.

    1985-01-01

    This paper assesses baseload and above-baseload (alert, active, pulsed and burst mode) power system options, places them in logical perspective relative to power level and operating time, discusses power systems technology state-of-the-art and trends and finally attempts to project future (post 2000) space power system capabilities

  6. Design of special purpose database for credit cooperation bank business processing network system

    Science.gov (United States)

    Yu, Yongling; Zong, Sisheng; Shi, Jinfa

    2011-12-01

    With the popularization of e-finance in cities, its construction is shifting to the vast rural market and is quickly developing in depth. Developing a business processing network system suitable for rural credit cooperative banks can make business processing convenient and has good application prospects. In this paper, we analyse the necessity of adopting a special purpose distributed database in a credit cooperative bank system, give the corresponding distributed database system structure, and design the special purpose database and the interface technology. The application in Tongbai Rural Credit Cooperatives has shown that the system has better performance and higher efficiency.

  7. Logical database design principles

    CERN Document Server

    Garmany, John; Clark, Terry

    2005-01-01

    INTRODUCTION TO LOGICAL DATABASE DESIGN: Understanding a Database; Database Architectures; Relational Databases; Creating the Database; System Development Life Cycle (SDLC); Systems Planning: Assessment and Feasibility; System Analysis: Requirements; System Analysis: Requirements Checklist; Models Tracking and Schedules; Design Modeling; Functional Decomposition Diagram; Data Flow Diagrams; Data Dictionary; Logical Structures and Decision Trees; System Design: Logical. SYSTEM DESIGN AND IMPLEMENTATION: The ER Approach; Entities and Entity Types; Attribute Domains; Attributes; Set-Valued Attributes; Weak Entities; Constraint

  8. Coupling an Unstructured NoSQL Database with a Geographic Information System

    OpenAIRE

    Holemans, Amandine; Kasprzyk, Jean-Paul; Donnay, Jean-Paul

    2018-01-01

    The management of unstructured NoSQL (Not only Structured Query Language) databases has developed considerably in recent years, mainly thanks to Big Data. Nevertheless, the specific nature of spatial information is not purposely taken into account. To overcome this difficulty, we propose to couple a NoSQL database with a spatial Relational Database Management System (RDBMS). Exchanges of information between these two systems are illustrated with relevant examples ...

  9. Establishment of Database System for Radiation Oncology

    International Nuclear Information System (INIS)

    Kim, Dae Sup; Lee, Chang Ju; Yoo, Soon Mi; Kim, Jong Min; Lee, Woo Seok; Kang, Tae Young; Back, Geum Mun; Hong, Dong Ki; Kwon, Kyung Tae

    2008-01-01

    The purpose is to improve operational efficiency and establish a basis for the development of new radiotherapy treatments through a database built by arranging and indexing radiotherapy-related records in a well-organized manner for easy user access. In this study, the Access program provided by Microsoft (MS Office Access) was used to operate the database. The radiation oncology data were organized into business logs, maintenance expenditure records and accessory stock management with respect to administrative and machinery management. Data for education and research were organized into educational material for department duties, user manuals and related theses, depending on their nature. Data registration was designed around input forms organized by subject, and the stored information can be inspected by generating reports. The number of machine failures and the corresponding repair hours recorded in the maintenance expenditure records for the period from January 2008 to April 2009 were analyzed, comparing initial system usage with usage one year later. The radiation oncology database system was completed by distinguishing work-related and research-related criteria. The data are arranged and collected according to subject and class, and the required data can be accessed by searching with reference to the descriptions under each criterion. The average time needed to analyze repair hours from the number and type of machine failures in the period from January 2008 to April 2009 was reduced by 32.3%. By classifying and indexing present and past data according to these criteria through the database system for radiation oncology, information becomes easily accessible, operational efficiency is improved, and the system can further serve as a basis for improving work processes by providing, in real time, the various information required for new radiotherapy treatments.
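    A minimal sketch of the kind of failure-count and repair-hour summary mentioned above, computed here from in-memory records with made-up values rather than from the actual Access database:

```python
from collections import defaultdict

# (machine, failure_type, repair_hours) -- illustrative records only.
maintenance_log = [
    ("LINAC-1", "MLC fault", 3.5),
    ("LINAC-1", "cooling",   1.0),
    ("LINAC-2", "MLC fault", 2.0),
    ("LINAC-1", "MLC fault", 4.5),
]

summary = defaultdict(lambda: {"failures": 0, "hours": 0.0})
for machine, failure_type, hours in maintenance_log:
    key = (machine, failure_type)
    summary[key]["failures"] += 1
    summary[key]["hours"] += hours

for (machine, failure_type), stats in sorted(summary.items()):
    print(f"{machine:8s} {failure_type:10s} "
          f"failures={stats['failures']} repair_hours={stats['hours']:.1f}")
```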

  10. Research on the establishment of the database system for R and D on the innovative technology for the earth; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    For the purpose of structuring a database system of technical information on global environmental issues, the 'database system for R and D of the earth environmental industrial technology' was operationally evaluated, and a study was made on opening it to users and structuring a prototype database. As pointed out in the operational evaluation, in the present state the frequency of use is not high, due to lack of UNIX experience, the absence of system managers and a shortage of usable articles, so updating of the database does not progress as intended. Therefore, a study was made on introducing tools usable by the initiators and opening an information access terminal to the researchers at headquarters via the Internet. In order for earth environment-related researchers to obtain information easily, a prototype database was structured to support research exchange. The tasks to be undertaken for selecting the fields of research and compiling common thesauri in Japanese, Western and other languages were clarified. 28 figs., 16 tabs.

  11. Review of trends in computerized systems for operator support

    International Nuclear Information System (INIS)

    Cain, D.G.

    1985-01-01

    The major trends shaping the development of computerized operator support systems in nuclear power plants are reviewed. These trends are the result of prior research in disturbance analysis systems, which provided the technology base, and the SPDS requirement, which has been the impetus for change. The process is expected to result in hybrid control rooms with computer-driven supervisory workstations that complement conventional control board layouts. In the next three to five years, substantial upgrading of computer hardware will allow new and more sophisticated application routines to be developed for operator support. Greater attention is being given to on-line validation of input signals for computer applications. A general movement towards operating strategies that are not based upon pre-analyzed event sequences is expected to influence the development of operator aids. The integration of displays with operating procedures will enable the computer system to provide a better coupling between problem detection and its resolution. Improved design methodology will assure that computer applications are accepted and used by operations personnel. Greater on-line analysis capability is stimulating the trend towards more on-site analysis and decision-making at nuclear power plants. Software standardization reflects the high cost of software development and the desire by utilities to gain greater independence from suppliers. There is growing realization that control rooms are beset by many of the demands and limitations of other office settings and that some of these may be addressed by the burgeoning office automation technology. Trends beyond the next five years are difficult to predict; however, there will be a trend towards more intelligent software. Artificial intelligence technology may play a pivotal role in future applications. Taking these trends into perspective, the author concludes that a promising future exists for computerized operator support in nuclear power plants.

  12. Examples of use of the database

    Energy Technology Data Exchange (ETDEWEB)

    Gillemot, F [Atomic Energy Research Inst., Budapest (Hungary); Davies, L M [Davies Consultants, Oxford (United Kingdom)

    1997-09-01

    Databases on ageing are generally used for the elaboration of trend curves and the development of new steel types. They can also be used to enhance pressurized thermal shock (PTS) evaluations. A more detailed PTS evaluation yields a longer calculated lifetime, which allows utilities to decrease the cost of life management efforts. The paper introduces three examples of database use related to PTS evaluation. (author). 4 refs, 8 figs, 1 tab.

  13. Implementation of dragon-I database system based on B/S model

    International Nuclear Information System (INIS)

    Jiang Wei; Lai Qinggui; Chen Nan; Gao Feng

    2010-01-01

    A browser/server (B/S) architecture is used in the database system of 'Dragon-I'. The dynamic web software is built with ASP.NET and is divided into three main tiers: a user interface tier, a business logic tier, and a data access tier. Accelerator status data and the data generated during experiments are managed with the SQL Server DBMS, and the database is accessed through ADO.NET. The facility status, control parameters, and test waveforms can be queried by experiment number and experiment time. The requirements for storage, management, browsing, querying, and offline analysis are all implemented in this B/S-based database system. (authors)
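
    As a minimal sketch of the query-by-experiment-number pattern described above (not the Dragon-I implementation itself), the snippet below uses Python's sqlite3 standing in for the SQL Server / ADO.NET data-access tier; the table and column names (shots, exp_no, shot_time, status, waveform) are hypothetical.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE shots (
                            exp_no    INTEGER,
                            shot_time TEXT,
                            status    TEXT,
                            waveform  BLOB)""")
        conn.execute("INSERT INTO shots VALUES (1024, '2010-05-01 09:30:00', 'OK', NULL)")

        def query_shots(conn, exp_no, t_start, t_end):
            """Data-access-tier helper: fetch shots for one experiment in a time range."""
            cur = conn.execute(
                "SELECT exp_no, shot_time, status FROM shots "
                "WHERE exp_no = ? AND shot_time BETWEEN ? AND ?",
                (exp_no, t_start, t_end))
            return cur.fetchall()

        print(query_shots(conn, 1024, '2010-05-01 00:00:00', '2010-05-02 00:00:00'))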

  14. Using decision-tree classifier systems to extract knowledge from databases

    Science.gov (United States)

    St.clair, D. C.; Sabharwal, C. L.; Hacke, Keith; Bond, W. E.

    1990-01-01

    One difficulty in applying artificial intelligence techniques to the solution of real world problems is that the development and maintenance of many AI systems, such as those used in diagnostics, require large amounts of human resources. At the same time, databases frequently exist which contain information about the process(es) of interest. Recently, efforts to reduce development and maintenance costs of AI systems have focused on using machine learning techniques to extract knowledge from existing databases. Research is described in the area of knowledge extraction using a class of machine learning techniques called decision-tree classifier systems. Results of this research suggest ways of performing knowledge extraction which may be applied in numerous situations. In addition, a measurement called the concept strength metric (CSM) is described which can be used to determine how well the resulting decision tree can differentiate between the concepts it has learned. The CSM can be used to determine whether or not additional knowledge needs to be extracted from the database. An experiment involving real world data is presented to illustrate the concepts described.
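
    The abstract does not give the formula for the concept strength metric (CSM), so the sketch below uses an average leaf-purity score as a hypothetical stand-in: a decision tree is fit to synthetic records, and the purity of the leaf each record falls into indicates how well the tree differentiates the learned concepts. It assumes scikit-learn and numpy are installed; the data are invented.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))                  # 200 database records, 4 attributes
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # two "concepts" to differentiate

        tree = DecisionTreeClassifier(max_depth=3).fit(X, y)

        # Class distribution of the leaf each record falls into; mean purity is a
        # crude indicator of how well the tree separates the concepts it learned.
        leaf_ids = tree.apply(X)
        leaf_dist = tree.tree_.value[leaf_ids, 0, :]
        purity = leaf_dist.max(axis=1) / leaf_dist.sum(axis=1)
        print("mean leaf purity (CSM-like score): %.3f" % purity.mean())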

  15. Development of database systems for safety of repositories for disposal of radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yeong Hun; Han, Jeong Sang; Shin, Hyeon Jun; Ham, Sang Won; Kim, Hye Seong [Yonsei Univ., Seoul (Korea, Republic of)

    1999-03-15

    In this study, a GSIS is developed to maximize the effectiveness of the database system. For this purpose, spatial relations are established among the data from various fields stored in the database, which was developed for the site selection and management of a repository for radioactive waste disposal. By constructing an integrated system that links attribute and spatial data, it becomes possible to evaluate the safety of the repository effectively and economically. The suitability of integrating the database with the GSIS is examined by constructing the database for a test district whose site characteristics are similar to those of a repository for radioactive waste disposal.

  16. BtoxDB: a comprehensive database of protein structural data on toxin-antitoxin systems.

    Science.gov (United States)

    Barbosa, Luiz Carlos Bertucci; Garrido, Saulo Santesso; Marchetto, Reinaldo

    2015-03-01

    Toxin-antitoxin (TA) systems are diverse and abundant genetic modules in prokaryotic cells that are typically formed by two genes encoding a stable toxin and a labile antitoxin. Because TA systems are able to repress growth or kill cells and are considered to be important actors in cell persistence (multidrug resistance without genetic change), these modules are considered potential targets for alternative drug design. In this scenario, structural information for the proteins in these systems is highly valuable. In this report, we describe the development of a web-based system, named BtoxDB, that stores all protein structural data on TA systems. The BtoxDB database was implemented as a MySQL relational database using PHP scripting language. Web interfaces were developed using HTML, CSS and JavaScript. The data were collected from the PDB, UniProt and Entrez databases. These data were appropriately filtered using specialized literature and our previous knowledge about toxin-antitoxin systems. The database provides three modules ("Search", "Browse" and "Statistics") that enable searches, acquisition of contents and access to statistical data. Direct links to matching external databases are also available. The compilation of all protein structural data on TA systems in one platform is highly useful for researchers interested in this content. BtoxDB is publicly available at http://www.gurupi.uft.edu.br/btoxdb. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. CRITICAL ASSESSMENT OF AUDITING CONTRIBUTIONS TO EFFECTIVE AND EFFICIENT SECURITY IN DATABASE SYSTEMS

    OpenAIRE

    Olumuyiwa O. Matthew; Carl Dudley

    2015-01-01

    Database auditing has become a crucial aspect of security as organisations increase their adoption of database management systems (DBMS) as a major asset that keeps, maintains, and monitors sensitive information. Database auditing is the group of activities involved in observing a set of stored data in order to be aware of the actions of users. The work presented here outlines the main auditing techniques and methods. Some architecture-based auditing systems were also consider...

  18. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

    Introduction to Database Systems; Functions of a Database; Database Management System; Database Components; Database Development Process; Conceptual Design and Data Modeling; Introduction to Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with Entity-Relationship Model; Table Structure and Normalization; Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Transforming Data Models to Relational Databases; Enforcing Constraints; Creating Database for Business Process; Physical Design and Database

  19. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  20. Ultra-Structure database design methodology for managing systems biology data and analyses

    Directory of Open Access Journals (Sweden)

    Hemminger Bradley M

    2009-08-01

    Full Text Available Abstract Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find
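
    The sketch below illustrates, in a hedged and much-simplified way, the core Ultra-Structure idea described above: process rules live in an ordinary table next to the data, and a small generic engine interprets them, so behaviour changes when rows change rather than code. The "ruleform" layout, table names, and example rules are invented for illustration.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE peptide_map (peptide TEXT, chrom TEXT, start INT, end INT)")
        db.execute("""CREATE TABLE rules (name TEXT, if_column TEXT, if_value TEXT,
                                          then_action TEXT)""")
        db.execute("INSERT INTO peptide_map VALUES ('MKTAYIAK', 'chr1', 100, 124)")
        db.execute("INSERT INTO rules VALUES ('flag_chr1', 'chrom', 'chr1', 'tag:nuclear')")

        def run_rules(db):
            """Generic engine: behaviour changes when rows in `rules` change, not code.
            (Column names are interpolated directly only because this is a sketch.)"""
            tags = []
            for name, col, val, action in db.execute("SELECT * FROM rules"):
                rows = db.execute(
                    f"SELECT peptide FROM peptide_map WHERE {col} = ?", (val,)).fetchall()
                tags += [(peptide, action, name) for (peptide,) in rows]
            return tags

        print(run_rules(db))   # [('MKTAYIAK', 'tag:nuclear', 'flag_chr1')]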

  1. A new database sub-system for grain-size analysis

    Science.gov (United States)

    Suckow, Axel

    2013-04-01

    Detailed grain-size analyses of large depth profiles for palaeoclimate studies create large amounts of data. For instance, Novothny et al. (2011) presented a depth profile of grain-size analyses with 2 cm resolution and a total depth of more than 15 m, where each sample was measured with 5 repetitions on a Beckman Coulter LS13320 with 116 channels. This adds up to a total of more than four million numbers. Such amounts of data are not easily post-processed by spreadsheets or standard software, and even MS Access databases would face serious performance problems. The poster describes a database sub-system dedicated to grain-size analyses. It expands the LabData database and laboratory management system published by Suckow and Dumke (2001). Compatibility with this very flexible database system makes it easy to import the grain-size data, provides the overall infrastructure for storing geographic context, and allows content to be organized, for example by combining several samples into one set or project. It also allows easy export and direct plot generation of final data in MS Excel. The sub-system allows automated import of raw data from the Beckman Coulter LS13320 Laser Diffraction Particle Size Analyzer. During post-processing, MS Excel is used as a data display, but no number crunching is implemented in Excel. Raw grain-size spectra can be exported and checked as number, surface, and volume fractions, while single spectra can be locked for further post-processing. From the spectra, the usual statistical values (e.g. mean, median) can be computed, as well as fractions larger than a grain size, smaller than a grain size, fractions between any two grain sizes, or any ratio of such values. These derived values can be easily exported into Excel for one or more depth profiles. However, such reprocessing of large amounts of data also allows new display possibilities: normally depth profiles of grain-size data are displayed only with summarized parameters like the clay
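
    A minimal sketch of the derived quantities mentioned above: given one volume-fraction spectrum over grain-size channels, it computes the volume-weighted mean, the median, and the fraction finer than a chosen grain size. The 116-channel grid and the spectrum are synthetic, not instrument data.

        import numpy as np

        sizes = np.logspace(-1, 3, 116)              # channel centres in micrometres
        spectrum = np.exp(-((np.log10(sizes) - 1.3) ** 2) / 0.2)
        spectrum /= spectrum.sum()                   # normalise to volume fractions

        mean_size = float((sizes * spectrum).sum())  # volume-weighted mean grain size
        cdf = np.cumsum(spectrum)
        median_size = float(np.interp(0.5, cdf, sizes))
        clay_fraction = float(spectrum[sizes < 2.0].sum())   # fraction finer than 2 µm

        print(f"mean {mean_size:.1f} µm, median {median_size:.1f} µm, "
              f"<2 µm fraction {clay_fraction:.3f}")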

  2. CD-ROM-aided Databases

    Science.gov (United States)

    Masuyama, Keiichi

    CD-ROM has rapidly evolved as a new information medium with large capacity. In the U.S. it is predicted to become a two-hundred-billion-yen market within three years, and CD-ROM is thus a strategic target of the database industry. In Japan, the movement toward its commercialization has been active since this year. Will the CD-ROM business ever conquer the information market as an on-disk database or electronic publication? Referring to some application cases in the U.S., the author discusses the marketability and future trend of this new optical disk medium.

  3. Palantiri: a distributed real-time database system for process control

    International Nuclear Information System (INIS)

    Tummers, B.J.; Heubers, W.P.J.

    1992-01-01

    The medium-energy accelerator MEA, located in Amsterdam, is controlled by a heterogeneous computer network. A large real-time database contains the parameters involved in the control of the accelerator and the experiments. This database system was implemented about ten years ago and has since been extended several times. In response to increased needs the database system has been redesigned. The new database environment, as described in this paper, consists of two new concepts: (1) a Palantir, a per-machine process that stores the locally declared data and forwards all non-local requests for data access to the appropriate machine. It acts as a storage device for data and a looking glass upon the world. (2) Golems: working units that define the data within the Palantir and that have knowledge of the hardware they control. Applications access the data of a Golem by name (the names resemble Unix path names). The Palantir that runs on the same machine as the application handles the distribution of access requests. This paper focuses on the Palantir concept as a distributed data storage and event handling device for process control. (author)
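
    The toy sketch below illustrates the forwarding idea described above under strong simplifying assumptions: real Palantirs are separate processes on a network, whereas here plain Python objects stand in for them, and the machine and path names are invented.

        class Palantir:
            def __init__(self, machine, network):
                self.machine = machine          # e.g. "mea1"
                self.local = {}                 # locally declared data
                self.network = network
                network[machine] = self

            def declare(self, path, value):     # a "Golem" would normally do this
                self.local[path] = value

            def read(self, path):
                machine = path.split("/")[1]    # "/mea1/rf/phase" -> "mea1"
                if machine == self.machine:
                    return self.local[path]
                return self.network[machine].read(path)   # forward non-local request

        net = {}
        a, b = Palantir("mea1", net), Palantir("mea2", net)
        a.declare("/mea1/rf/phase", 42.0)
        print(b.read("/mea1/rf/phase"))   # 42.0, resolved via forwarding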

  4. Evaluation report on research and development of a database system for mutual computer operation; Denshi keisanki sogo un'yo database system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    This paper describes the evaluation of the research and development of a database system for mutual computer operation, covering distributed database technology, multimedia technology, high-reliability technology, and interoperable network system technology. A large number of forward-looking research results were obtained, such as the issues of distribution and utilization patterns of the distributed database, structuring of data for multimedia information, retrieval systems, flexible and high-level utilization of the network, and issues in database protection. These achievements have been widely disclosed to the public. The most important feature of this project is its aim of forming a network system that can be operated interoperably in a multi-vendor environment. The research and development were therefore carried out in a spirit of openness to the public and international cooperation. These efforts are represented by the organization of a rule establishment committee, the execution of mutual interconnection experiments (including demonstration evaluation), and the development of implementation rules based on the ISO's Open Systems Interconnection (OSI). The results have been compiled in the JIS as the basic reference model for open systems interconnection, and the targets set in the basic plan have been sufficiently achieved. (NEDO)

  5. Trend and application of CAD/CAM system

    International Nuclear Information System (INIS)

    Kang, Man Ok

    1984-09-01

    This report is about the trend and application of CAD/CAM systems. It describes computer-aided design, which supports construction, engineering, and drafting tasks, and computer-aided manufacturing, which relates to the overall design of manufactured products and includes process design, production management, decisions on work technology, and processing. The need for and application of CAD/CAM systems is increasing steadily in every area of industry.

  6. Development of the software for the component reliability database system of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sang Hoon; Kim, Seung Hwan; Choi, Sun Young [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    A study was performed to develop a system for the component reliability database, which consists of a database to store the reliability data and software to analyze the reliability data. This system is a part of KIND (Korea Information System for Nuclear Reliability Database). The MS-SQL database is used to store the component population data, component maintenance history, and the results of reliability analysis. Two software applications were developed for the component reliability system. One is KIND-InfoView, for data storing, retrieval, and searching. The other is KIND-CompRel, for the statistical analysis of component reliability. 4 refs., 13 figs., 7 tabs. (Author)

  7. Rotator cuff repair in the Brazilian Unified Health System: Brazilian trends from 2003 to 2015

    Directory of Open Access Journals (Sweden)

    Eduardo Angeli Malavolta

    Full Text Available ABSTRACT OBJECTIVE: To assess the historical trend of rotator cuff repairs in Brazil between 2003 and 2015, using the database of the Brazilian Unified Health System's (Sistema Único de Saúde [SUS]) Department of Informatics (DataSUS). METHODS: Historical series using DataSUS. Surgeries performed between 2003 and 2015 were included, and data relating to cuff tear repair, including decompression procedures, were assessed. The numerator was the total number of rotator cuff repairs and the denominator the total population of the assessed locality. Population data were based on information from the Instituto Brasileiro de Geografia e Estatística (IBGE). RESULTS: During the period, 50,207 surgeries were performed. The rate, expressed as the number of procedures per 100,000 inhabitants, increased from 0.83 to 2.81, a growth of 238%. In 2015, the South region had the highest rate, 6.32, followed by the Southeast, 3.62, while the North had the lowest rate, 0.13. A growing trend can be observed in the Southeast, South, and Midwest, while the rate is stable in the North and Northeast. CONCLUSION: The rate of rotator cuff repairs in Brazil performed through the SUS increased from 0.83 to 2.81 between 2003 and 2015, representing a growth of 238%, but remains lower than that of developed countries. A trend of growth can be observed in the Southeast, South, and Midwest, while the rate is stable in the North and Northeast.

  8. A searching and reporting system for relational databases using a graph-based metadata representation.

    Science.gov (United States)

    Hewitt, Robin; Gobbi, Alberto; Lee, Man-Ling

    2005-01-01

    Relational databases are the current standard for storing and retrieving data in the pharmaceutical and biotech industries. However, retrieving data from a relational database requires specialized knowledge of the database schema and of the SQL query language. At Anadys, we have developed an easy-to-use system for searching and reporting data in a relational database to support our drug discovery project teams. This system is fast and flexible and allows users to access all data without having to write SQL queries. This paper presents the hierarchical, graph-based metadata representation and SQL-construction methods that, together, are the basis of this system's capabilities.
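

    As a hedged sketch of the general idea (not Anadys' actual implementation), the snippet below keeps a tiny metadata graph recording how tables join and turns a path through that graph into SQL, so a user never writes joins by hand. The table, column, and join names are invented for illustration.

        JOINS = {                      # edge: (table_a, table_b) -> join condition
            ("compound", "assay_result"): "compound.id = assay_result.compound_id",
            ("assay_result", "assay"):    "assay_result.assay_id = assay.id",
        }

        def build_sql(path, select_cols, where=""):
            """Construct a SELECT over a path of tables using the metadata graph."""
            sql = f"SELECT {', '.join(select_cols)} FROM {path[0]}"
            for a, b in zip(path, path[1:]):
                cond = JOINS.get((a, b)) or JOINS.get((b, a))
                sql += f" JOIN {b} ON {cond}"
            if where:
                sql += f" WHERE {where}"
            return sql

        print(build_sql(["compound", "assay_result", "assay"],
                        ["compound.name", "assay_result.ic50"],
                        "assay.name = 'HCV replicon'"))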

  9. Consumer Attitudes About Renewable Energy. Trends and Regional Differences

    Energy Technology Data Exchange (ETDEWEB)

    Bird, Lori [Natural Marketing Institute, Harleysville, PA (United States); Sumner, Jenny [Natural Marketing Institute, Harleysville, PA (United States)

    2011-04-01

    The data in this report are taken from Natural Marketing Institute's (NMI's) Lifestyles of Health and Sustainability Consumer Trends Database. Created in 2002, the syndicated consumer database contains responses from 2,000 to 4,000 nationally representative U.S. adults (meaning the demographics of the sample are consistent with U.S. Census findings) each year. NMI used the database to analyze consumer attitudes and behavior related to renewable energy and to update previously conducted related research. Specifically, this report will explore consumer awareness, concerns, perceived benefits, knowledge of purchase options, and usage of renewable energy as well as provide regional comparisons and trends over time.

  10. Consumer Attitudes About Renewable Energy: Trends and Regional Differences

    Energy Technology Data Exchange (ETDEWEB)

    Natural Marketing Institute, Harleysville, Pennsylvania

    2011-04-01

    The data in this report are taken from Natural Marketing Institute's (NMI's) Lifestyles of Health and Sustainability Consumer Trends Database. Created in 2002, the syndicated consumer database contains responses from 2,000 to 4,000 nationally representative U.S. adults (meaning the demographics of the sample are consistent with U.S. Census findings) each year. NMI used the database to analyze consumer attitudes and behavior related to renewable energy and to update previously conducted related research. Specifically, this report will explore consumer awareness, concerns, perceived benefits, knowledge of purchase options, and usage of renewable energy as well as provide regional comparisons and trends over time.

  11. Data-base system for northern Midwest regional aquifer-system analysis

    Science.gov (United States)

    Kontis, A.L.; Mandle, Richard J.

    1980-01-01

    The U.S. Geological Survey is conducting a study of the Cambrian and Ordovician aquifer system of the northern Midwest as part of a national series of Regional Aquifer-Systems Analysis (RASA). An integral part of this study will be a simulation of the ground-water flow regime using the Geological Survey's three-dimensional finite-difference model. The first step in the modeling effort is the design and development of a systematic set of processes to facilitate the collection, evaluation, manipulation, and use of large quantities of information. A computerized data-base system to accomplish these goals has been completed for the northern Midwest RASA.

  12. Search extension transforms Wiki into a relational system: a case for flavonoid metabolite database.

    Science.gov (United States)

    Arita, Masanori; Suwa, Kazuhiro

    2008-09-17

    In computer science, database systems are based on the relational model founded by Edgar Codd in 1970. On the other hand, in the area of biology the word 'database' often refers to loosely formatted, very large text files. Although such bio-databases may describe conflicts or ambiguities (e.g. reports that a protein pair does and does not interact, or unknown parameters) in a positive sense, the flexibility of the data format sacrifices a systematic query mechanism equivalent to the widely used SQL. To overcome this disadvantage, we propose embeddable string-search commands on a Wiki-based system and designed a half-formatted database. As proof of principle, a database of flavonoids with 6902 molecular structures from over 1687 plant species was implemented on MediaWiki, the background system of Wikipedia. Registered users can describe any information in an arbitrary format. The structured part is subject to text-string searches to realize relational operations. The system was written in the PHP language as an extension of MediaWiki. All modifications are open-source and publicly available. This scheme benefits from both the free-formatted Wiki style and the concise and structured relational-database style. MediaWiki supports multi-user environments for document management, and the cost of database maintenance is alleviated.

  13. Relational Databases and Biomedical Big Data.

    Science.gov (United States)

    de Silva, N H Nisansa D

    2017-01-01

    In various biomedical applications that collect, handle, and manipulate data, the amounts of data tend to build up and venture into the range identified as big data. In such cases, a design decision has to be made as to what type of database should be used to handle the data. More often than not, the default and classical solution in the biomedical domain, according to past research, is relational databases. While this was the norm for a long time, there is an evident trend away from relational databases in favor of other types and paradigms of databases. However, it remains of paramount importance to understand the interrelation that exists between biomedical big data and relational databases. This chapter reviews the pros and cons of using relational databases to store biomedical big data that previous research has discussed and used.

  14. The relational clinical database: a possible solution to the star wars in registry systems.

    Science.gov (United States)

    Michels, D K; Zamieroski, M

    1990-12-01

    In summary, having data from other service areas available in a relational clinical database could resolve many of the problems existing in today's registry systems. Uniting sophisticated information systems into a centralized database system could definitely be a corporate asset in managing the bottom line.

  15. System of end-to-end symmetric database encryption

    Science.gov (United States)

    Galushka, V. V.; Aydinyan, A. R.; Tsvetkova, O. L.; Fathi, V. A.; Fathi, D. V.

    2018-05-01

    The article is devoted to the pressing problem of protecting databases from information leakage that occurs when access control mechanisms are bypassed. To solve this problem, it is proposed to use end-to-end data encryption, implemented at the end nodes of the interaction between the information system components using one of the symmetric cryptographic algorithms. For this purpose, a key management method has been developed and described; it is designed for use in a multi-user system and is based on a distributed key representation in which part of the key is stored in the database and the other part is obtained by transforming the user's password. In this case, the key is computed immediately before the cryptographic transformations and is not kept in memory after these transformations are completed. Algorithms for registering and authorizing a user, as well as for changing his password, are described, and methods for computing the parts of the key during these operations are provided.
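
    The sketch below illustrates the key-splitting idea described above, not the paper's exact scheme: one key share is stored in the database, the other is derived from the user's password, and the working key is assembled only at the moment of the cryptographic operation. Only the Python standard library is used; the KDF parameters and share sizes are illustrative assumptions.

        import hashlib, hmac, secrets

        def register():
            """Create the per-user material kept server-side: a salt and a key share."""
            return {"salt": secrets.token_bytes(16),
                    "db_share": secrets.token_bytes(32)}

        def derive_key(password: str, record: dict) -> bytes:
            """Recompute the working key just before encrypt/decrypt; never store it."""
            pw_share = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                           record["salt"], 200_000, dklen=32)
            # combine the two shares; HMAC acts as a simple key-combining function here
            return hmac.new(record["db_share"], pw_share, "sha256").digest()

        rec = register()
        key = derive_key("correct horse battery staple", rec)   # pass on to AES-GCM etc.
        print(len(key), "byte working key derived on demand")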

  16. The magnet database system

    International Nuclear Information System (INIS)

    Baggett, P.; Delagi, N.; Leedy, R.; Marshall, W.; Robinson, S.L.; Tompkins, J.C.

    1991-01-01

    This paper describes the current status of MagCom, a central database of SSC magnet information that is available to all magnet scientists via network connections. The database has been designed to contain the specifications and measured values of important properties for major materials, plus configuration information (specifying which individual items were used in each cable, coil, and magnet) and the test results on completed magnets. These data will help magnet scientists to track and control the production process and to correlate the performance of magnets with the properties of their constituents

  17. Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App

    Science.gov (United States)

    Nurnawati, E. K.; Ermawati, E.

    2018-02-01

    An integration database is a database which acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based platform for a smart city system. The resulting database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes that can be shared by the various applications to be built. The method used in this study is the choice of an appropriate database logical structure (pattern of data) and the building of relational database models (database design). The resulting design is tested with prototype apps, and system performance is analyzed with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. The Android-based app is built on a dynamic client-server model where data are extracted from an external MySQL database, so if the data in the database change, the data in the Android application also change. The Android app assists users in searching for information related to Yogyakarta (as a smart city), especially on culture, government, hotels, and transportation.

  18. CardioTF, a database of deconstructing transcriptional circuits in the heart system.

    Science.gov (United States)

    Zhen, Yisong

    2016-01-01

    Information on cardiovascular gene transcription is fragmented and far behind the present requirements of the systems biology field. To create a comprehensive source of data for cardiovascular gene regulation and to facilitate a deeper understanding of genomic data, the CardioTF database was constructed. The purpose of this database is to collate information on cardiovascular transcription factors (TFs), position weight matrices (PWMs), and enhancer sequences discovered using the ChIP-seq method. The Naïve-Bayes algorithm was used to classify literature and identify all PubMed abstracts on cardiovascular development. The natural language learning tool GNAT was then used to identify corresponding gene names embedded within these abstracts. Local Perl scripts were used to integrate and dump data from public databases into the MariaDB management system (MySQL). In-house R scripts were written to analyze and visualize the results. Known cardiovascular TFs from humans and human homologs from fly, Ciona, zebrafish, frog, chicken, and mouse were identified and deposited in the database. PWMs from Jaspar, hPDI, and UniPROBE databases were deposited in the database and can be retrieved using their corresponding TF names. Gene enhancer regions from various sources of ChIP-seq data were deposited into the database and were able to be visualized by graphical output. Besides biocuration, mouse homologs of the 81 core cardiac TFs were selected using a Naïve-Bayes approach and then by intersecting four independent data sources: RNA profiling, expert annotation, PubMed abstracts and phenotype. The CardioTF database can be used as a portal to construct transcriptional network of cardiac development. Database URL: http://www.cardiosignal.org/database/cardiotf.html.

  19. Studies on preparation of the database system for clinical records of atomic bomb survivors

    International Nuclear Information System (INIS)

    Nakamura, Tsuyoshi

    1981-01-01

    Construction of the database system aimed at multipurpose application of data on clinical medicine was studied through the preparation of database system for clinical records of atomic bomb survivors. The present database includes the data about 110,000 atomic bomb survivors in Nagasaki City. This study detailed: (1) Analysis of errors occurring in a period from generation of data in the clinical field to input into the database, and discovery of a highly precise, effective method of input. (2) Development of a multipurpose program for uniform processing of data on physical examinations from many organizations. (3) Development of a record linkage method for voluminous files which are essential in the construction of a large-scale medical information system. (4) A database model suitable for clinical research and a method for designing a segment suitable for physical examination data. (Chiba, N.)

  20. Development of the plasma movie database system in JT-60

    International Nuclear Information System (INIS)

    Sueoka, Michiharu; Kawamata, Yoichi; Kurihara, Kenichi; Seki, Akiyuki

    2008-03-01

    A plasma movie is generally expected to be one of the most efficient ways to know what plasma discharge has been conducted in an experiment. The JT-60 plasma movie is composed of a video camera picture of the plasma, a computer graphics (CG) picture, and a magnetic probe signal as a sound channel. In order to use these movies efficiently, we have developed a new system with the following functions: (a) storing a plasma movie in the movie database system automatically, combined with the plasma shape CG and the sound, according to the discharge sequence; (b) making plasma movies available (downloadable) for experiment data analyses at the Web site. In particular, this system aimed at minimizing the development cost, and the real-time plasma shape visualization system (RVS) was developed without any operating system (OS) customized for real-time use. As a result, the system succeeded in working under Windows XP. This report deals with the technical details of the plasma movie database system and the real-time plasma shape visualization system. (author)

  1. Identification of contaminant trends and data gaps for terrestrial vertebrates residing in northeastern estuaries of the United States

    Science.gov (United States)

    Rattner, B.A.; Pearson, J.L.; Golden, N.H.; Erwin, R.M.; Ottinger, M.A.

    1998-01-01

    The Biomonitoring of Environmental Status and Trends (BEST) program of the Department of the Interior is focused to identify and understand effects of contaminant stressors on biological resources under their stewardship. One BEST program activity involves evaluation of retrospective data to assess and predict the condition of biota in Atlantic coast estuaries. A 'Contaminant Exposure and Effects--Terrestrial Vertebrates' database (CEE-TV) has been compiled through computerized literature searches of Fish and Wildlife Reviews, BIOSIS, AGRICOLA, and TOXLINE, review of existing databases (e.g., US EPA Ecological Incident Information System, USGS Diagnostic and Epizootic Databases), and solicitation of unpublished reports from conservation agencies, private groups, and universities. Summary information has been entered into the CEE-TV database, including species, collection date (1965-present), site coordinates, sample matrix, contaminant concentrations, biomarker and bioindicator responses, and reference source, utilizing a 96-field dBase format. Currently, the CEE-TV database contains 3500 georeferenced records representing >200 vertebrate species and > 100,000 individuals residing in estuaries from Maine through Florida. This relational database can be directly queried, imported into the ARC/INFO geographic information system (GIS) to examine spatial tendencies, and used to identify 'hot-spots', generate hypotheses, and focus ecotoxicological assessments. An overview of temporal, phylogenetic, and geographic contaminant exposure and effects information, trends, and data gaps will be presented for terrestrial vertebrates residing in estuaries in the northeast United States.

  2. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    Science.gov (United States)

    Freeman, Carla; And Others

    In order to understand how the database software or online database functioned in the overall curricula, the use of database management (DBMs) systems was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  3. Validation of risk-based performance indicators: Safety system function trends

    International Nuclear Information System (INIS)

    Boccio, J.L.; Vesely, W.E.; Azarm, M.A.; Carbonaro, J.F.; Usher, J.L.; Oden, N.

    1989-10-01

    This report describes and applies a process for validating a model for a risk-based performance indicator. The purpose of the risk-based indicator evaluated, Safety System Function Trend (SSFT), is to monitor the unavailability of selected safety systems. Interim validation of this indicator is based on three aspects: a theoretical basis, an empirical basis relying on statistical correlations, and case studies employing 25 plant years of historical data collected from five plants for a number of safety systems. Results using the SSFT model are encouraging. Application of the model through case studies dealing with the performance of important safety systems shows that statistically significant trends in, and levels of, system performance can be discerned which thereby can provide leading indications of degrading and/or improving performances. Methods for developing system performance tolerance bounds are discussed and applied to aid in the interpretation of the trends in this risk-based indicator. Some additional characteristics of the SSFT indicator, learned through the data-collection efforts and subsequent data analyses performed, are also discussed. The usefulness and practicality of other data sources for validation purposes are explored. Further validation of this indicator is noted. Also, additional research is underway in developing a more detailed estimator of system unavailability. 9 refs., 18 figs., 5 tabs
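
    The snippet below is a deliberately simplified sketch of a safety-system-function-trend style indicator, not the report's actual SSFT model: quarterly unavailability is estimated as downtime hours divided by total hours, a least-squares line gives the trend, and a two-sigma band around the fit serves as a crude tolerance bound. The downtime figures are invented.

        import numpy as np

        downtime_h = np.array([30.0, 26.0, 34.0, 22.0, 19.0, 17.0, 15.0, 12.0])
        hours_per_quarter = 2190.0                       # roughly three months
        unavail = downtime_h / hours_per_quarter         # quarterly unavailability

        t = np.arange(len(unavail))
        slope, intercept = np.polyfit(t, unavail, 1)     # linear trend
        fit = slope * t + intercept
        band = 2.0 * np.std(unavail - fit)               # crude tolerance band

        print(f"trend slope: {slope:+.2e} per quarter "
              f"({'improving' if slope < 0 else 'degrading'})")
        print(f"latest unavailability {unavail[-1]:.4f}, "
              f"tolerance band ±{band:.4f} about the trend line")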

  4. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe impact on southern Taiwan awakened public awareness of large scale landslide disasters. Large scale landslides produce large quantities of sediment, which degrade the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and videos not only help people make appropriate decisions, but their processing and value-adding are also a major concern. The study defined basic data formats and standards from the various types of data collected about these reservoirs and then provided a management platform based on these formats and standards. Meanwhile, for practicality and convenience, the large scale landslide disaster database system is built to both provide and receive information, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may be out of date at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard and uses responsive web design, so that users can easily operate and further develop this large scale landslide disaster database system.

  5. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program which performs obstetric calculations, written in the Clipper language and using data from ultrasonography, was developed for personal computers. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance and proved very useful in patient management, with its convenient data filing, easy retrieval of previous reports, prompt yet accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women.

  6. The use of database management systems in particle physics

    CERN Document Server

    Stevens, P H; Read, B J; Rittenberg, Alan

    1979-01-01

    Examines data-handling needs and problems in particle physics and looks at three very different efforts at resolving these problems by the Particle Data Group (PDG), the CERN-HERA Group in Geneva, and groups cooperating with ZAED in Germany. The ZAED effort does not use a database management system (DBMS), the CERN-HERA Group uses an existing, limited-capability DBMS, and PDG uses the Berkeley Database Management System (BDMS), which PDG itself designed and implemented with scientific data-handling needs in mind. The range of problems each group tried to resolve was influenced by whether or not a DBMS was available and by what capabilities it had. Only PDG has been able to systematically address all the problems. The authors discuss the BDMS-centered system PDG is now building in some detail. (12 refs).

  7. LINGUISTIC DATABASE FOR AUTOMATIC GENERATION SYSTEM OF ENGLISH ADVERTISING TEXTS

    Directory of Open Access Journals (Sweden)

    N. A. Metlitskaya

    2017-01-01

    Full Text Available The article deals with the linguistic database for a system of automatic generation of English advertising texts on cosmetics and perfumery. The database for such a system includes two main blocks: an automatic dictionary (which contains semantic and morphological information for each word) and semantic-syntactical formulas of the texts in a special formal language, SEMSINT. The database is built on the results of the analysis of 30 English advertising texts on cosmetics and perfumery. First, each word was given a unique code. For example, N stands for nouns, A for adjectives, V for verbs, etc. Then all the lexicon of the analyzed texts was distributed into different semantic categories. According to this semantic classification, each word was given a special semantic code. For example, the record N01 attributed to the word «lip» in the dictionary means that this word refers to nouns of the semantic category «part of a human's body». The second block of the database includes the semantic-syntactical formulas of the analyzed advertising texts written in the special formal language SEMSINT. The author gives a brief description of this language, presenting its essence and structure. Also, an example of one formalized advertising text in SEMSINT is provided.
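
    A small sketch of the "automatic dictionary" block described above. Only the example code N01 for «lip» ("part of a human's body") comes from the abstract; the other entries, category labels, and the lookup helper are invented for illustration.

        CATEGORIES = {
            "N01": ("noun", "part of a human's body"),
            "N02": ("noun", "cosmetic product"),        # hypothetical
            "A01": ("adjective", "sensory quality"),    # hypothetical
        }

        DICTIONARY = {
            "lip":      {"code": "N01", "plural": "lips"},
            "lipstick": {"code": "N02", "plural": "lipsticks"},      # hypothetical
            "smooth":   {"code": "A01", "comparative": "smoother"},  # hypothetical
        }

        def describe(word):
            entry = DICTIONARY[word.lower()]
            pos, category = CATEGORIES[entry["code"]]
            return f"{word}: {pos}, semantic category '{category}', morphology {entry}"

        print(describe("lip"))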

  8. NVST Data Archiving System Based On FastBit NoSQL Database

    Science.gov (United States)

    Liu, Ying-bo; Wang, Feng; Ji, Kai-fan; Deng, Hui; Dai, Wei; Liang, Bo

    2014-06-01

    The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high-resolution imaging and spectral observations, including measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012 and produces a maximum of 120 thousand observational files in a day. Given the large number of files, effective archiving and retrieval of files becomes a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the FastBit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (i.e., MySQL; My Structured Query Language), the FastBit database shows distinct advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the FastBit database is about 15 times faster and fully meets the requirements of the NVST. Our study brings a new idea for massive astronomical data archiving and will contribute to the design of data management systems for other astronomical telescopes.

  9. SOMA: A Proposed Framework for Trend Mining in Large UK Diabetic Retinopathy Temporal Databases

    Science.gov (United States)

    Somaraki, Vassiliki; Harding, Simon; Broadbent, Deborah; Coenen, Frans

    In this paper, we present SOMA, a new trend mining framework; and Aretaeus, the associated trend mining algorithm. The proposed framework is able to detect different kinds of trends within longitudinal datasets. The prototype trends are defined mathematically so that they can be mapped onto the temporal patterns. Trends are defined and generated in terms of the frequency of occurrence of pattern changes over time. To evaluate the proposed framework the process was applied to a large collection of medical records, forming part of the diabetic retinopathy screening programme at the Royal Liverpool University Hospital.
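
    The snippet below is a hedged sketch of the trend notion described above, not the Aretaeus algorithm: for one pattern, it counts how often the pattern occurs in each time window of a longitudinal dataset and labels the resulting frequency sequence as increasing, decreasing, or mixed. The records and the pattern are invented.

        from collections import Counter

        # each record: (time_window, frozenset of attribute=value items)
        records = [
            (0, frozenset({"maculopathy=yes", "vision=ok"})),
            (0, frozenset({"maculopathy=no", "vision=ok"})),
            (1, frozenset({"maculopathy=yes", "vision=reduced"})),
            (1, frozenset({"maculopathy=yes", "vision=ok"})),
            (2, frozenset({"maculopathy=yes", "vision=reduced"})),
            (2, frozenset({"maculopathy=yes", "vision=ok"})),
            (2, frozenset({"maculopathy=yes", "vision=reduced"})),
        ]

        def frequency_trend(pattern, records, n_windows):
            counts = Counter(t for t, items in records if pattern <= items)
            freqs = [counts.get(t, 0) for t in range(n_windows)]
            if all(a <= b for a, b in zip(freqs, freqs[1:])):
                kind = "increasing"
            elif all(a >= b for a, b in zip(freqs, freqs[1:])):
                kind = "decreasing"
            else:
                kind = "mixed"
            return freqs, kind

        print(frequency_trend(frozenset({"maculopathy=yes"}), records, 3))  # ([1, 2, 3], 'increasing')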

  10. Received Signal Strength Database Interpolation by Kriging for a Wi-Fi Indoor Positioning System.

    Science.gov (United States)

    Jan, Shau-Shiun; Yeh, Shuo-Ju; Liu, Ya-Wen

    2015-08-28

    The main approach for a Wi-Fi indoor positioning system is based on received signal strength (RSS) measurements, and the fingerprinting method is used to determine the user position by matching the RSS values with a pre-surveyed RSS database. Building an RSS fingerprint database is essential for an RSS-based indoor positioning system, but it requires a great deal of time and effort, and the labor increases as the indoor environment becomes larger. To provide better indoor positioning services while reducing the labor required to establish the positioning system, an indoor positioning system with an appropriate spatial interpolation method is needed. In addition, an advantage of the RSS approach is that the signal strength decays as the transmission distance increases, and this signal propagation characteristic is applied to an interpolated database with the Kriging algorithm in this paper. Using the distribution of reference points (RPs) at measured points, the signal propagation model of a Wi-Fi access point (AP) in the building can be built and expressed as a function. This function, as the spatial structure of the environment, can be used to create the RSS database quickly in different indoor environments. Thus, in this paper, a Wi-Fi indoor positioning system based on the Kriging fingerprinting method is developed. As shown in the experimental results, with a 72.2% probability, the error of the extended RSS database with Kriging is less than 3 dBm compared to the surveyed RSS database. Importantly, the positioning error of the developed Wi-Fi indoor positioning system with Kriging is reduced by 17.9% on average compared to the system without Kriging.
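
    A minimal ordinary-kriging sketch (numpy only) of the interpolation step described above: RSS values measured at a few reference points are interpolated to an unsurveyed grid point using an exponential variogram. The coordinates, RSS values, and variogram parameters are invented, not the paper's survey data.

        import numpy as np

        def gamma(h, sill=25.0, rng=8.0):            # exponential variogram model
            return sill * (1.0 - np.exp(-h / rng))

        def krige(points, values, target):
            """Ordinary kriging estimate of the value at `target`."""
            n = len(points)
            d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
            A = np.ones((n + 1, n + 1))
            A[:n, :n] = gamma(d)
            A[n, n] = 0.0                            # Lagrange multiplier row/column
            b = np.ones(n + 1)
            b[:n] = gamma(np.linalg.norm(points - target, axis=1))
            w = np.linalg.solve(A, b)
            return float(w[:n] @ values)

        rp_xy  = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        rp_rss = np.array([-45.0, -60.0, -58.0, -70.0])          # dBm at reference points
        print(f"interpolated RSS at (4, 3): {krige(rp_xy, rp_rss, np.array([4.0, 3.0])):.1f} dBm")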

  11. 9th Asian Conference on Intelligent Information and Database Systems

    CERN Document Server

    Nguyen, Ngoc; Shirai, Kiyoaki

    2017-01-01

    This book presents recent research in intelligent information and database systems. The carefully selected contributions were initially accepted for presentation as posters at the 9th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2017), held in April 2017 in Kanazawa, Japan. While the contributions are of an advanced scientific level, several are accessible for non-expert readers. The book brings together 47 chapters divided into six main parts: • Part I. From Machine Learning to Data Mining, • Part II. Big Data and Collaborative Decision Support Systems, • Part III. Computer Vision Analysis, Detection, Tracking and Recognition, • Part IV. Data-Intensive Text Processing, • Part V. Innovations in Web and Internet Technologies, and • Part VI. New Methods and Applications in Information and Software Engineering. The book is an excellent resource for researchers and those working in algorithmics, artificial and computational intelligence, collaborative systems, decisio...

  12. A survey of the use of database management systems in accelerator projects

    CERN Document Server

    Poole, John

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accelerator projects and what they are being used for. Initially introduced to handle equipment builders' data, commercial DBMS are now being used in almost all areas of accelerators from on-line control to personnel data. A variety of commercial systems are being used in conjunction with a diverse selection of application software for data maintenance/manipulation and controls. This paper reviews the database activities known to IADBG.

  13. PNG Education System: Equity Trends and Comparisons.

    Science.gov (United States)

    Sheret, Michael

    This paper identifies and discusses inequities in the educational system of Papua New Guinea (PNG). It begins by explaining the use of the Gini coefficient as an equity index, and then discusses inequities and equity trends in four concern areas: geographic distribution of formal education between provinces; educational achievement; distribution…
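
    As a worked sketch of the Gini coefficient used as the equity index in this paper: 0 means a resource (e.g. school places per province) is distributed perfectly evenly, while values near 1 mean it is concentrated in a few provinces. The provincial figures below are invented for illustration.

        def gini(x):
            """Gini coefficient via the sorted mean-difference formula."""
            x = sorted(x)
            n = len(x)
            total = sum(x)
            # G = sum_i (2i - n - 1) * x_i / (n * sum(x)) for sorted x (1-based i)
            return sum((2 * (i + 1) - n - 1) * xi for i, xi in enumerate(x)) / (n * total)

        enrolment_per_1000 = [40, 55, 60, 75, 90, 120, 180]   # hypothetical provinces
        print(f"Gini coefficient: {gini(enrolment_per_1000):.3f}")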

  14. A database system for the management of severe accident risk information, SARD

    International Nuclear Information System (INIS)

    Ahn, K. I.; Kim, D. H.

    2003-01-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at Korea Atomic Energy Research Institute for automatic management and search of the severe accident risk information. Main functions of the present database system are implemented by three closely related, but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of the plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from the representative severe accident codes (e.g., base case and sensitivity analysis results, and summary for key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used for the support of the Level 2 PSA of similar plants and for the development of plant-specific severe accident management strategies

  15. A database system for the management of severe accident risk information, SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, K. I.; Kim, D. H. [KAERI, Taejon (Korea, Republic of)

    2003-10-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at Korea Atomic Energy Research Institute for automatic management and search of the severe accident risk information. Main functions of the present database system are implemented by three closely related, but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of the plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from the representative severe accident codes (e.g., base case and sensitivity analysis results, and summary for key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used for the support of the Level 2 PSA of similar plants and for the development of plant-specific severe accident management strategies.

  16. Distributed Pseudo-Random Number Generation and Its Application to Cloud Database

    OpenAIRE

    Chen, Jiageng; Miyaji, Atsuko; Su, Chunhua

    2014-01-01

    Cloud databases are now a rapidly growing trend in the cloud computing market. They enable clients to run their computations on outsourced databases or to access distributed database services in the cloud. At the same time, security and privacy concerns are a major challenge for cloud databases to continue growing. To enhance the security and privacy of cloud database technology, pseudo-random number generation (PRNG) plays an important role in data encryption and privacy-pr...

  17. The Ross Operation in Children and Young Adults: 12-Year Results and Trends From the UK National Database.

    Science.gov (United States)

    Zebele, Carlo; Chivasso, Pierpaolo; Sedmakov, Christo; Angelini, Gianni; Caputo, Massimo; Parry, Andrew; Stoica, Serban

    2014-07-01

    To determine UK national trends and results of the Ross operation in relation to all aortic valve interventions. Examination of the UK Congenital Central Cardiac Audit Database for all aortic valve procedures performed between 2000 and 2011 in children (0-16 years) and young adults (16-30 years). A total of 2,206 aortic valve procedures were performed in children and 1,824 in young adults, the proportions in the two groups being: Ross operation (19% vs 15%, respectively), surgical valvoplasty (9.5% vs 4%), surgical valvotomy (9.5% vs 1%), aortic valve replacement (AVR; 11% vs 55%), aortic root replacement (4% vs 18%), and balloon valvoplasty (47% vs 7%). The 30-day and 1-year survival rates after the Ross operation are 99.3% and 98.7%, respectively, reaching 100% in the last four years. In children, the proportion of balloon valvoplasty increased from an average of 43% in 2000 to 2006 to 53% in 2007 to 2011, whereas the proportion of Ross operations decreased from 22% to 16%, with correspondingly fewer Ross operations performed. The year-on-year changes show a significant decreasing trend locally and nationally. Despite an excellent track record, the Ross operation is performed less frequently in the United Kingdom. This report is a first step in comparing treatment modalities at national level. © The Author(s) 2014.

  18. ASEAN Mineral Database and Information System (AMDIS)

    Science.gov (United States)

    Okubo, Y.; Ohno, T.; Bandibas, J. C.; Wakita, K.; Oki, Y.; Takahashi, Y.

    2014-12-01

    AMDIS was officially launched at the Fourth ASEAN Ministerial Meeting on Minerals on 28 November 2013. In cooperation with the Geological Survey of Japan, the web-based GIS was developed using Free and Open Source Software (FOSS) and the Open Geospatial Consortium (OGC) standards. The system is composed of the local databases and the centralized GIS. The local databases created and updated using the centralized GIS are accessible from the portal site. The system introduces distinct advantages over traditional GIS: a global reach, a large number of users, better cross-platform capability, no charge for users or providers, ease of use, and unified updates. By raising the transparency of mineral information to mining companies and to the public, AMDIS shows that mineral resources are abundant throughout the ASEAN region; however, there are many data gaps. We understand that such problems occur because of insufficient governance of mineral resources. The mineral governance we refer to is a concept that enforces and maximizes the capacity and systems of the government institutions that manage the minerals sector. The elements of mineral governance include a) strengthening of the information infrastructure, b) technological and legal capacities of state-owned mining companies to fully engage with mining sponsors, c) government-led management of mining projects by supporting the project implementation units, d) government capacity in mineral management such as the control and monitoring of mining operations, and e) facilitation of regional and local development plans and their implementation with the private sector.

  19. Database Description - SSBD | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available General information of database. Database name: SSBD. Contact: Shuichi Onami, RIKEN Quantitative Biology Center, 2-2-3 Minatojima-minamimachi, Chuo-ku, Kobe 650-0047, Japan. Database classification: Other Molecular Biology Databases; dynamic database. Taxonomy: Caenorhabditis elegans (Taxonomy ID: 6239), Escherichia coli (Taxonomy ID: 562). Reference journal: Bioinformatics, April 2015, Volume 31, Issue 7. External links: original website information.

  20. A dedicated database system for handling multi-level data in systems biology

    OpenAIRE

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Background Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging...

  1. Expert system for quality control in bibliographic databases

    International Nuclear Information System (INIS)

    Todeschini, C.; Farrell, M.P.

    1989-01-01

    An Expert System is presented that can identify errors in the intellectual decisions made by indexers when categorizing documents into an a priori category scheme. The system requires the compilation of a Knowledge Base that incorporates, in statistical form, the decisions on the linking of indexing and categorization derived from a preceding period of the bibliographic database. New input entering the database is checked against the Knowledge Base, using the descriptor indexing assigned to each record, and the system computes a value for the match of each record with the particular category chosen by the indexer. This category match value is used as a criterion for identifying those documents that have been erroneously categorized. The system was tested on a large sample of almost 26,000 documents, representing all the literature falling into ten of the subject categories of the Energy Data Base during the five-year period 1980-1984. For valid comparisons among categories, the Knowledge Base must be constructed with an approximately equal number of unique descriptors for each subject category. The system identified those items with a high probability of having been erroneously categorized. These items, constituting up to 5% of the sample, were evaluated manually by subject specialists for correct categorization and then compared with the results of the Expert System. Of those pieces of literature deemed by the system to be erroneously categorized, about 75% did indeed belong to a different category. This percentage, however, is dependent on the level at which the threshold on the category match value is set. With a lower threshold value, the percentage can be raised to 90%, but this is accompanied by a lowering of the absolute number of wrongly categorized records caught by the system. The Expert System can be considered a first step toward a complete semiautomatic categorization system
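
    The category-match computation described above can be sketched as follows. The scoring rule (average relative frequency of the record's descriptors within the chosen category) and the threshold value are assumptions for illustration; the abstract does not state the exact statistic stored in the Knowledge Base.

```python
# Hypothetical Knowledge Base: for each category, how often each descriptor
# co-occurred with that category in the reference period (relative frequencies).
knowledge_base = {
    "REACTOR SAFETY": {"pressure vessels": 0.12, "loss of coolant": 0.30, "risk assessment": 0.08},
    "SOLAR ENERGY":   {"photovoltaic cells": 0.40, "collectors": 0.25, "risk assessment": 0.01},
}

def category_match(descriptors, category, kb=knowledge_base):
    """Score how well a record's descriptors fit the indexer's chosen category."""
    weights = kb.get(category, {})
    if not descriptors:
        return 0.0
    return sum(weights.get(d, 0.0) for d in descriptors) / len(descriptors)

record = ["loss of coolant", "risk assessment"]
score = category_match(record, "SOLAR ENERGY")
THRESHOLD = 0.05  # records scoring below the threshold are flagged for manual review
print(score, "flag for review" if score < THRESHOLD else "ok")
```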

  2. ISOE: an international occupational exposure database and communications network for dose optimisation

    International Nuclear Information System (INIS)

    Robinson, I.F.; Lazo, E.

    1995-01-01

    The Information System on Occupational Exposure (ISOE) was launched by the Nuclear Energy Agency (NEA) of the Organisation for Economic Co-operation and Development (OECD) on 1 January 1992 to facilitate the communication of dosimetry and ALARA implementation data among nuclear facilities around the world. Members of ISOE include 51 utilities from 17 countries and regulators from 11 countries, with four regional technical centres administering the system and a Steering Group which manages the work. ISOE includes three databases and a communications network at several levels. The three databases, NEA1, NEA2 and NEA3, include varying levels of detail, with NEA3 being the most detailed, giving task- and site-specific ALARA practices and experiences. Utility membership of ISOE gives full access to the databases, whereas regulators have more limited access. This paper reviews the current status of participation and describes the three databases and the communications network. Some dose data showing trends in particular countries are presented, as well as dose data relating to operation cycle length and outage length. The advantages of membership are described, and it is concluded that ISOE holds the potential for both dose and cost savings. (author)

  3. AtlasT4SS: a curated database for type IV secretion systems.

    Science.gov (United States)

    Souza, Rangel C; del Rosario Quispe Saji, Guadalupe; Costa, Maiana O C; Netto, Diogo S; Lima, Nicholas C B; Klein, Cecília C; Vasconcelos, Ana Tereza R; Nicolás, Marisa F

    2012-08-09

    The type IV secretion system (T4SS) can be classified as a large family of macromolecule transporter systems, divided into three recognized sub-families according to their functions. The major sub-family is the conjugation system, which allows transfer of genetic material, such as a nucleoprotein, via cell contact among bacteria. The conjugation system can also transfer genetic material from bacteria to eukaryotic cells; such is the case with the T-DNA transfer of Agrobacterium tumefaciens to host plant cells. The system of effector protein transport constitutes the second sub-family, and the third one corresponds to the DNA uptake/release system. Genome analyses have revealed numerous T4SSs in Bacteria and Archaea. The purpose of this work was to organize, classify, and integrate the T4SS data into a single database, called AtlasT4SS - the first public database devoted exclusively to this prokaryotic secretion system. AtlasT4SS is a manually curated database that describes a large number of proteins related to the type IV secretion system reported so far in Gram-negative and Gram-positive bacteria, as well as in Archaea. The database was created using the RDBMS MySQL and the Catalyst Framework, based on the Perl programming language and using the Model-View-Controller (MVC) design pattern for the Web. The current version holds a comprehensive collection of 1,617 T4SS proteins from 58 Bacteria (49 Gram-negative and 9 Gram-positive), one archaeon and 11 plasmids. By applying the bi-directional best hit (BBH) relationship in pairwise genome comparison, it was possible to obtain a core set of 134 clusters of orthologous genes encoding T4SS proteins. In our database we present one way of classifying orthologous groups of T4SSs in a hierarchical classification scheme with three levels. The first level comprises four classes that are based on the organization of genetic determinants, shared homologies, and evolutionary relationships: (i) F-T4SS, (ii) P-T4SS, (iii
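
    The bi-directional best hit (BBH) criterion used to build the 134 orthologous clusters can be illustrated with a small sketch; the similarity scores below are toy stand-ins for the sequence-comparison scores (e.g. BLAST bit scores) an actual pairwise genome comparison would produce.

```python
# Toy similarity scores between proteins of genome A and genome B
# (stand-ins for e.g. BLAST bit scores from a pairwise genome comparison).
scores = {
    ("A_virB4", "B_trwK"): 850.0, ("A_virB4", "B_trwJ"): 120.0,
    ("A_virD4", "B_trwB"): 640.0, ("A_virD4", "B_trwK"): 90.0,
}

def best_hit(query, other_genome_prefix):
    """Best-scoring partner of `query` in the other genome."""
    hits = [(t, s) for (q, t), s in scores.items()
            if q == query and t.startswith(other_genome_prefix)]
    hits += [(q, s) for (q, t), s in scores.items()
             if t == query and q.startswith(other_genome_prefix)]
    return max(hits, key=lambda x: x[1])[0] if hits else None

def is_bbh(a, b):
    """a and b are bi-directional best hits if each is the other's best hit."""
    return best_hit(a, "B_") == b and best_hit(b, "A_") == a

print(is_bbh("A_virB4", "B_trwK"))  # True in this toy example
```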

  4. 8th Asian Conference on Intelligent Information and Database Systems

    CERN Document Server

    Madeyski, Lech; Nguyen, Ngoc

    2016-01-01

    The objective of this book is to contribute to the development of the intelligent information and database systems with the essentials of current knowledge, experience and know-how. The book contains a selection of 40 chapters based on original research presented as posters during the 8th Asian Conference on Intelligent Information and Database Systems (ACIIDS 2016) held on 14–16 March 2016 in Da Nang, Vietnam. The papers to some extent reflect the achievements of scientific teams from 17 countries in five continents. The volume is divided into six parts: (a) Computational Intelligence in Data Mining and Machine Learning, (b) Ontologies, Social Networks and Recommendation Systems, (c) Web Services, Cloud Computing, Security and Intelligent Internet Systems, (d) Knowledge Management and Language Processing, (e) Image, Video, Motion Analysis and Recognition, and (f) Advanced Computing Applications and Technologies. The book is an excellent resource for researchers, those working in artificial intelligence, mu...

  5. Performance Assessment of Dynaspeak Speech Recognition System on Inflight Databases

    National Research Council Canada - National Science Library

    Barry, Timothy

    2004-01-01

    .... To aid in the assessment of various commercially available speech recognition systems, several aircraft speech databases have been developed at the Air Force Research Laboratory's Human Effectiveness Directorate...

  6. Current trends on knowledge-based systems

    CERN Document Server

    Valencia-García, Rafael

    2017-01-01

    This book presents innovative and high-quality research on the implementation of conceptual frameworks, strategies, techniques, methodologies, informatics platforms and models for developing advanced knowledge-based systems and their application in different fields, including Agriculture, Education, Automotive, Electrical Industry, Business Services, Food Manufacturing, Energy Services, Medicine and others. Knowledge-based technologies employ artificial intelligence methods to heuristically address problems that cannot be solved by means of formal techniques. These technologies draw on standard and novel approaches from various disciplines within Computer Science, including Knowledge Engineering, Natural Language Processing, Decision Support Systems, Artificial Intelligence, Databases, Software Engineering, etc. As a combination of different fields of Artificial Intelligence, the area of Knowledge-Based Systems applies knowledge representation, case-based reasoning, neural networks, Semantic Web and TICs used...

  7. A survey of the use of database management systems in accelerator projects

    OpenAIRE

    Poole, John; Strubin, Pierre M

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accele...

  8. Contributions to Logical Database Design

    Directory of Open Access Journals (Sweden)

    Vitalie COTELEA

    2012-01-01

    Full Text Available This paper treats the problems arising at the stage of logical database design. It comprises a synthesis of the most common inference models for functional dependencies, deals with the problems of building covers for sets of functional dependencies, synthesizes normal forms, presents trends regarding normalization algorithms and gives their time complexity. In addition, it presents a summary of the best-known key-search algorithms and deals with issues of analysis and testing of relational schemas. It also summarizes and compares different features of the recognition of acyclic database schemas.
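
    A basic building block behind the cover-construction, normalization and key-search algorithms surveyed in the paper is the attribute-closure computation for functional dependencies. The sketch below is the standard textbook algorithm, not code taken from the paper.

```python
def closure(attrs, fds):
    """Compute the closure of a set of attributes under functional dependencies.

    fds is a list of (lhs, rhs) pairs of attribute sets, e.g. ({'A'}, {'B'}).
    """
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs           # the dependency fires: add its right-hand side
                changed = True
    return result

fds = [({"A"}, {"B"}), ({"B"}, {"C"}), ({"C", "D"}, {"E"})]
print(closure({"A"}, fds))       # {'A', 'B', 'C'}
print(closure({"A", "D"}, fds))  # {'A', 'B', 'C', 'D', 'E'} -> A,D determines all attributes
```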

  9. An experimental investigation of masking in the US FDA adverse event reporting system database.

    Science.gov (United States)

    Wang, Hsin-wei; Hochberg, Alan M; Pearson, Ronald K; Hauben, Manfred

    2010-12-01

    adjudication was available from a previous study. The original disproportionality analysis identified 8719 SDRs for the 63 PTs. The SU-based unmasking protocols generated variable numbers of masked SDRs ranging from 38 to 156, representing a 0.43-1.8% increase over the number of baseline SDRs. A significant number of baseline SDRs were also lost in the course of our experiments. The trend in the number of gained SDRs per report removed was inversely related to the number of lost SDRs per protocol. Both the number and nature of the reports removed influenced the number of gained SDRs observed. The purely empirical protocols unmasked up to ten times as many SDRs. None of the masked SDRs had strong external evidence supporting a causal association. Most involved associations for which there was no external supporting evidence or were in the original product label. For two masked SDRs, there was external evidence of a possible causal association. We documented masking in the FDA AERS database. Attempts at unmasking SDRs using practically implementable protocols produced only small changes in the output of SDRs in our analysis. This is undoubtedly related to the large size and diversity of the database, but the complex interdependencies between drugs and events in authentic spontaneous reporting system (SRS) databases, and the impact of measures of statistical variability that are typically used in real-world disproportionality analysis, may be additional factors that constrain the discovery of masked SDRs and which may also operate in pharmaceutical company databases. Empirical determination of the most influential drugs may uncover significantly more SDRs than protocols based on predetermined statistical selection rules but are impractical except possibly for evaluating specific events. Routine global exercises to elicit masking, especially in large health authority databases are not justified based on results available to date. Exercises to elicit unmasking should be driven by

  10. An Implementation of a Database System for Book Loan in an ...

    African Journals Online (AJOL)

    A Case Study of the Polytechnic, Ibadan Library) ... the deletion, updating and query operations. Reports can be generated using report generator incorporated into the system. Key Words: Database, Book, loan, Academic, Library System, File ...

  11. JICST Factual DatabaseJICST Chemical Substance Safety Regulation Database

    Science.gov (United States)

    Abe, Atsushi; Sohma, Tohru

    JICST Chemical Substance Safety Regulation Database is based on the Database of Safety Laws for Chemical Compounds constructed by the Japan Chemical Industry Ecology-Toxicology & Information Center (JETOC), sponsored by the Science and Technology Agency, in 1987. JICST has modified the JETOC database system, added data and started the online service through JOIS-F (JICST Online Information Service-Factual database) in January 1990. The JICST database comprises eighty-three laws and fourteen hundred compounds. The authors outline the database, data items, files and search commands. An example of an online session is presented.

  12. Trends in Data Locality Abstractions for HPC Systems

    KAUST Repository

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten; Shalf, John; Abraham, Mark; Bianco, Mauro; Chamberlain, Bradford L.; Cledat, Romain; Edwards, H. Carter; Finkel, Hal; Fuerlinger, Karl; Hannig, Frank; Jeannot, Emmanuel; Kamil, Amir; Keasler, Jeff; Kelly, Paul H J; Leung, Vitus; Ltaief, Hatem; Maruyama, Naoya; Newburn, Chris J.; Pericas, Miquel

    2017-01-01

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the form of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. This paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  13. Trends in Data Locality Abstractions for HPC Systems

    KAUST Repository

    Unat, Didem

    2017-05-12

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the form of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. This paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  14. Design of multi-tiered database application based on CORBA component in SDUV-FEL system

    International Nuclear Information System (INIS)

    Sun Xiaoying; Shen Liren; Dai Zhimin

    2004-01-01

    The drawbacks of the usual two-tiered database architecture were analyzed and the Shanghai Deep Ultraviolet Free Electron Laser (SDUV-FEL) database system under development was discussed. A project for realizing a multi-tiered database architecture based on a Common Object Request Broker Architecture (CORBA) component and a middleware model constructed in C++ was presented. A magnet database was given as an example to exhibit the design of the CORBA component. (authors)

  15. 78 FR 2363 - Notification of Deletion of a System of Records; Automated Trust Funds Database

    Science.gov (United States)

    2013-01-11

    ... [Docket No. APHIS-2012-0041] Notification of Deletion of a System of Records; Automated Trust Funds Database AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice of deletion of a system... establishing the Automated Trust Funds (ATF) database system of records. The Federal Information Security...

  16. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    International Nuclear Information System (INIS)

    Waters, Michael; Jackson, Marcus

    2008-01-01

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  17. STARS - Supportability Trend Analysis and Reporting System for the National Space Transportation System

    Science.gov (United States)

    Graham, Leroy J.; Doempke, Gerald T.

    1990-01-01

    The concept, implementation, and long-range goals of the Supportability Trend Analysis and Reporting System (STARS) for the National Space Transportation System (NSTS) are discussed. The requirement was established as a direct result of the recommendations of the Rogers Commission investigation of the circumstances of the Space Shuttle Challenger accident. STARS outlines the supportability-trend data collection, analysis, and reporting requirements that each of the project offices supporting the Space Shuttle is required to provide to the NSTS program office. STARS data give the historic and predictive logistics information necessary for all levels of NSTS management to make safe and cost-effective decisions concerning the smooth flow of Space Shuttle turnaround.

  18. National Wilderness Preservation System database: key attributes and trends, 1964 through 1999

    Science.gov (United States)

    Peter Landres; Shannon Meyer

    2000-01-01

    The Wilderness Act of 1964 established a National Wilderness Preservation System, and this publication is a compilation of selected information about every wilderness within this System. For each wilderness, the following information is given: legally correct wilderness name; public law that established the wilderness; date the enabling law was signed by the President...

  19. Development of radiation oncology learning system combined with multi-institutional radiotherapy database (ROGAD)

    International Nuclear Information System (INIS)

    Takemura, Akihiro; Iinuma, Masahiro; Kou, Hiroko; Harauchi, Hajime; Inamura, Kiyonari

    1999-01-01

    We have constructed and have been operating the multi-institutional radiotherapy database ROGAD (Radiation Oncology Greater Area Database) since 1992. One of its purposes is 'to optimize individual radiotherapy plans'. We developed the 'Radiation oncology learning system combined with ROGAD', which conforms to that purpose. Several medical doctors evaluated our system. According to those evaluations, we are now confident that our system is able to contribute to the improvement of radiotherapy results. Our final target is to generate a good cyclic relationship among three components: radiotherapy results according to the 'Radiation oncology learning system combined with ROGAD'; the growth of ROGAD; and the radiation oncology learning system. (author)

  20. Recent developments and object-oriented approach in FTU database

    International Nuclear Information System (INIS)

    Bertocchi, A.; Bracco, G.; Buceti, G.; Centioli, C.; Iannone, F.; Manduchi, G.; Nanni, U.; Panella, M.; Stracuzzi, C.; Vitale, V.

    2001-01-01

    During the last two years, the experimental database of the Frascati Tokamak Upgrade (FTU) has been changed from several points of view, particularly: (i) the data and the analysis codes have been moved from the IBM mainframe to Unix platforms, enabling users to take advantage of the large quantity of commercial and free software available under Unix (Matlab, IDL, etc.); (ii) AFS (Andrew File System) has been chosen as the distributed file system, making the data available on all the nodes and distributing the workload; (iii) a 'one measure/one file' philosophy (vs. the previous 'one pulse/one file') has been adopted, increasing the number of files in the database but, at the same time, making the most important data available just after the plasma discharge. The client-server architecture has been tested using the signal viewer client jScope. Moreover, an object-oriented data model (OODM) of FTU experimental data has been tried: a generalized model of tokamak experimental data has been developed with typical concepts such as abstraction, encapsulation, inheritance, and polymorphism. The model has been integrated with data coming from different databases, building an Object Warehouse to extract, with data mining techniques, meaningful trends and patterns from huge amounts of data
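
    The 'one measure/one file' philosophy and the generalized object model mentioned above can be pictured with a short, hypothetical class sketch; the class and attribute names are illustrative and do not reproduce FTU's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Signal:
    """Base class: one measured signal stored in its own file (illustrative names)."""
    pulse: int
    name: str
    path: str            # e.g. an AFS path implementing 'one measure/one file'

    def load(self) -> List[float]:
        # In a real system this would read the measurement file; stubbed here.
        raise NotImplementedError

@dataclass
class MagneticSignal(Signal):
    probe_id: str = ""   # a subclass adds diagnostic-specific attributes (inheritance)

@dataclass
class Pulse:
    """A plasma discharge aggregating its signals (encapsulation + polymorphism)."""
    number: int
    signals: List[Signal] = field(default_factory=list)

shot = Pulse(12345, [MagneticSignal(12345, "Ip", "/afs/ftu/12345/Ip.dat", probe_id="R01")])
print([s.name for s in shot.signals])
```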

  1. Research focus and trends in nuclear science and technology in Ghana: a bibliometric study based on the INIS database

    International Nuclear Information System (INIS)

    Agyeman, E. A.; Bilson, A.

    2015-01-01

    The peaceful application of atomic energy was introduced into Ghana about fifty years ago. This is the first bibliometric study of nuclear science and technology research publications originating from Ghana and listed in the International Nuclear Information System (INIS) Database. The purpose was to use the simple document counting method to determine the geographical distribution, annual growth and subject areas of the publications, as well as communication channels, key journals and authorship trends. The main findings of the study were that a greater number of the nuclear science and technology records listed in the Database were published in Ghana (598, or 56.57%, against 459, or 43.43%, published outside Ghana). There has been a steady growth in the number of publications over the years, with the most productive year being 2012. The main focus of research has been in the area of applied life sciences, comprising plant cultivation & breeding, pest & disease control, food protection and preservation, human nutrition and animal husbandry; followed by chemistry; environmental sciences; radiation protection; nuclear reactors; physics; energy; and radiology and nuclear medicine. The area with the least number of publications was safeguards and physical protection. The main channel of communicating research results was peer-reviewed journals, and a greater number of the journal articles were published in Ghana, followed by the United Kingdom, Hungary and the Netherlands. The core journals identified in this study were Journal of Applied Science and Technology; Journal of Radioanalytical and Nuclear Chemistry; Journal of the Ghana Science Association; Radiation Protection Dosimetry; Journal of the Kumasi University of Science and Technology; West African Journal of Applied Ecology; Ghana Journal of Science; Applied Radiation and Isotopes; Annals of Nuclear Energy; IOP Conference Series (Earth and Environmental Science); and Radiation Physics and Chemistry. Eighty percent

  2. Application of embedded database to digital power supply system in HIRFL

    International Nuclear Information System (INIS)

    Wu Guanghua; Yan Huaihai; Chen Youxin; Huang Yuzhen; Zhou Zhongzu; Gao Daqing

    2014-01-01

    Background: This paper introduces the application of an embedded MySQL database in the real-time monitoring system of the digital power supply system at the Heavy Ion Research Facility in Lanzhou (HIRFL). Purpose: The aim is to optimize the real-time monitoring system of the digital power supply system for better performance. Methods: The MySQL database is designed and implemented under the Linux operating system running on an ARM processor, together with the related functions for real-time data monitoring, such as collection, storage and query. All status parameters of the digital power supply system are collected by an FPGA and communicated to the ARM processor, whilst the user interface is realized with Qt toolkits at the ARM end. Results: Actual operation indicates that the digital power supply system can realize real-time data monitoring, collection, storage and so on. Conclusion: Through practical application, we have found some aspects that we can improve, and we will try to optimize them in the future. (authors)
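
    The kind of insert/query traffic such a real-time monitoring system generates can be sketched with the mysql-connector-python driver; the connection parameters, table and column names below are assumptions for illustration, not the HIRFL schema.

```python
import time
import mysql.connector  # assumes the mysql-connector-python package and a reachable MySQL server

# Connection parameters and schema are illustrative, not the HIRFL configuration.
conn = mysql.connector.connect(host="localhost", user="monitor",
                               password="secret", database="power_supply")
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS readings (
                 ts DOUBLE, channel VARCHAR(32), value DOUBLE)""")

def store_reading(channel: str, value: float) -> None:
    """Store one status parameter received from the data-acquisition front end."""
    cur.execute("INSERT INTO readings (ts, channel, value) VALUES (%s, %s, %s)",
                (time.time(), channel, value))
    conn.commit()

store_reading("output_current", 123.7)  # hypothetical channel name
cur.execute("SELECT channel, value FROM readings ORDER BY ts DESC LIMIT 5")
print(cur.fetchall())
```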

  3. Development of the sorption and diffusion database system for safety assessment of geological disposal

    International Nuclear Information System (INIS)

    Tachi, Yukio; Tochigi, Yoshikatsu; Suyama, Tadahiro; Saito, Yoshihiko; Yui, Mikazu; Ochs, Michael

    2009-02-01

    Japan Atomic Energy Agency (JAEA) has been developing databases of sorption and diffusion parameters in buffer material (bentonite) and rock, which are key parameters for the safety assessment of geological disposal. These sorption and diffusion databases (SDB/DDB) were first developed as an important basis for the H12 performance assessment (PA) of high-level radioactive waste disposal in Japan, and have been provided through the Web. JAEA is continuing to improve and update the SDB/DDB in view of potential future data needs, focusing on assuring the desired quality level and testing the usefulness of the existing databases for possible applications to parameter setting for the deep geological environment. The new web-based sorption and diffusion database system (JAEA-SDB/DDB) has been developed to incorporate a quality assurance procedure and to allow effective application to parameter setting, by adding the following functions to the existing database: consistency and linkage between the sorption and diffusion databases; effective utilization of the quality assurance (QA) guideline and categorized QA data; additional functions for estimating parameters and graphing relations between parameters; and counting and summarizing functions for effective access to the respective data for parameter setting. In the present report, practical examples are illustrated regarding the applicability of the database system to parameter setting by using the additional functions such as QA information and data estimation. This database system is expected to make it possible to obtain a quick overview of the available data from the database, and to provide suitable access to the respective data for parameter setting for performance assessment and parameter derivation for mechanistic modeling in a traceable and transparent manner. (author)

  4. A distributed database view of network tracking systems

    Science.gov (United States)

    Yosinski, Jason; Paffenroth, Randy

    2008-04-01

    In distributed tracking systems, multiple non-collocated trackers cooperate to fuse local sensor data into a global track picture. Generating this global track picture at a central location is fairly straightforward, but the single point of failure and excessive bandwidth requirements introduced by centralized processing motivate the development of decentralized methods. In many decentralized tracking systems, trackers communicate with their peers via a lossy, bandwidth-limited network in which dropped, delayed, and out of order packets are typical. Oftentimes the decentralized tracking problem is viewed as a local tracking problem with a networking twist; we believe this view can underestimate the network complexities to be overcome. Indeed, a subsequent 'oversight' layer is often introduced to detect and handle track inconsistencies arising from a lack of robustness to network conditions. We instead pose the decentralized tracking problem as a distributed database problem, enabling us to draw inspiration from the vast extant literature on distributed databases. Using the two-phase commit algorithm, a well known technique for resolving transactions across a lossy network, we describe several ways in which one may build a distributed multiple hypothesis tracking system from the ground up to be robust to typical network intricacies. We pay particular attention to the dissimilar challenges presented by network track initiation vs. maintenance and suggest a hybrid system that balances speed and robustness by utilizing two-phase commit for only track initiation transactions. Finally, we present simulation results contrasting the performance of such a system with that of more traditional decentralized tracking implementations.
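
    A stripped-down sketch of the two-phase commit pattern the authors borrow for track initiation is shown below; the participant interface is hypothetical and omits the timeout and retry machinery a lossy network would require.

```python
class Participant:
    """A tracker voting on whether to commit a proposed new track (toy model)."""
    def __init__(self, name, will_commit=True):
        self.name, self.will_commit = name, will_commit

    def prepare(self, track):          # phase 1: vote yes/no
        return self.will_commit

    def commit(self, track):           # phase 2a: apply the transaction
        print(f"{self.name}: committed track {track}")

    def abort(self, track):            # phase 2b: roll back
        print(f"{self.name}: aborted track {track}")

def two_phase_commit(track, participants):
    votes = [p.prepare(track) for p in participants]   # phase 1: collect votes
    if all(votes):
        for p in participants:
            p.commit(track)                            # phase 2: commit everywhere
        return True
    for p in participants:
        p.abort(track)                                 # or abort everywhere
    return False

trackers = [Participant("tracker-1"), Participant("tracker-2"), Participant("tracker-3")]
two_phase_commit("T-042", trackers)
```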

  5. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  6. Challenges, issues and trends in fall detection systems

    Science.gov (United States)

    2013-01-01

    Since falls are a major public health problem among older people, the number of systems aimed at detecting them has increased dramatically over recent years. This work presents an extensive literature review of fall detection systems, including comparisons among various kinds of studies. It aims to serve as a reference for both clinicians and biomedical engineers planning or conducting field investigations. Challenges, issues and trends in fall detection have been identified after the reviewing work. The number of studies using context-aware techniques is still increasing but there is a new trend towards the integration of fall detection into smartphones as well as the use of machine learning methods in the detection algorithm. We have also identified challenges regarding performance under real-life conditions, usability, and user acceptance as well as issues related to power consumption, real-time operations, sensing limitations, privacy and record of real-life falls. PMID:23829390

  7. Towards cloud-centric distributed database evaluation

    OpenAIRE

    Seybold, Daniel

    2016-01-01

    The area of cloud computing also pushed the evolvement of distributed databases, resulting in a variety of distributed database systems, which can be classified in relation databases, NoSQL and NewSQL database systems. In general all representatives of these database system classes claim to provide elasticity and "unlimited" horizontal scalability. As these characteristics comply with the cloud, distributed databases seem to be a perfect match for Database-as-a-Service systems (DBaaS).

  8. Towards Cloud-centric Distributed Database Evaluation

    OpenAIRE

    Seybold, Daniel

    2016-01-01

    The area of cloud computing also pushed the evolvement of distributed databases, resulting in a variety of distributed database systems, which can be classified in relation databases, NoSQL and NewSQL database systems. In general all representatives of these database system classes claim to provide elasticity and "unlimited" horizontal scalability. As these characteristics comply with the cloud, distributed databases seem to be a perfect match for Database-as-a-Service systems (DBaaS).

  9. Development of the LEP high level control system using ORACLE as an online database

    International Nuclear Information System (INIS)

    Bailey, R.; Belk, A.; Collier, P.; Lamont, M.; De Rijk, G.; Tarrant, M.

    1994-01-01

    A complete rewrite of the high level application software for the control of LEP has been carried out. ORACLE was evaluated and subsequently used as the on-line database in the implementation of the system. All control information and settings are stored on this database. This paper describes the project development cycle, the method used, the use of CASE and the project management used by the team. The performance of the system and the database and their impact on the LEP performance is discussed. ((orig.))

  10. Correlation between national influenza surveillance data and google trends in South Korea.

    Science.gov (United States)

    Cho, Sungjin; Sohn, Chang Hwan; Jo, Min Woo; Shin, Soo-Yong; Lee, Jae Ho; Ryoo, Seoung Mok; Kim, Won Young; Seo, Dong-Woo

    2013-01-01

    In South Korea, there is currently no syndromic surveillance system using internet search data, including Google Flu Trends. The purpose of this study was to investigate the correlation between national influenza surveillance data and Google Trends in South Korea. Our study was based on a publicly available search engine database, Google Trends, using 12 influenza-related queries, from September 9, 2007 to September 8, 2012. National surveillance data were obtained from the Korea Centers for Disease Control and Prevention (KCDC) influenza-like illness (ILI) and virologic surveillance system. Pearson's correlation coefficients were calculated to compare the national surveillance and the Google Trends data for the overall period and for 5 influenza seasons. The correlation coefficient between the KCDC ILI and virologic surveillance data was 0.72 (p […]). The strongest correlation was between the Google Trends query of H1N1 and the ILI data, with a correlation coefficient of 0.53 (p […]) […] with a correlation coefficient of 0.93 (p […]). […] correlation coefficient compared with ILI data for three consecutive seasons: Tamiflu (r = 0.59, 0.86, 0.90; p […]). […] correlated with national surveillance data in South Korea. The results of this study showed that Google Trends in the Korean language can be used as complementary data for influenza surveillance but was insufficient for the use of predictive models, such as Google Flu Trends.
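
    The core computation of the study, a Pearson correlation between a weekly ILI series and a Google Trends relative search volume series, can be reproduced in outline as follows; the numbers are invented for illustration.

```python
from scipy.stats import pearsonr  # assumes SciPy is available

# Made-up weekly series: ILI rate per 1,000 visits vs. relative search volume.
ili = [2.1, 2.4, 3.0, 4.2, 6.8, 9.5, 7.1, 4.0, 2.8, 2.2]
rsv = [10, 12, 18, 30, 55, 80, 60, 33, 20, 14]

r, p = pearsonr(ili, rsv)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```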

  11. Database Publication Practices

    DEFF Research Database (Denmark)

    Bernstein, P.A.; DeWitt, D.; Heuer, A.

    2005-01-01

    There has been a growing interest in improving the publication processes for database research papers. This panel reports on recent changes in those processes and presents an initial cut at historical data for the VLDB Journal and ACM Transactions on Database Systems.

  12. System Study: Emergency Power System 1998–2013

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk Assessment and Management Services Dept.

    2015-02-01

    This report presents an unreliability evaluation of the emergency power system (EPS) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2013 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant trends were identified in the EPS results.

  13. SmallSat Database

    Science.gov (United States)

    Petropulos, Dolores; Bittner, David; Murawski, Robert; Golden, Bert

    2015-01-01

    SmallSats have unrealized potential in both private industry and the federal government. Currently over 70 companies, 50 universities and 17 governmental agencies are involved in SmallSat research and development. In 1994, the U.S. Army Missile and Defense mapped the moon using smallSat imagery. Since then, smart phones have introduced this imagery to the people of the world as diverse industries watched this trend. The deployment cost of smallSats is also greatly reduced compared to traditional satellites, because multiple units can be deployed in a single mission. Imaging payloads have become more sophisticated, smaller and lighter. In addition, the growth of small technology obtained from private industries has led to the more widespread use of smallSats. This includes greater revisit rates in imagery, significantly lower costs, the ability to update technology more frequently and the ability to decrease vulnerability to enemy attacks. The popularity of smallSats shows a changing mentality in this fast-paced world of tomorrow. What impact has this created on the NASA communication networks now and in future years? In this project, we are developing the SmallSat Relational Database, which can support a simulation of smallSats within the NASA SCaN Compatibility Environment for Networks and Integrated Communications (SCENIC) Modeling and Simulation Lab. The NASA Space Communications and Networks (SCaN) Program can use this modeling to project required network support needs in the next 10 to 15 years. The SmallSat Relational Database could model smallSats just as the other SCaN databases model the more traditional larger satellites, with a few exceptions. One is that the smallSat database is designed to be built to order. The SmallSat database holds various hardware configurations that can be used to model a smallSat. It will require significant effort to develop, as the research material can only be populated by hand to obtain the unique data

  14. Distributed Database Semantic Integration of Wireless Sensor Network to Access the Environmental Monitoring System

    Directory of Open Access Journals (Sweden)

    Ubaidillah Umar

    2018-06-01

    Full Text Available A wireless sensor network (WSN works continuously to gather information from sensors that generate large volumes of data to be handled and processed by applications. Current efforts in sensor networks focus more on networking and development services for a variety of applications and less on processing and integrating data from heterogeneous sensors. There is an increased need for information to become shareable across different sensors, database platforms, and applications that are not easily implemented in traditional database systems. To solve the issue of these large amounts of data from different servers and database platforms (including sensor data, a semantic sensor web service platform is needed to enable a machine to extract meaningful information from the sensor’s raw data. This additionally helps to minimize and simplify data processing and to deduce new information from existing data. This paper implements a semantic web data platform (SWDP to manage the distribution of data sensors based on the semantic database system. SWDP uses sensors for temperature, humidity, carbon monoxide, carbon dioxide, luminosity, and noise. The system uses the Sesame semantic web database for data processing and a WSN to distribute, minimize, and simplify information processing. The sensor nodes are distributed in different places to collect sensor data. The SWDP generates context information in the form of a resource description framework. The experiment results demonstrate that the SWDP is more efficient than the traditional database system in terms of memory usage and processing time.
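
    Storing a sensor reading as RDF triples and querying it, as the SWDP does with its Sesame back end, can be sketched in Python with rdflib; the namespace and predicate names are illustrative and are not the platform's actual vocabulary.

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

SSN = Namespace("http://example.org/sensor#")  # illustrative namespace, not the SWDP's
g = Graph()

# One temperature observation from a hypothetical node, expressed as triples.
obs = SSN["obs/2018-06-01T10:00:00/node3/temperature"]
g.add((obs, RDF.type, SSN.Observation))
g.add((obs, SSN.observedProperty, SSN.temperature))
g.add((obs, SSN.hasValue, Literal(29.4, datatype=XSD.double)))
g.add((obs, SSN.madeBySensor, SSN["node3"]))

# SPARQL query over the triples, analogous to querying the Sesame store.
for row in g.query(
        "SELECT ?v WHERE { ?o <http://example.org/sensor#hasValue> ?v }"):
    print(row.v)
```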

  15. Design of remote weather monitor system based on embedded web database

    International Nuclear Information System (INIS)

    Gao Jiugang; Zhuang Along

    2010-01-01

    The remote weather monitoring system is designed by employing embedded Web database technology and the S3C2410 microprocessor as the core. The monitoring system can simultaneously monitor multi-channel sensor signals and give a dynamic Web page display of various types of meteorological information on a remote computer. An elaborated introduction is given to the construction and application of the Web database under embedded Linux. Test results show that the client accesses the Web page via GPRS or the Internet, acquires data and displays the values of various types of meteorological information in an intuitive graphical way. (authors)

  16. EPAUS9R - An Energy Systems Database for use with the Market Allocation (MARKAL) Model

    Science.gov (United States)

    EPA’s MARKAL energy system databases estimate future-year technology dispersals and associated emissions. These databases are valuable tools for exploring a variety of future scenarios for the U.S. energy-production systems that can impact climate change c

  17. Development of Vision Based Multiview Gait Recognition System with MMUGait Database

    Directory of Open Access Journals (Sweden)

    Hu Ng

    2014-01-01

    Full Text Available This paper describes the acquisition setup and development of a new gait database, MMUGait. This database consists of 82 subjects walking under normal conditions and 19 subjects walking with 11 covariate factors, which were captured under two views. This paper also proposes a multiview model-based gait recognition system with a joint detection approach that performs well under different walking trajectories and covariate factors, which include self-occluded or externally occluded silhouettes. In the proposed system, the process begins by enhancing the human silhouette to remove artifacts. Next, the width and height of the body are obtained. Subsequently, the joint angular trajectories are determined once the body joints are automatically detected. Lastly, the crotch height and step size of the walking subject are determined. The extracted features are smoothed by a Gaussian filter to eliminate the effect of outliers. The extracted features are normalized with linear scaling, which is followed by feature selection prior to the classification process. The classification experiments carried out on the MMUGait database were benchmarked against the SOTON Small DB from the University of Southampton. Results showed correct classification rates above 90% for all the databases. The proposed approach is found to outperform other approaches on SOTON Small DB in most cases.
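
    The feature post-processing named above, Gaussian smoothing of a joint angular trajectory followed by linear scaling, can be sketched as follows; the trajectory values and filter width are invented for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d  # assumes SciPy is available

# Made-up joint angular trajectory (degrees per frame) with outlier noise.
angle = np.array([10.0, 11.2, 35.0, 12.1, 12.8, 13.5, 14.1, 2.0, 15.2, 15.9])

smoothed = gaussian_filter1d(angle, sigma=1.5)  # suppress the effect of outlier spikes

def linear_scale(x):
    """Min-max normalisation of a feature vector to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

print(np.round(linear_scale(smoothed), 3))
```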

  18. Development of radiation oncology learning system combined with multi-institutional radiotherapy database (ROGAD)

    Energy Technology Data Exchange (ETDEWEB)

    Takemura, Akihiro; Iinuma, Masahiro; Kou, Hiroko [Kanazawa Univ. (Japan). School of Medicine; Harauchi, Hajime; Inamura, Kiyonari

    1999-09-01

    We have constructed and have been operating the multi-institutional radiotherapy database ROGAD (Radiation Oncology Greater Area Database) since 1992. One of its purposes is 'to optimize individual radiotherapy plans'. We developed the 'Radiation oncology learning system combined with ROGAD', which conforms to that purpose. Several medical doctors evaluated our system. According to those evaluations, we are now confident that our system is able to contribute to the improvement of radiotherapy results. Our final target is to generate a good cyclic relationship among three components: radiotherapy results according to the 'Radiation oncology learning system combined with ROGAD'; the growth of ROGAD; and the radiation oncology learning system. (author)

  19. A computer database system to calculate staff radiation doses and maintain records

    International Nuclear Information System (INIS)

    Clewer, P.

    1985-01-01

    A database has been produced to record the personal dose records of all employees monitored for radiation exposure in the Wessex Health Region. Currently there are more than 2000 personnel in 115 departments but the capacity of the database allows for expansion. The computer is interfaced to a densitometer for film badge reading. The hardware used by the database, which is based on a popular microcomputer, is described, as are the various programs that make up the software. The advantages over the manual card index system that it replaces are discussed. (author)

  20. Analysing and Rationalising Molecular and Materials Databases Using Machine-Learning

    Science.gov (United States)

    de, Sandip; Ceriotti, Michele

    Computational materials design promises to greatly accelerate the process of discovering new or more performant materials. Several collaborative efforts are contributing to this goal by building databases of structures, containing between thousands and millions of distinct hypothetical compounds, whose properties are computed by high-throughput electronic-structure calculations. The complexity and sheer amount of information has made manual exploration, interpretation and maintenance of these databases a formidable challenge, making it necessary to resort to automatic analysis tools. Here we will demonstrate how, starting from a measure of (dis)similarity between database items built from a combination of local environment descriptors, it is possible to apply hierarchical clustering algorithms, as well as dimensionality reduction methods such as sketchmap, to analyse, classify and interpret trends in molecular and materials databases, as well as to detect inconsistencies and errors. Thanks to the agnostic and flexible nature of the underlying metric, we will show how our framework can be applied transparently to different kinds of systems ranging from organic molecules and oligopeptides to inorganic crystal structures as well as molecular crystals. Funded by National Center for Computational Design and Discovery of Novel Materials (MARVEL) and Swiss National Science Foundation.
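
    Given a precomputed (dis)similarity matrix between database entries, the hierarchical-clustering step can be sketched with SciPy as below; the distances are invented and the local-environment metric itself is not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Invented pairwise dissimilarities between five database entries.
D = np.array([[0.00, 0.10, 0.90, 0.80, 0.85],
              [0.10, 0.00, 0.80, 0.90, 0.80],
              [0.90, 0.80, 0.00, 0.20, 0.25],
              [0.80, 0.90, 0.20, 0.00, 0.15],
              [0.85, 0.80, 0.25, 0.15, 0.00]])

Z = linkage(squareform(D), method="average")   # agglomerative clustering on the metric
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # e.g. [1 1 2 2 2]: two structural families in this toy data
```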

  1. The new ENSDF search system NESSY: IBM/PC nuclear spectroscopy database

    International Nuclear Information System (INIS)

    Boboshin, I.N.; Varlamov, V.V.

    1996-01-01

    The universal relational nuclear structure and decay database NESSY (New ENSDF Search SYstem) developed for the IBM/PC and compatible PCs, and based on the international file ENSDF (Evaluated Nuclear Structure Data File), is described. The NESSY provides the possibility of high efficiency processing (the search and retrieval of any kind of physical data) of the information from ENSDF. The principles of the database development are described and examples of applications are presented. (orig.)

  2. Retrieval program system of Chinese Evaluated (frequently useful) Nuclear Decay Database

    International Nuclear Information System (INIS)

    Huang Xiaolong; Zhou Chunmei

    1995-01-01

    The Chinese Evaluated (frequently useful) Nuclear Decay Database has been set up on the MICRO-VAX-11 computer at the Chinese Nuclear Data Center (CNDC). For users' convenience, a retrieval program system for the database has been written. Retrieval can be carried out for a single nucleus or for multiple nuclei. The retrieved results can be displayed on the terminal screen or output to the M3081 printer or a laser printer in ENSDF format, as table reports or as scheme diagrams

  3. Solvent Handbook Database System user's manual

    International Nuclear Information System (INIS)

    1993-03-01

    Industrial solvents and cleaners are used in maintenance facilities to remove wax, grease, oil, carbon, machining fluids, solder fluxes, mold release, and various other contaminants from parts, and to prepare the surface of various metals. However, because of growing environmental and worker-safety concerns, government regulations have already excluded the use of some chemicals and have restricted the use of halogenated hydrocarbons because they affect the ozone layer and may cause cancer. The Solvent Handbook Database System lets you view information on solvents and cleaners, including test results on cleaning performance, air emissions, recycling and recovery, corrosion, and non-metals compatibility. Company and product safety information is also available

  4. Preliminary study for unified management of CANDU safety codes and construction of database system

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

    A graphical user interface (GUI) for the unified management of the CANDU safety codes and a database system for the validation of the safety codes need to be developed; a preliminary study for these is carried out in the first stage of the present work. The input and output structures and data flow of CATHENA and PRESCON2 are investigated, and the interaction of variables between CATHENA and PRESCON2 is identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their calculation results with those from an HP workstation or from the FSAR (Final Safety Analysis Report). A preliminary study on the GUI for the safety codes in the unified management system is done, and a sample of GUI programming is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments executed inside and outside the country are collected and classified following the structure of the database system, of which two types are considered for the final web-based database system. Preliminary GUI programming for the database system is demonstrated, which will be updated in future work

  5. Report on the achievements in the Sunshine Project in fiscal 1986. Surveys on systems to structure a coal liquefaction database; 1986 nendo sekitan ekika database kochiku no tame no system chosa seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-01

    Surveys were carried out on the current status of information control systems for development projects being performed or planned for coal liquefaction technologies. The concept for structuring a coal liquefaction database (CLDB) to comprehensively manage and effectively utilize the information in these systems is clarified. Section 3 investigates and analyzes the current status of data processing for the experimental plants. The data for each experimental plant are processed individually; therefore, it is preferable that the CLDB provide locations to receive the respective data. Section 4 organizes the flows of operation and information in coal liquefaction research and depicts an overall configuration diagram for the system. Section 5 discusses problems in structuring this system. A large number of problems remain to be discussed, not only in the technological aspect, but also in analyzing the organizational roles of NEDO and the commissioned business entities, and the needs of users. The last section summarizes the steps and schedule for developing this system. The development should preferably be implemented stepwise along with the progress of the experimental plants, in the order of the fundamental database, the analysis database and the engineering database. (NEDO)

  6. Perspectives on a Big Data Application: What Database Engineers and IT Students Need to Know

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2015-10-01

    Full Text Available Cloud Computing and Big Data are important and related current trends in the world of information technology. They will have significant impact on the curricula of computer engineering and information systems at universities and higher education institutions. Learning about big data is useful for both working database professionals and students, in accordance with the increase in jobs requiring these skills. It is also important to address a broad gamut of database engineering skills, i.e. database design, installation, and operation. Therefore the authors have investigated MongoDB, a popular application, both from the perspective of industry retraining for database specialists and for teaching. This paper demonstrates some practical activities that can be done by students at the Eastern Institute of Technology New Zealand. In addition to testing and preparing new content for future students, this paper contributes to the very recent and emerging academic literature in this area. This paper concludes with general recommendations for IT educators, database engineers, and other IT professionals.
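
    For readers unfamiliar with MongoDB, the kind of hands-on activity described above might look like the following minimal sketch, which assumes a local mongod instance on the default port; the database, collection and field names are hypothetical and are not taken from the paper.

```python
# Minimal classroom-style MongoDB exercise (assumed names, local server).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
courses = client["eit_demo"]["courses"]  # hypothetical database and collection

# Insert a few schema-less documents, as is typical in introductory exercises.
courses.insert_many([
    {"code": "IT501", "title": "Databases", "credits": 15},
    {"code": "IT502", "title": "Cloud Computing", "credits": 15},
])

# Query and iterate over matching documents.
for doc in courses.find({"credits": {"$gte": 15}}):
    print(doc["code"], doc["title"])
```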

  7. Dynamic graph system for a semantic database

    Science.gov (United States)

    Mizell, David

    2015-01-27

    A method and system in a computer system for dynamically providing a graphical representation of a data store of entries via a matrix interface is disclosed. A dynamic graph system provides a matrix interface that exposes to an application program a graphical representation of data stored in a data store such as a semantic database storing triples. To the application program, the matrix interface represents the graph as a sparse adjacency matrix that is stored in compressed form. Each entry of the data store is considered to represent a link between nodes of the graph. Each entry has a first field and a second field identifying the nodes connected by the link and a third field with a value for the link that connects the identified nodes. The first, second, and third fields represent the rows, columns, and elements of the adjacency matrix.
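
    The mapping from store entries to a sparse adjacency matrix can be illustrated with the short sketch below; this is a reconstruction for illustration using SciPy, not code from the patented system, and the triples are invented.

```python
# Illustrative sketch: representing RDF-style triples as a sparse adjacency
# matrix, in the spirit of the matrix interface described above.
from scipy.sparse import coo_matrix

triples = [
    ("alice", "knows", "bob"),
    ("bob", "worksAt", "acme"),
    ("alice", "worksAt", "acme"),
]

# Assign an integer index to every node and every link value.
nodes = sorted({t[0] for t in triples} | {t[2] for t in triples})
links = sorted({t[1] for t in triples})
node_id = {n: i for i, n in enumerate(nodes)}
link_id = {l: i + 1 for i, l in enumerate(links)}   # 0 means "no edge"

rows = [node_id[s] for s, _, _ in triples]    # first field  -> row
cols = [node_id[o] for _, _, o in triples]    # second field -> column
vals = [link_id[p] for _, p, _ in triples]    # third field  -> element value

adj = coo_matrix((vals, (rows, cols)), shape=(len(nodes), len(nodes)))
print(adj.toarray())
```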

  8. Infodemiology of systemic lupus erythematous using Google Trends.

    Science.gov (United States)

    Radin, M; Sciascia, S

    2017-07-01

    Objective People affected by chronic rheumatic conditions, such as systemic lupus erythematosus (SLE), frequently rely on the Internet and search engines to look for terms related to their disease and its possible causes, symptoms and treatments. 'Infodemiology' and 'infoveillance' are two recent terms created to describe a new developing approach for public health, based on Big Data monitoring and data mining. In this study, we aim to investigate trends of Internet searches linked to SLE and symptoms associated with the disease, applying a Big Data monitoring approach. Methods We analysed the large amount of data generated by Google Trends, considering 'lupus', 'relapse' and 'fatigue' over a 10-year period of web searches. Google Trends automatically normalized data for the overall number of searches, and presented them as relative search volumes, in order to compare variations of different search terms across regions and periods. The Mann-Kendall test was used to evaluate the overall seasonal trend of each search term and possible correlations between search terms. Results We observed a seasonality for Google search volumes for lupus-related terms. In the Northern hemisphere, relative search volumes for 'lupus' were correlated with 'relapse' (τ = 0.85; p = 0.019) and with 'fatigue' (τ = 0.82; p = 0.003), whereas in the Southern hemisphere we observed a significant correlation between 'fatigue' and 'relapse' (τ = 0.85; p = 0.018). Similarly, a significant correlation between 'fatigue' and 'relapse' (τ = 0.70; p < 0.001) was also seen in the Northern hemisphere. Conclusion Despite the intrinsic limitations of this approach, Internet-acquired data might represent a real-time surveillance tool and an alert for healthcare systems in order to plan the most appropriate resources in specific moments with higher disease burden.
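
    The two statistics used in the study can be sketched as follows: a Mann-Kendall-style trend test amounts to Kendall's tau of a series against time, and the reported associations are tau correlations between two search-term series. The data below are synthetic placeholders, not Google Trends exports.

```python
# Sketch of a Mann-Kendall-style trend statistic and a tau correlation
# between two relative-search-volume series (synthetic data).
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
months = np.arange(120)                                   # 10 years, monthly
lupus   = 50 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, 120)
fatigue = 48 + 9  * np.sin(2 * np.pi * months / 12) + rng.normal(0, 2, 120)

# Monotonic trend of a single term over time (the Mann-Kendall statistic).
tau_trend, p_trend = kendalltau(months, lupus)

# Association between two search-term series.
tau_corr, p_corr = kendalltau(lupus, fatigue)
print(f"trend tau={tau_trend:.2f} (p={p_trend:.3f}), "
      f"lupus~fatigue tau={tau_corr:.2f} (p={p_corr:.3f})")
```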

  9. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    Science.gov (United States)

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  10. 18th East European Conference on Advances in Databases and Information Systems and Associated Satellite Events

    CERN Document Server

    Ivanovic, Mirjana; Kon-Popovska, Margita; Manolopoulos, Yannis; Palpanas, Themis; Trajcevski, Goce; Vakali, Athena

    2015-01-01

    This volume contains the papers of 3 workshops and the doctoral consortium, which are organized in the framework of the 18th East-European Conference on Advances in Databases and Information Systems (ADBIS’2014). The 3rd International Workshop on GPUs in Databases (GID’2014) is devoted to subjects related to utilization of Graphics Processing Units in database environments. The use of GPUs in databases has not yet received enough attention from the database community. The intention of the GID workshop is to provide a discussion on popularizing the GPUs and providing a forum for discussion with respect to the GID’s research ideas and their potential to achieve high speedups in many database applications. The 3rd International Workshop on Ontologies Meet Advanced Information Systems (OAIS’2014) has a twofold objective to present: new and challenging issues in the contribution of ontologies for designing high quality information systems, and new research and technological developments which use ontologie...

  11. The Establishment of the SAR images database System Based on Oracle and ArcSDE

    International Nuclear Information System (INIS)

    Zhou, Jijin; Li, Zhen; Chen, Quan; Tian, Bangsen

    2014-01-01

    Synthetic aperture radar (SAR) is a kind of microwave imaging system with the advantages of multi-band, multi-polarization and multi-angle observation. At present there is no SAR image database system organized around typical features, yet such a system is urgently needed to support interpretation and identification. In this article, a SAR image database system based on Oracle and ArcSDE was constructed. The main work involved is as follows: (1) The SAR image data were radiometrically calibrated and geometrically corrected; in addition, the fully polarimetric images were processed into the coherency matrix [T] to preserve the polarimetric information. (2) After analyzing multiple spaceborne SAR images, the metadata table was defined as: IMAGEID; name of feature; latitude and longitude; sensor name; range and azimuth resolution, etc. (3) A comparison between GeoRaster and ArcSDE showed that ArcSDE is the more appropriate technology for storing images in a central database. The system stores and manages multisource SAR image data well and reflects their scattering, geometry, polarization, band and angle characteristics; it combines analysis of the managed objects and service objects of the database and focuses on data browsing and data retrieval. Based on the analysis of SAR image characteristics such as scattering, polarization, incidence angle and wave band, different weights can be assigned to these characteristics, forming an interpretation tool that provides an efficient platform for interpretation
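
    A minimal sketch of the metadata table described in point (2) is shown below, using Python's built-in sqlite3 as a stand-in for the Oracle/ArcSDE back end; the column names follow the abstract, while the types, units and sample row are assumptions.

```python
# Sketch of the SAR metadata table (assumed types and units; sqlite3 stands
# in for the Oracle/ArcSDE back end described in the abstract).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE sar_metadata (
        image_id           TEXT PRIMARY KEY,  -- IMAGEID
        feature_name       TEXT,              -- name of the typical feature
        latitude           REAL,
        longitude          REAL,
        sensor_name        TEXT,
        range_resolution   REAL,              -- metres (assumed unit)
        azimuth_resolution REAL               -- metres (assumed unit)
    )
""")
con.execute(
    "INSERT INTO sar_metadata VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("IMG0001", "airport", 39.90, 116.40, "RADARSAT-2", 3.0, 3.0),
)
for row in con.execute(
        "SELECT image_id, feature_name, sensor_name FROM sar_metadata "
        "WHERE sensor_name = ?", ("RADARSAT-2",)):
    print(row)
```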

  12. Fossil-Fuel CO2 Emissions Database and Exploration System

    Science.gov (United States)

    Krassovski, M.; Boden, T.

    2012-04-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production each year at global, regional, and national spatial scales. These estimates are vital to climate change research given the strong evidence suggesting fossil-fuel emissions are responsible for unprecedented levels of carbon dioxide (CO2) in the atmosphere. The CDIAC fossil-fuel emissions time series are based largely on annual energy statistics published for all nations by the United Nations (UN). Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751, before the Industrial Revolution. From these core fossil-fuel CO2 emission time series, CDIAC has developed a number of additional data products to satisfy modeling needs and to address other questions aimed at improving our understanding of the global carbon cycle budget. For example, CDIAC also produces a time series of gridded fossil-fuel CO2 emission estimates and isotopic (e.g., C13) emission estimates. The gridded data are generated using the methodology described in Andres et al. (2011) and provide monthly and annual estimates for 1751-2008 at 1° latitude by 1° longitude resolution. These gridded emission estimates are being used in the latest IPCC Scientific Assessment (AR4). Isotopic estimates are possible thanks to detailed information for individual nations regarding the carbon content of select fuels (e.g., the carbon signature of natural gas from Russia). CDIAC has recently developed a relational database to house these baseline emission estimates and associated derived products, and a web-based interface to help users worldwide query these data holdings. Users can identify, explore and download desired CDIAC

  13. A real time multi-server multi-client coherent database for a new high voltage system

    International Nuclear Information System (INIS)

    Gorbics, M.; Green, M.

    1995-01-01

    A high voltage system has been designed to allow multiple users (clients) access to the database of measured values and settings. This database is actively maintained in real time for a given mainframe containing multiple modules, each having its own database. With limited CPU and memory resources, the mainframe system provides a data coherency scheme for multiple clients which (1) allows the client to determine when and what values need to be updated, (2) allows changes from one client to be detected by another client, and (3) does not depend on the mainframe system tracking client accesses

  14. PACSY, a relational database management system for protein structure and chemical shift analysis.

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
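
    The kind of cross-source query PACSY enables, joining coordinates and chemical shifts through shared key identification numbers, can be sketched as follows; the table and column names are hypothetical stand-ins, and sqlite3 replaces the MySQL/PostgreSQL server named in the abstract.

```python
# Sketch of a cross-source join over tables linked by a key identification
# number (hypothetical schema; sqlite3 used as a stand-in RDBMS).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE coordinates (key_id INTEGER, atom_name TEXT,
                              x REAL, y REAL, z REAL);
    CREATE TABLE chemical_shifts (key_id INTEGER, atom_name TEXT, shift_ppm REAL);
    INSERT INTO coordinates VALUES (1, 'CA', 12.1, 3.4, -7.8);
    INSERT INTO chemical_shifts VALUES (1, 'CA', 58.3);
""")

query = """
    SELECT c.key_id, c.atom_name, c.x, c.y, c.z, s.shift_ppm
    FROM coordinates AS c
    JOIN chemical_shifts AS s
      ON c.key_id = s.key_id AND c.atom_name = s.atom_name
"""
for row in con.execute(query):
    print(row)
```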

  15. PACSY, a relational database management system for protein structure and chemical shift analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States); Yu, Wookyung [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Kim, Suhkmann [Pusan National University, Department of Chemistry and Chemistry Institute for Functional Materials (Korea, Republic of); Chang, Iksoo [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Lee, Weontae, E-mail: wlee@spin.yonsei.ac.kr [Yonsei University, Structural Biochemistry and Molecular Biophysics Laboratory, Department of Biochemistry (Korea, Republic of); Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States)

    2012-10-15

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  16. PACSY, a relational database management system for protein structure and chemical shift analysis

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636

  17. PACSY, a relational database management system for protein structure and chemical shift analysis

    International Nuclear Information System (INIS)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L.

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  18. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, an Inter-Office Communication (IOC) system, a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. The IOC is a linkage control system between sub-projects for sharing and integrating the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage the documents and reports produced since the project's accomplishment.

  19. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development using Web applications. The KALIMER design database is composed of a results database, an Inter-Office Communication (IOC) system, a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D program. The IOC is a linkage control system between sub-projects for sharing and integrating the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. Finally, the reserved documents database was developed to manage the documents and reports produced since the project's accomplishment

  20. Report on the present situation of the FY 1998 technical literature database; 1998 nendo gijutsu bunken database nado genjo chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    To study databases that can contribute to the future distribution of scientific and technological information, a survey and analysis of the present status of the service supply side were conducted. In the survey on database trends, the relationship between DB producers and distributors was investigated. The results showed an increase in DB producers and an expansion of internet distribution and services, while the U.S.-centered structure remained unchanged. It was also recognized that DB services in the internet age are facing a period of change, as seen in existing producers' responses to the internet, the on-line provision of primary information sources, and the creation of new on-line services. From the impact of the internet, the following are predicted for future DB services: a slump for producers without strong points and for gateway-type distributors, the appearance of new types of DB service, etc. (NEDO)

  1. An Adaptive Database Intrusion Detection System

    Science.gov (United States)

    Barrios, Rita M.

    2011-01-01

    Intrusion detection is difficult to accomplish with current methodologies when the database and the authorized entity are considered. It is a common understanding that current methodologies focus on the network architecture rather than the database, which is not an adequate solution when considering the insider threat. Recent…

  2. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    Directory of Open Access Journals (Sweden)

    Seok-Hyoung Lee

    2012-06-01

    Full Text Available As science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. The outputs of researchers' work, such as journal articles, patent specifications, and research reports, are organically related to each other. If the most basic and meaningful classification systems are not connected, it is difficult to achieve interoperability of the information and thus not easy to implement meaningful science and technology information services through information convergence. This study addresses this issue by analyzing mapping systems between classification systems in order to design a structure that connects the variety of classification systems used in the academic information database of the Korea Institute of Science and Technology Information, which provides a science and technology information portal service. This study also aims to design a mapping system for the classification systems to be applied to actual science and technology information services and information management systems.
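
    As an illustration of what a multiple mapping between classification systems can look like in practice, the sketch below lets one source class map to several target classes with weights; all codes and weights are invented placeholders, not KISTI's actual mapping.

```python
# Illustrative sketch of a multiple (one-to-many) mapping between two
# classification systems. All codes and weights are invented placeholders.
from collections import defaultdict

# (source_code, target_code, weight) rows, as they might sit in a mapping table.
mapping_rows = [
    ("ND01", "B1", 0.7),   # source class ND01 maps mostly to target B1 ...
    ("ND01", "B7", 0.3),   # ... and partly to B7
    ("ND02", "C2", 1.0),
]

mapping = defaultdict(list)
for src, tgt, w in mapping_rows:
    mapping[src].append((tgt, w))

def translate(source_code):
    """Return all target classes (with weights) for a source class."""
    return mapping.get(source_code, [])

print(translate("ND01"))   # [('B1', 0.7), ('B7', 0.3)]
```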

  3. Integrating the DLD dosimetry system into the Almaraz NPP Corporative Database

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    1996-01-01

    The article discusses the experience acquired during the integration of a new MGP Instruments DLD Dosimetry System into the Almaraz NPP corporative database and general communications network, following a client-server philosophy and taking into account the computer standards of the Plant. The most important results obtained are: integration of DLD dosimetry information into the corporative databases, permitting the use of new applications; sharing of existing personnel information with the DLD dosimetry application, thereby avoiding the redundant work of introducing data and improving the quality of the information; facilitation of maintenance, both software and hardware, of the DLD system; maximum exploitation, from the computer point of view, of the initial investment; and adaptation of the application to the applicable legislation. (Author)

  4. The Barcelona Hospital Clínic therapeutic apheresis database.

    Science.gov (United States)

    Cid, Joan; Carbassé, Gloria; Cid-Caballero, Marc; López-Púa, Yolanda; Alba, Cristina; Perea, Dolores; Lozano, Miguel

    2017-09-22

    A therapeutic apheresis (TA) database helps to increase knowledge about the indications and types of apheresis procedures that are performed in clinical practice. The objective of the present report was to describe the type and number of TA procedures that were performed at our institution in a 10-year period, from 2007 to 2016. The TA electronic database was created by transferring patient data from electronic medical records and consultation forms into a Microsoft Access database developed exclusively for this purpose. Since 2007, prospective data from every TA procedure were entered in the database. A total of 5940 TA procedures were performed: 3762 (63.3%) plasma exchange (PE) procedures, 1096 (18.5%) hematopoietic progenitor cell (HPC) collections, and 1082 (18.2%) TA procedures other than PEs and HPC collections. The overall trend for the period was a progressive increase in the total number of TA procedures performed each year (from 483 TA procedures in 2007 to 822 in 2016). The tracking trend of each procedure during the 10-year period was different: the number of PE and other types of TA procedures increased by 22% and 2818%, respectively, while the number of HPC collections decreased by 28%. The TA database helped us to increase our knowledge about the various indications and types of TA procedures performed in our current practice. We also believe that this database could serve as a model that other institutions can use to track service metrics. © 2017 Wiley Periodicals, Inc.

  5. Structure health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be developed based on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of the sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need replacement. For further study, a real application test will be performed to check for any remaining inconveniences.

  6. Structural health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok [Nonstructive Measurment Lab., KRISS, Daejeon (Korea, Republic of)

    2003-07-01

    A structural health monitoring system should be developed based on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of the sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need replacement. For further study, a real application test will be performed to check for any remaining inconveniences.

  7. Structure health monitoring system using internet and database technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Il Bum; Kim, Chi Yeop; Choi, Man Yong; Lee, Seung Seok [Smart Measurment Group. Korea Resarch Institute of Standards and Science, Saejeon (Korea, Republic of)

    2003-05-15

    A structural health monitoring system should be developed based on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of the sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need replacement. For further study, a real application test will be performed to check for any remaining inconveniences.

  8. Structural health monitoring system using internet and database technologies

    International Nuclear Information System (INIS)

    Kim, Chi Yeop; Choi, Man Yong; Kwon, Il Bum; Lee, Seung Seok

    2003-01-01

    A structural health monitoring system should be developed based on internet and database technologies in order to manage large structures efficiently. The system is operated over the internet, connected to the structure site. The monitoring system has several functions: self-monitoring, self-diagnosis, self-control, etc. Self-monitoring is the sensor fault detection function: if some sensors are not working normally, the system can detect the faulty sensors. The self-diagnosis function repairs the abnormal condition of the sensors, and self-control is the repair function of the monitoring system itself. In particular, the monitoring system can identify when sensors need replacement. For further study, a real application test will be performed to check for any remaining inconveniences.

  9. A Tephra Database With an Intelligent Correlation System, Mono-Inyo Volcanic Chain, CA

    Science.gov (United States)

    Bursik, M.; Rogova, G.

    2004-12-01

    We are assembling a web-accessible, relational database of information on past eruptions of the Mono-Inyo volcanic chain, eastern California. The PostgreSQL database structure follows the North American Data Model and CordLink. The database allows us to extract the features diagnostic of particular pyroclastic layers, as well as lava domes and flows. The features include depth in the section, layer thickness and internal stratigraphy, mineral assemblage, major and trace element composition, tephra componentry and granulometry, and radiocarbon age. Our working hypotheses are that (1) the database will prove useful for unraveling the complex recent volcanic history of the Mono-Inyo chain, and (2) that this will be aided by the use of an intelligent correlation system integrated into the database system. The Mono-Inyo chain consists of domes, craters and flows that stretch for 50 km north-south, subparallel to the Sierran range front fault system. Almost all eruptions within the chain probably occurred less than 50,000 years ago. Because of the variety of magma and eruption types, and the migration of source regions in time and space, it is nontrivial to discern patterns of behaviour. We have explored the use of multiple artificial neural networks combined within the framework of the Dempster-Shafer theory of evidence to construct a hybrid information processing system as an aid in the correlation of Mono-Inyo pyroclastic layers. It is hoped that such a system could provide information useful for discerning eruptive patterns that would otherwise be difficult to sort and categorize. In a test case on tephra layers at known sites, the intelligent correlation system was able to categorize observations correctly 96% of the time. In a test case with layers at one unknown site, and using a pairwise comparison of the unknown site with the known sites, a one-to-one correlation between the unknown site and the known sites was found to sometimes be poor. Such a result could be used to aid a
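
    The evidence-fusion step mentioned above, combining several neural-network outputs within Dempster-Shafer theory, typically uses Dempster's rule of combination; a compact sketch is given below, with an invented frame of discernment and mass values.

```python
# Compact sketch of Dempster's rule of combination for fusing two sources of
# evidence about which tephra layer a sample belongs to. The frame and the
# mass values are invented for illustration.
from itertools import product

FRAME = frozenset({"layer_A", "layer_B", "layer_C"})

def combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Two "classifiers" expressing belief about the layer assignment.
m_net1 = {frozenset({"layer_A"}): 0.6, FRAME: 0.4}
m_net2 = {frozenset({"layer_A", "layer_B"}): 0.7, FRAME: 0.3}
print(combine(m_net1, m_net2))
```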

  10. The New Trends in Adaptive Educational Hypermedia Systems

    Science.gov (United States)

    Somyürek, Sibel

    2015-01-01

    This paper aims to give a general review of existing literature on adaptive educational hypermedia systems and to reveal technological trends and approaches within these studies. Fifty-six studies conducted between 2002 and 2012 were examined, to identify prominent themes and approaches. According to the content analysis, the new technological…

  11. National trends in anterior cervical fusion procedures.

    Science.gov (United States)

    Marawar, Satyajit; Girardi, Federico P; Sama, Andrew A; Ma, Yan; Gaber-Baylis, Licia K; Besculides, Melanie C; Memtsoudis, Stavros G

    2010-07-01

    Population-based database analysis. To analyze trends in patient- and healthcare-system-related characteristics, utilization and outcomes associated with anterior cervical spine fusions. Anterior cervical decompression and spine fusion (ACDF) is one of the most commonly performed surgical procedures of the spine. However, few data analyzing trends in patient- and healthcare-system-related characteristics, utilization and outcomes exist. Data from 1990 to 2004 collected in the National Hospital Discharge Survey were accessed. ACDF procedures were identified. Five-year periods of interest (POI) were created for temporal analysis, and changes in the prevalence and utilization of this procedure as well as in patient- and healthcare-system-related variables were examined. The changes in the occurrence of procedure-related complications were evaluated. An estimated total of 771,932 discharges after ACDF were identified. Temporally, an almost 8-fold increase in total prevalence was accompanied by a similar increase in utilization (23/100,000 civilians/POI to 157/100,000 civilians/POI). The highest increase in utilization was observed in those ≥65 years (28-fold). Average age increased from 47.2 years to 50.5 years over time. Length of hospital stay decreased from 5.17 days to 2.38 days. Overall procedure-related complication rates decreased from 4.6% to 3.03%. The prevalence of hypertension, diabetes mellitus, hypercholesterolemia, obesity, pulmonary disease, and coronary artery disease increased over time among patients undergoing ACDF. Despite limitations inherent to secondary analysis of large databases, we identified a number of significant changes in the utilization, demographics, and outcomes associated with ACDF, which can be used to assess the effect of changes in medical care, direct health care resources, and future research. The effect of the increased prevalence of comorbidities on medical practice remains to be evaluated. Further studies are necessary to evaluate causal

  12. Trends in Compulsory Licensing of Pharmaceuticals Since the Doha Declaration: A Database Analysis

    Science.gov (United States)

    Beall, Reed; Kuhn, Randall

    2012-01-01

    Background It is now a decade since the World Trade Organization (WTO) adopted the “Declaration on the TRIPS Agreement and Public Health” at its 4th Ministerial Conference in Doha. Many anticipated that these actions would lead nations to claim compulsory licenses (CLs) for pharmaceutical products with greater regularity. A CL is the use of a patented innovation that has been licensed by a state without the permission of the patent title holder. Skeptics doubted that many CLs would occur, given political pressure against CL activity and continued health system weakness in poor countries. The subsequent decade has seen little systematic assessment of the Doha Declaration's impact. Methods and Findings We assembled a database of all episodes in which a CL was publically entertained or announced by a WTO member state since 1995. Broad searches of CL activity were conducted using media, academic, and legal databases, yielding 34 potential CL episodes in 26 countries. Country- and product-specific searches were used to verify government participation, resulting in a final database of 24 verified CLs in 17 nations. We coded CL episodes in terms of outcome, national income, and disease group over three distinct periods of CL activity. Most CL episodes occurred between 2003 and 2005, involved drugs for HIV/AIDS, and occurred in upper-middle-income countries (UMICs). Aside from HIV/AIDS, few CL episodes involved communicable disease, and none occurred in least-developed or low-income countries. Conclusions Given skepticism about the Doha Declaration's likely impact, we note the relatively high occurrence of CLs, yet CL activity has diminished markedly since 2006. While UMICs have high CL activity and strong incentives to use CLs compared to other countries, we note considerable countervailing pressures against CL use even in UMICs. We conclude that there is a low probability of continued CL activity. We highlight the need for further systematic evaluation of global

  13. Trends in compulsory licensing of pharmaceuticals since the Doha Declaration: a database analysis.

    Science.gov (United States)

    Beall, Reed; Kuhn, Randall

    2012-01-01

    It is now a decade since the World Trade Organization (WTO) adopted the "Declaration on the TRIPS Agreement and Public Health" at its 4th Ministerial Conference in Doha. Many anticipated that these actions would lead nations to claim compulsory licenses (CLs) for pharmaceutical products with greater regularity. A CL is the use of a patented innovation that has been licensed by a state without the permission of the patent title holder. Skeptics doubted that many CLs would occur, given political pressure against CL activity and continued health system weakness in poor countries. The subsequent decade has seen little systematic assessment of the Doha Declaration's impact. We assembled a database of all episodes in which a CL was publically entertained or announced by a WTO member state since 1995. Broad searches of CL activity were conducted using media, academic, and legal databases, yielding 34 potential CL episodes in 26 countries. Country- and product-specific searches were used to verify government participation, resulting in a final database of 24 verified CLs in 17 nations. We coded CL episodes in terms of outcome, national income, and disease group over three distinct periods of CL activity. Most CL episodes occurred between 2003 and 2005, involved drugs for HIV/AIDS, and occurred in upper-middle-income countries (UMICs). Aside from HIV/AIDS, few CL episodes involved communicable disease, and none occurred in least-developed or low-income countries. Given skepticism about the Doha Declaration's likely impact, we note the relatively high occurrence of CLs, yet CL activity has diminished markedly since 2006. While UMICs have high CL activity and strong incentives to use CLs compared to other countries, we note considerable countervailing pressures against CL use even in UMICs. We conclude that there is a low probability of continued CL activity. We highlight the need for further systematic evaluation of global health governance actions. Please see later in the

  14. Trends in compulsory licensing of pharmaceuticals since the Doha Declaration: a database analysis.

    Directory of Open Access Journals (Sweden)

    Reed Beall

    2012-01-01

    Full Text Available BACKGROUND: It is now a decade since the World Trade Organization (WTO adopted the "Declaration on the TRIPS Agreement and Public Health" at its 4th Ministerial Conference in Doha. Many anticipated that these actions would lead nations to claim compulsory licenses (CLs for pharmaceutical products with greater regularity. A CL is the use of a patented innovation that has been licensed by a state without the permission of the patent title holder. Skeptics doubted that many CLs would occur, given political pressure against CL activity and continued health system weakness in poor countries. The subsequent decade has seen little systematic assessment of the Doha Declaration's impact. METHODS AND FINDINGS: We assembled a database of all episodes in which a CL was publically entertained or announced by a WTO member state since 1995. Broad searches of CL activity were conducted using media, academic, and legal databases, yielding 34 potential CL episodes in 26 countries. Country- and product-specific searches were used to verify government participation, resulting in a final database of 24 verified CLs in 17 nations. We coded CL episodes in terms of outcome, national income, and disease group over three distinct periods of CL activity. Most CL episodes occurred between 2003 and 2005, involved drugs for HIV/AIDS, and occurred in upper-middle-income countries (UMICs. Aside from HIV/AIDS, few CL episodes involved communicable disease, and none occurred in least-developed or low-income countries. CONCLUSIONS: Given skepticism about the Doha Declaration's likely impact, we note the relatively high occurrence of CLs, yet CL activity has diminished markedly since 2006. While UMICs have high CL activity and strong incentives to use CLs compared to other countries, we note considerable countervailing pressures against CL use even in UMICs. We conclude that there is a low probability of continued CL activity. We highlight the need for further systematic

  15. Development of a database system for near-future climate change projections under the Japanese National Project SI-CAT

    Science.gov (United States)

    Nakagawa, Y.; Kawahara, S.; Araki, F.; Matsuoka, D.; Ishikawa, Y.; Fujita, M.; Sugimoto, S.; Okada, Y.; Kawazoe, S.; Watanabe, S.; Ishii, M.; Mizuta, R.; Murata, A.; Kawase, H.

    2017-12-01

    Analyses of large ensemble data are quite useful for producing probabilistic projections of the effects of climate change. Ensemble data of "+2K future climate simulations" are currently produced by the Japanese national project "Social Implementation Program on Climate Change Adaptation Technology (SI-CAT)" as a part of the database for Policy Decision making for Future climate change (d4PDF; Mizuta et al. 2016) produced by the Program for Risk Information on Climate Change. Those data consist of global warming simulations and regional downscaling simulations. Considering that the data volumes are too large (a few petabytes) to download to users' local computers, a user-friendly system is required to search for and download the data that satisfy users' requests. We are developing "a database system for near-future climate change projections" under SI-CAT to provide functions for finding the necessary data. The database system for near-future climate change projections mainly consists of a relational database, a data download function and a user interface. The relational database, using PostgreSQL, is the key component among them: temporally and spatially compressed data are registered in the relational database. As a first step, we developed the relational database for precipitation, temperature and typhoon track data according to requests by SI-CAT members. The data download function, using the Open-source Project for a Network Data Access Protocol (OPeNDAP), provides a means to download temporally and spatially extracted data based on search results obtained from the relational database. We also developed a web-based user interface for using the relational database and the data download function. A prototype of the database system for near-future climate change projections is currently under operational testing on our local server. The database system for near-future climate change projections will be released on the Data Integration and Analysis System Program (DIAS) in fiscal year 2017
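
    The relational-search step described above can be sketched as a compressed summary table that is filtered before any large file is fetched over OPeNDAP; sqlite3 stands in for the PostgreSQL server, and all table, column and URL values below are assumptions.

```python
# Sketch: filter a compressed summary table first, then hand matching OPeNDAP
# URLs to a download step. Schema, values and URLs are assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE ensemble_summary (
        member_id   TEXT,    -- ensemble member
        variable    TEXT,    -- 'precipitation', 'temperature', ...
        region      TEXT,
        year        INTEGER,
        max_value   REAL,    -- spatially/temporally compressed statistic
        opendap_url TEXT     -- where the full field can be fetched
    )
""")
con.execute("INSERT INTO ensemble_summary VALUES "
            "('m001', 'precipitation', 'Kanto', 2031, 118.4, "
            "'http://example.org/opendap/m001.nc')")

# Find members whose summarised precipitation exceeds a threshold.
rows = con.execute(
    "SELECT member_id, opendap_url FROM ensemble_summary "
    "WHERE variable = 'precipitation' AND max_value > ?", (100.0,))
for member_id, url in rows:
    print(member_id, url)
```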

  16. The CUTLASS database facilities

    International Nuclear Information System (INIS)

    Jervis, P.; Rutter, P.

    1988-09-01

    The enhancement of the CUTLASS database management system to provide improved facilities for data handling is seen as a prerequisite to its effective use for future power station data processing and control applications. This particularly applies to the larger projects such as AGR data processing system refurbishments, and the data processing systems required for the new Coal Fired Reference Design stations. In anticipation of the need for improved data handling facilities in CUTLASS, the CEGB established a User Sub-Group in the early 1980's to define the database facilities required by users. Following the endorsement of the resulting specification and a detailed design study, the database facilities have been implemented as an integral part of the CUTLASS system. This paper provides an introduction to the range of CUTLASS Database facilities, and emphasises the role of Database as the central facility around which future Kit 1 and (particularly) Kit 6 CUTLASS based data processing and control systems will be designed and implemented. (author)

  17. Safety system function trend indicator: Theory and test application

    International Nuclear Information System (INIS)

    Azarm, M.A.; Carbonaro, J.F.; Boccio, J.L.; Vesely, W.E.

    1989-01-01

    The purpose of this paper is to summarize research conducted on the development and validation of quantitative indicators of safety performance. This work, performed under the Risk-Based Performance Indicator (RBPI) Project, FIN A-3295, for the Office of Research (RES), is considered part of NRC's Performance Indicator Program which is being coordinated through the Office for the Analysis and Evaluation of Operational Data (AEOD). The program originally focused on risk-based indicators at high levels of safety indices (e.g., core-damage frequency, functional unavailabilities, and sequence monitoring). The program was then redirected towards a more amenable goal, safety system unavailability indicators, mainly due to the lack of PRA models and plant data. In that regard, BNL published a technical report that introduced the concept of cycle-based indicators and also described various alternatives of monitoring safety system unavailabilities. Further simplification of these indicators was requested by NRC to facilitate their applications to all plants in a timely manner. This resulted in the development of Safety System Function Trend (SSFT) indicators which minimize the need for detailed system model as well as component history. The theoretical bases for these indicators were developed through various simulation studies to determine the ease of detecting a trend and/or unacceptable performance. These indicators, along with several other indicators, were then generated and compared using plant data as a part of a test application. The SSFT indicators, specifically, were constructed for a total of eight plants, consisting of two systems per plant. Emphasis was placed on examining relative changes, as well as the indicator's actual level. Both the trend and actual indicator level were found to be important in identifying plants with potential problems

  18. Substandard and Counterfeit Antimicrobials: Recent Trends and ...

    African Journals Online (AJOL)

    ... trends in the availability of substandard and counterfeit antimicrobials in the global market ... Literature search using PubMed and Medline databases and Google search engine was conducted to identify related publications on the subject.

  19. Concepts and trends in healthcare information systems

    CERN Document Server

    Koutsouris, Dionysios-Dimitrios

    2014-01-01

    Concepts and Trends in Healthcare Information Systems covers the latest research topics in the field from leading researchers and practitioners. This book offers theory-driven research that explores the role of Information Systems in the delivery of healthcare in its diverse organizational and regulatory settings. In addition to the embedded role of Information Technology (IT) in clinical and diagnostics equipment, Information Systems are uniquely positioned to capture, store, process, and communicate timely information to decision makers for better coordination of healthcare at both the individual and population levels. For example, data mining and decision support capabilities can identify potential adverse events for an individual patient while also contributing to the population's health by providing insights into the causes of disease complications. Information systems have great potential to reduce healthcare costs and improve outcomes. The healthcare delivery systems share similar characteristics w...

  20. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 ∼ 545 X 414, 256 gray scale) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purpose of a teaching file system. Without high-cost appliances, we could complete the image database system for the teaching file using a personal computer with a relatively inexpensive method

  1. Answering biological questions: Querying a systems biology database for nutrigenomics

    NARCIS (Netherlands)

    Evelo, C.T.; Bochove, K. van; Saito, J.T.

    2011-01-01

    The requirement of systems biology for connecting different levels of biological research leads directly to a need for integrating vast amounts of diverse information in general and of omics data in particular. The nutritional phenotype database addresses this challenge for nutrigenomics. A

  2. YUCSA: A CLIPS expert database system to monitor academic performance

    Science.gov (United States)

    Toptsis, Anestis A.; Ho, Frankie; Leindekar, Milton; Foon, Debra Low; Carbonaro, Mike

    1991-01-01

    The York University CLIPS Student Administrator (YUCSA), an expert database system implemented in C Language Integrated Processing System (CLIPS), for monitoring the academic performance of undergraduate students at York University, is discussed. The expert system component in the system has already been implemented for two major departments, and it is under testing and enhancement for more departments. Also, more elaborate user interfaces are under development. We describe the design and implementation of the system, problems encountered, and immediate future plans. The system has excellent maintainability and it is very efficient, taking less than one minute to complete an assessment of one student.

  3. Study on Mandatory Access Control in a Secure Database Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper proposes a security policy model for mandatory access control in a class B1 database management system whose labeling granularity is the tuple. The relation-hierarchical data model is extended to a multilevel relation-hierarchical data model. Based on this model, the concept of upper-lower layer relational integrity is presented after the covert channels caused by database integrity are analyzed and eliminated. Two SQL statements are extended to process polyinstantiation in the multilevel secure environment. The system is based on the multilevel relation-hierarchical data model and is capable of integratively storing and manipulating multilevel complicated objects (e.g., multilevel spatial data) and multilevel conventional data (e.g., integers, real numbers and character strings).

  4. Development of web database system for JAERI ERL-FEL

    International Nuclear Information System (INIS)

    Kikuzawa, Nobuhiro

    2005-01-01

    The accelerator control system for the JAERI ERL-FEL is a PC-based distributed control system. The accelerator status record is stored automatically through the control system in order to analyze its influence on the electron beam. To handle the large amount of stored data effectively, it is necessary that the required data can be searched and visualized with easy operation. For this reason, a web database (DB) system which can search for the required data and display them visually on a web browser was developed using open source software. With the introduction of this system, accelerator operators can monitor real-time information anytime, anywhere through a web browser. The development of the web DB system is described in this paper. (author)

  5. Development of web database system for JAERI ERL-FEL

    Energy Technology Data Exchange (ETDEWEB)

    Kikuzawa, Nobuhiro [Japan Atomic Energy Research Inst., Kansai Research Establishment, Advanced Photon Research Center, Tokai, Ibaraki (Japan)

    2005-06-01

    The accelerator control system for the JAERI ERL-FEL is a PC-based distributed control system. The accelerator status record is stored automatically through the control system in order to analyze its influence on the electron beam. To handle the large amount of stored data effectively, it is necessary that the required data can be searched and visualized with easy operation. For this reason, a web database (DB) system which can search for the required data and display them visually on a web browser was developed using open source software. With the introduction of this system, accelerator operators can monitor real-time information anytime, anywhere through a web browser. The development of the web DB system is described in this paper. (author)

  6. NoSQL database scaling

    OpenAIRE

    Žardin, Norbert

    2017-01-01

    NoSQL database scaling is a decision where system resources or financial expenses are traded for database performance or other benefits. By scaling a database, database performance and resource usage might increase or decrease; such changes might have a negative impact on an application that uses the database. In this work it is analyzed how database scaling affects database resource usage and performance. As a result, calculations are acquired, using which database scaling types and differe...

  7. Intelligent Control of Micro Grid: A Big Data-Based Control Center

    Science.gov (United States)

    Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng

    2018-01-01

    In this paper, the structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage and load are analyzed through the control center, and from the results new trends are predicted and applied as feedback to optimize the control. Therefore, each step in the micro grid can be adjusted and organized in a form of comprehensive management. A framework of real-time data collection, data processing and data analysis is proposed by employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.

  8. The International Experimental Thermal Hydraulic Systems database – TIETHYS: A new NEA validation tool

    Energy Technology Data Exchange (ETDEWEB)

    Rohatgi, Upendra S.

    2018-07-22

    Nuclear reactor codes require validation with appropriate data representing the plant for specific scenarios. The thermal-hydraulic data are scattered across different locations and in different formats, and some of the data are in danger of being lost. A relational database is being developed to organize the international thermal-hydraulic test data for various reactor concepts and different scenarios. At the reactor system level, the data are organized to include separate effect tests and integral effect tests for specific scenarios and corresponding phenomena. The database relies on the phenomena identification sections of expert-developed PIRTs. The database will provide a summary of appropriate data, a review of facility information, test descriptions, instrumentation, references for the experimental data and some examples of application of the data for validation. The current database platform includes scenarios for PWR, BWR, VVER, and specific benchmarks for CFD modelling data, and is to be expanded to include references for molten salt reactors. There are placeholders for high temperature gas cooled reactors, CANDU and liquid metal reactors. This relational database is called The International Experimental Thermal Hydraulic Systems (TIETHYS) database; it currently resides at the Nuclear Energy Agency (NEA) of the OECD and is freely open to public access. Going forward the database will be extended to include additional links and data as they become available. https://www.oecd-nea.org/tiethysweb/

  9. Report on the present situation of the FY 1998 technical literature database; 1998 nendo gijutsu bunken database nado genjo chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    To study databases which contribute to future scientific and technological information distribution, a survey and analysis of the present status of the service supply side were conducted. In the survey on database trends, the evolving relations between DB producers and distributors were investigated. The results showed an increase in DB producers and an expansion of internet distribution and services, with no change in the U.S.-centered structure. Further, it was recognized that DB services in the internet age are at a turning point, as seen in existing producers' responses to the internet, the on-line provision of primary information sources, the creation of new on-line services, etc. As a consequence of the internet's impact, the following are predicted for future DB services: a slump for producers without strong points and for gateway-type distributors, the appearance of new types of DB service, etc. (NEDO)

  10. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Science.gov (United States)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
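
    The approach described above is concrete enough to sketch: decompose a parameter trend with a wavelet transform, store the detail coefficients in a relational table, and query large coefficients as candidate events. The sketch below uses PyWavelets and SQLite in place of the paper's MySQL setup; the signal, schema and threshold are invented for illustration.

    # Sketch: decompose a parameter trend with a discrete wavelet transform, store
    # the detail coefficients in a relational table, and flag large coefficients as
    # candidate events. PyWavelets + SQLite stand in for the paper's MySQL setup.
    import sqlite3
    import pywt

    signal = [80, 81, 80, 79, 80, 120, 121, 122, 121, 120, 119, 120, 80, 81, 80, 79]
    n_levels = 2
    coeffs = pywt.wavedec(signal, "haar", level=n_levels)   # [cA2, cD2, cD1]

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE wavelet (level INTEGER, idx INTEGER, coeff REAL)")
    for level, band in zip(range(n_levels, 0, -1), coeffs[1:]):   # details: cD2, cD1
        conn.executemany("INSERT INTO wavelet VALUES (?, ?, ?)",
                         [(level, i, float(c)) for i, c in enumerate(band)])

    # "Event" query: large detail coefficients mark abrupt changes in the trend.
    rows = conn.execute(
        "SELECT level, idx, coeff FROM wavelet "
        "WHERE ABS(coeff) > 20 ORDER BY level DESC, idx").fetchall()
    print(rows)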

  11. Reliability of piping system components. Volume 4: The pipe failure event database

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, R; Erixon, S [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Tomic, B [ENCONET Consulting GmbH, Vienna (Austria); Lydell, B [RSA Technologies, Visat, CA (United States)

    1996-07-01

    Available public and proprietary databases on piping system failures were searched for relevant information. Using a relational database to identify groupings of piping failure modes and failure mechanisms, together with insights from published PSAs, the project team determined why, how and where piping systems fail. This report represents a compendium of technical issues important to the analysis of pipe failure events, and statistical estimation of failure rates. Inadequacies of traditional PSA methodology are addressed, with directions for PSA methodology enhancements. A 'data driven and systems oriented' analysis approach is proposed to enable assignment of unique identities to risk-significant piping system component failure. Sufficient operating experience does exist to generate quality data on piping failures. Passive component failures should be addressed by today's PSAs to allow for aging analysis and effective, on-line risk management. 42 refs, 25 figs.

  12. Data-based fault-tolerant control for affine nonlinear systems with actuator faults.

    Science.gov (United States)

    Xie, Chun-Hua; Yang, Guang-Hong

    2016-09-01

    This paper investigates the fault-tolerant control (FTC) problem for unknown nonlinear systems with actuator faults including stuck, outage, bias and loss-of-effectiveness faults. The upper bounds of the stuck faults, bias faults and loss-of-effectiveness faults are unknown. A new data-based FTC scheme is proposed. It consists of online estimation of the bounds and a state-dependent function. The estimates are adjusted online to automatically compensate for the actuator faults. The state-dependent function, solved by using real system data, helps to stabilize the system. Furthermore, all signals in the resulting closed-loop system are uniformly bounded and the states converge asymptotically to zero. Compared with the existing results, the proposed approach is data-based. Finally, two simulation examples are provided to show the effectiveness of the proposed approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Reliability of piping system components. Volume 4: The pipe failure event database

    International Nuclear Information System (INIS)

    Nyman, R.; Erixon, S.; Tomic, B.; Lydell, B.

    1996-07-01

    Available public and proprietary databases on piping system failures were searched for relevant information. Using a relational database to identify groupings of piping failure modes and failure mechanisms, together with insights from published PSAs, the project team determined why, how and where piping systems fail. This report represents a compendium of technical issues important to the analysis of pipe failure events, and statistical estimation of failure rates. Inadequacies of traditional PSA methodology are addressed, with directions for PSA methodology enhancements. A 'data driven and systems oriented' analysis approach is proposed to enable assignment of unique identities to risk-significant piping system component failure. Sufficient operating experience does exist to generate quality data on piping failures. Passive component failures should be addressed by today's PSAs to allow for aging analysis and effective, on-line risk management. 42 refs, 25 figs

  14. Examining Mobile Learning Trends 2003-2008: A Categorical Meta-Trend Analysis Using Text Mining Techniques

    Science.gov (United States)

    Hung, Jui-Long; Zhang, Ke

    2012-01-01

    This study investigated the longitudinal trends of academic articles in Mobile Learning (ML) using text mining techniques. One hundred and nineteen (119) refereed journal articles and proceedings papers from the SCI/SSCI database were retrieved and analyzed. The taxonomies of ML publications were grouped into twelve clusters (topics) and four…

  15. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-02-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable.

  16. An Updating System for the Gridded Population Database of China Based on Remote Sensing, GIS and Spatial Database Technologies

    Science.gov (United States)

    Yang, Xiaohuan; Huang, Yaohuan; Dong, Pinliang; Jiang, Dong; Liu, Honghui

    2009-01-01

    The spatial distribution of population is closely related to land use and land cover (LULC) patterns on both regional and global scales. Population can be redistributed onto geo-referenced square grids according to this relation. In the past decades, various approaches to monitoring LULC using remote sensing and Geographic Information Systems (GIS) have been developed, which makes it possible for efficient updating of geo-referenced population data. A Spatial Population Updating System (SPUS) is developed for updating the gridded population database of China based on remote sensing, GIS and spatial database technologies, with a spatial resolution of 1 km by 1 km. The SPUS can process standard Moderate Resolution Imaging Spectroradiometer (MODIS L1B) data integrated with a Pattern Decomposition Method (PDM) and an LULC-Conversion Model to obtain patterns of land use and land cover, and provide input parameters for a Population Spatialization Model (PSM). The PSM embedded in SPUS is used for generating 1 km by 1 km gridded population data in each population distribution region based on natural and socio-economic variables. Validation results from finer township-level census data of Yishui County suggest that the gridded population database produced by the SPUS is reliable. PMID:22399959
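
    The core of a Population Spatialization Model like the one described in the two records above is a weighted redistribution of census totals onto grid cells. The following is a minimal sketch of that gridding step only, with invented land-use weights; the actual SPUS/PSM uses MODIS-derived land cover and natural and socio-economic variables, none of which are reproduced here.

    # Sketch of the gridding step only: redistribute a census total over 1 km cells
    # in proportion to per-cell weights. The weights are invented for illustration.
    import numpy as np

    county_population = 120_000
    # Hypothetical 4 x 4 block of 1 km cells: higher weight = more built-up land.
    landuse_weight = np.array([
        [0.0, 0.1, 0.4, 0.9],
        [0.0, 0.2, 0.8, 1.0],
        [0.1, 0.3, 0.6, 0.7],
        [0.0, 0.0, 0.2, 0.3],
    ])

    gridded_population = county_population * landuse_weight / landuse_weight.sum()
    assert abs(gridded_population.sum() - county_population) < 1e-6   # total preserved
    print(np.round(gridded_population).astype(int))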

  17. A Preliminary Study on the Multiple Mapping Structure of Classification Systems for Heterogeneous Databases

    OpenAIRE

    Seok-Hyoung Lee; Hwan-Min Kim; Ho-Seop Choe

    2012-01-01

    As science and technology information service portals and heterogeneous databases produced in Korea and other countries are integrated, methods of connecting the unique classification systems applied to each database have been studied. The outputs of technologists' research, such as journal articles, patent specifications, and research reports, are organically related to each other. In this case, if the most basic and meaningful classification systems are not connected, it is difficult to ach...

  18. Current technological trends in development of NPP systems

    International Nuclear Information System (INIS)

    Florescu, Gheorghe; Panaitescu, Valeriu

    2010-01-01

    Recent nuclear research looks for new technologies and continuous progress in finding different and efficient solutions to a sustained and rising energy demand. The trend of increasing energy consumption and the emergence of new and large consumers, especially in Asian countries, requires new means of clean, large-scale and sustained energy production. NPP availability has been continuously monitored and improved; at the same time, the safety of nuclear energy production has been kept under surveillance. The ongoing development of new technologies, the discovery of new materials and the development of efficient technological processes offer opportunities for their appropriate implementation and use in NPP system configuration and operation. New technologies and scientific discoveries, together with international cooperation, offer opportunities to overcome current barriers to advanced energy production, to find new energy sources and to build improved, reliable and safe power plants. Monitoring systems, intelligent sensors and SSCs, nanotechnologies and new/intelligent materials are the main ways to improve NPP system configuration and processes. The paper presents: - the state of the art of currently applied technologies for nuclear power system development; - the technological limits that need to be overcome to improve NPP systems; - the main systems that need improvement and reconfiguration in currently operating NPPs, as well as ways of raising operating efficiency, availability and overall safety; - current energy production issues; - the key arguments for sustaining R and D on new NPP systems; - future trends in NPP development; - the limitations in knowledge and use of industrial processes. Appropriate R and D in the field of NPP systems has specific characteristics that were considered in completing the paper.

  19. Collecting Taxes Database

    Data.gov (United States)

    US Agency for International Development — The Collecting Taxes Database contains performance and structural indicators about national tax systems. The database contains quantitative revenue performance...

  20. Reactor pressure vessel embrittlement management through EPRI-Developed material property databases

    International Nuclear Information System (INIS)

    Rosinski, S.T.; Server, W.L.; Griesbach, T.J.

    1997-01-01

    Uncertainties and variability in U.S. reactor pressure vessel (RPV) material properties have caused the U.S. Nuclear Regulatory Commission (NRC) to request information from all nuclear utilities in order to assess the impact of this data scatter and these uncertainties on compliance with existing regulatory criteria. Resolving the vessel material uncertainty issues requires compiling all available data into a single integrated database to develop a better understanding of irradiated material property behavior. EPRI has developed two comprehensive databases for utility implementation to compile and evaluate available material property and surveillance data. RPVDATA is a comprehensive reactor vessel materials database and data management program that combines data from many different sources into one common database. Searches of the data can be easily performed to identify plants with similar materials, sort through measured test results, compare the 'best estimates' for reported chemistries with licensing basis values, quantify variability in measured weld qualification and test data, identify relevant surveillance results for characterizing embrittlement trends, and resolve uncertainties in vessel material properties. PREP4 has been developed to assist utilities in evaluating existing unirradiated and irradiated data for plant surveillance materials; PREP4 evaluations can be used to assess the accuracy of new trend curve predictions. In addition, searches of the data can be easily performed to identify available Charpy shift and upper shelf data, review surveillance material chemistry and fabrication information, review general capsule irradiation information, and identify applicable source reference information. In support of utility evaluations to consider thermal annealing as a viable embrittlement management option, EPRI is also developing a database to evaluate material response to thermal annealing. Efforts are underway to develop an irradiation

  1. A Database-Based and Web-Based Meta-CASE System

    Science.gov (United States)

    Eessaar, Erki; Sgirka, Rünno

    Each Computer Aided Software Engineering (CASE) system provides support to a software process or to specific tasks or activities that are part of a software process. Each meta-CASE system allows us to create new CASE systems. The creators of a new CASE system have to specify the abstract syntax of the language that is used in the system, as well as the functionality and non-functional properties of the new system. Many meta-CASE systems record their data directly in files. In this paper, we introduce a meta-CASE system whose enabling technology is an object-relational database management system (ORDBMS). The system allows users to manage specifications of languages and create models by using these languages. The system has a web-based, form-based user interface. We have created a proof-of-concept prototype of the system by using the PostgreSQL ORDBMS and the PHP scripting language.

  2. Development of database management system for monitoring of radiation workers for actinides

    International Nuclear Information System (INIS)

    Kalyane, G.N.; Mishra, L.; Nadar, M.Y.; Singh, I.S.; Rao, D.D.

    2012-01-01

    Annually, around 500 radiation workers from various divisions of Bhabha Atomic Research Centre (Trombay) and from the PREFRE and A3F facilities (Tarapur) are monitored for estimation of lung activities and internal dose due to Pu/Am and U in the lung counting laboratory located at the Bhabha Atomic Research Centre hospital, under the Routine and Special monitoring programs. A 20 cm diameter phoswich and an array of HPGe detectors were used for this purpose. In case of positive contamination, workers are followed up and monitored using both detection systems in different geometries. Management of this huge amount of data becomes difficult, and therefore an easily retrievable database system containing all the relevant data of the monitored radiation workers was developed. Materials and methods: The database management system comprises three main modules integrated together: 1) an Apache server installed on a Windows (XP) platform (Apache version 2.2.17), 2) the MySQL database management system (MySQL version 5.5.8), and 3) the PHP (Hypertext Preprocessor) programming language (PHP version 5.3.5). All three modules work together seamlessly as a single software program. The front-end user interaction is through a user-friendly and interactive local web page for which an internet connection is not required. This front page has hyperlinks to many other pages, which have different utilities for the user. The user has to log in using a username and password. Results and Conclusions: The database management system is used for entering, updating and managing the lung monitoring data of radiation workers. The program has the following utilities: bio-data entry of new subjects, editing of bio-data of old subjects (only one subject at a time), entry of counting data of that day's lung monitoring, retrieval of old records based on a number of parameters and filters like date of counting, employee number, division, counts fulfilling a given criterion, etc., and calculation of MEQ CWT (Muscle Equivalent Chest Wall Thickness), energy
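
    The retrieval utilities listed above (filters by date of counting, employee number, division, etc.) boil down to building a filtered query. A minimal sketch of that pattern follows, using SQLite instead of the laboratory's Apache/MySQL/PHP stack; the table and column names are hypothetical.

    # Sketch of filtered record retrieval (by date of counting, employee number or
    # division), with hypothetical table and column names.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE lung_count (
        employee_no TEXT, division TEXT, count_date TEXT, net_counts REAL)""")
    conn.executemany("INSERT INTO lung_count VALUES (?, ?, ?, ?)", [
        ("E1001", "RCD", "2011-03-14", 12.4),
        ("E1002", "FCD", "2011-05-02", 30.8),
    ])

    def retrieve(employee_no=None, division=None, count_date=None):
        """Build a query from whichever filters the operator supplied."""
        sql, params = "SELECT * FROM lung_count WHERE 1=1", []
        for column, value in (("employee_no", employee_no),
                              ("division", division),
                              ("count_date", count_date)):
            if value is not None:
                sql += f" AND {column} = ?"
                params.append(value)
        return conn.execute(sql, params).fetchall()

    print(retrieve(division="FCD"))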

  3. Wind Energy Conversion Systems Technology and Trends

    CERN Document Server

    2012-01-01

    Wind Energy Conversion System covers the technological progress of wind energy conversion systems, along with potential future trends. It includes recently developed wind energy conversion systems such as multi-converter operation of variable-speed wind generators, lightning protection schemes, voltage flicker mitigation and prediction schemes for advanced control of wind generators. Modeling and control strategies of variable speed wind generators are discussed, together with the frequency converter topologies suitable for grid integration. Wind Energy Conversion System also describes offshore farm technologies including multi-terminal topology and space-based wind observation schemes, as well as both AC and DC based wind farm topologies. The stability and reliability of wind farms are discussed, and grid integration issues are examined in the context of the most recent industry guidelines. Wind power smoothing, one of the big challenges for transmission system operators, is a particular focus. Fault ride th...

  4. Intelligent software systems and SMiRT: Potentials, actual results, expectations, trends

    International Nuclear Information System (INIS)

    Jovanovic, A.

    1993-01-01

    The paper gives a survey of recent development trends in the areas of knowledge-based systems, hypermedia, neural networks and other similar technologies which form the basis of modern 'intelligent' software systems, applied in areas relevant to SMiRT: power plant operation, design and analysis of structural components, materials, and many others. The paper highlights the historical background of these trends, as well as the methodologies and technologies which made such development possible ('enabling methodologies/technologies'). Examples from several application areas characteristic of SMiRT are mentioned in order to illustrate what the deployment of an intelligent software system can mean in practice. Finally, a summary of these results is made and future perspectives are indicated. (author)

  5. Development of an integrated database management system to evaluate integrity of flawed components of nuclear power plant

    International Nuclear Information System (INIS)

    Mun, H. L.; Choi, S. N.; Jang, K. S.; Hong, S. Y.; Choi, J. B.; Kim, Y. J.

    2001-01-01

    The objective of this paper is to develop an NPP-IDBMS (Integrated DataBase Management System for Nuclear Power Plants) for evaluating the integrity of components of nuclear power plants using a relational data model. This paper describes the relational data model, structure and development strategy for the proposed NPP-IDBMS. The NPP-IDBMS consists of a database, a database management system and an interface part. The database part consists of plant, shape, operating condition, material properties and stress databases, which are required for the integrity evaluation of each component in nuclear power plants. For the development of the stress database, an extensive finite element analysis was performed for various components considering operational transients. The developed NPP-IDBMS will provide an efficient and accurate way to evaluate the integrity of flawed components

  6. Trends in global acupuncture publications: An analysis of the Web of Science database from 1988 to 2015.

    Science.gov (United States)

    Kung, Yen-Ying; Hwang, Shinn-Jang; Li, Tsai-Feng; Ko, Seong-Gyu; Huang, Ching-Wen; Chen, Fang-Pey

    2017-08-01

    Acupuncture is a rapidly growing medical specialty worldwide. This study aimed to analyze acupuncture publications from 1988 to 2015 by using the Web of Science (WoS) database. Familiarity with the trend of acupuncture publications will facilitate a better understanding of existing academic research in acupuncture and its applications. Academic articles focusing on acupuncture were retrieved and analyzed from the WoS database, which included articles published in Science Citation Index-Expanded and Social Science Citation Index journals from 1988 to 2015. A total of 7450 articles were published in the field of acupuncture during the period 1988-2015. Annual article publications increased from 109 in 1988 to 670 in 2015. The People's Republic of China (2076 articles, 27.9%), the USA (1638 articles, 22.0%) and South Korea (707 articles, 9.5%) were the most prolific countries. According to the WoS subject categories, 2591 articles (34.8%) were published in the category of Integrative and Complementary Medicine, followed by Neurosciences (1147 articles, 15.4%) and General Internal Medicine (918 articles, 12.3%). Kyung Hee University (South Korea) is the most prolific source organization of acupuncture publications (365 articles, 4.9%). The fields within acupuncture with the most cited articles included mechanisms, clinical trials, epidemiology, and new research methods of acupuncture. Publications associated with acupuncture increased rapidly from 1988 to 2015. The different applications of acupuncture were extensive across multiple fields of medicine. It is important to maintain and even nourish a certain quantity and quality of published acupuncture papers, which can play an important role in developing a medical discipline for acupuncture. Copyright © 2017. Published by Elsevier Taiwan LLC.

  7. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    International Nuclear Information System (INIS)

    Shao, Weber; Kupelian, Patrick A; Wang, Jason; Low, Daniel A; Ruan, Dan

    2014-01-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.

  8. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    Science.gov (United States)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
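
    Once contours live in PostGIS geometry columns, secondary geometric quantities can be computed directly in SQL rather than by re-parsing DICOM-RT files. The sketch below illustrates that idea with a hypothetical rt_contour(structure_name, slice_z, geom) table and placeholder connection details; it is not the authors' actual schema.

    # Sketch of a secondary geometric query against contours stored as PostGIS
    # geometries: per-structure contour area computed in SQL. Assumes a PostGIS-enabled
    # PostgreSQL database; the table, columns and connection string are placeholders.
    import psycopg2

    conn = psycopg2.connect("dbname=rtdb user=rtuser password=secret host=localhost")
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT structure_name,
                   SUM(ST_Area(geom)) AS total_contour_area,
                   COUNT(*)           AS n_contours
            FROM   rt_contour
            WHERE  structure_name = %s
            GROUP  BY structure_name
        """, ("PTV",))
        for name, area, n_contours in cur.fetchall():
            print(name, area, n_contours)
    conn.close()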

  9. Traditional Medicine Collection Tracking System (TM-CTS): a database for ethnobotanically driven drug-discovery programs.

    Science.gov (United States)

    Harris, Eric S J; Erickson, Sean D; Tolopko, Andrew N; Cao, Shugeng; Craycroft, Jane A; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E; Eisenberg, David M

    2011-05-17

    Ethnobotanically driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically driven natural product collection and drug-discovery programs. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  10. Traditional Medicine Collection Tracking System (TM-CTS): A Database for Ethnobotanically-Driven Drug-Discovery Programs

    Science.gov (United States)

    Harris, Eric S. J.; Erickson, Sean D.; Tolopko, Andrew N.; Cao, Shugeng; Craycroft, Jane A.; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E.; Eisenberg, David M.

    2011-01-01

    Aim of the study. Ethnobotanically-driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine-Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. Materials and Methods. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. Results. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. Conclusions. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically-driven natural product collection and drug-discovery programs. PMID:21420479

  11. An integrated data-analysis and database system for AMS 14C

    International Nuclear Information System (INIS)

    Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan

    2010-01-01

    AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  12. The UCSD HIRES/Keck I Damped Lyα Abundance Database. II. The Implications

    Science.gov (United States)

    Prochaska, Jason X.; Wolfe, Arthur M.

    2002-02-01

    We present a comprehensive analysis of the damped Lyα (DLA) abundance database presented in the first paper of this series. This database provides a homogeneous set of abundance measurements for many elements including Si, Cr, Ni, Zn, Fe, Al, S, Co, O, and Ar from 38 DLA systems with zabs>1.5. With little exception, these DLA systems exhibit very similar relative abundances. There is no significant correlation in X/Fe with [Fe/H] metallicity, and the dispersion in X/Fe is small at all metallicity. We search the database for trends indicative of dust depletion and in a few cases find strong evidence. Specifically, we identify a correlation between [Si/Ti] and [Zn/Fe] which is unambiguous evidence for depletion. Following Hou and colleagues, we present [X/Si] abundances against [Si/H]+logN(HI) and note trends of decreasing X/Si with increasing [Si/H]+logN(HI) which argue for dust depletion. Similarly, comparisons of [Si/Fe] and [Si/Cr] against [Si/H] indicate significant depletion at [Si/H]>-1 but suggest essentially dust-free damped systems at [Si/H]0.25 dex as [Zn/Fe]-->0 and that the [Si/Fe] values exhibit a plateau of ~0.3 dex at [Si/H]good agreement with our previous work, but we emphasize two differences: (1) the unweighted and N(H I)-weighted [Fe/H] mean metallicities now have similar values at all epochs except z>3.5, where small number statistics dominate the N(H I)-weighted mean; and (2) there is no evolution in the mean [Fe/H] metallicity from z=1.7 to 3.5 but possibly a marked drop at higher redshift. We conclude with a general discussion on the physical nature of the DLA systems. We stress the uniformity of the DLA chemical abundances which indicates that the protogalaxies identified with DLA systems have very similar enrichment histories, i.e., a nearly constant relative contribution from Type Ia and Type II supernovae. The DLA systems also show constant relative abundances within a given system, which places strict constraints on the mixing timescales

  13. Current trends in nursing theories.

    Science.gov (United States)

    Im, Eun-Ok; Chang, Sun Ju

    2012-06-01

    To explore current trends in nursing theories through an integrated literature review. The literature related to nursing theories during the past 10 years was searched through multiple databases and reviewed to determine themes reflecting current trends in nursing theories. The trends can be categorized into six themes: (a) foci on specifics; (b) coexistence of various types of theories; (c) close links to research; (d) international collaborative works; (e) integration into practice; and (f) selective evolution. We need to continue our efforts to link research and practice to theories, to identify the specifics of our theories, to develop diverse types of theories, and to conduct international collaborative work. Our paper has implications for future theoretical development in diverse clinical areas of nursing research and practice. © 2012 Sigma Theta Tau International.

  14. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  15. Development and Field Test of a Real-Time Database in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-03-01

    Recently, a distribution management system (DMS) that can conduct periodic system analysis and control by mounting various application programs has been actively developed. In this paper, we summarize the development and demonstration of a database structure that can support real-time system analysis and control in the Korean smart distribution management system (KSDMS). The developed database structure consists of a common information model (CIM)-based off-line database (DB), a physical DB (PDB) for DB establishment on the operating server, a real-time DB (RTDB) for real-time server operation and remote terminal unit data interconnection, and an application common model (ACM) DB for running application programs. The ACM DB for real-time system analysis and control by the application programs was developed by using a parallel table structure and a link list model, thereby providing fast input and output as well as high execution speed of application programs. Furthermore, the ACM DB was configured with hierarchical and non-hierarchical data models to reflect the system models while improving DB size and operation speed by reducing the system to only those elements needed for analysis and control. The proposed database model was implemented and tested at the Gochaing and Jeju offices using a real system. Through data measurement from the remote terminal units, and through the operation and control of the application programs using those measurements, the performance, speed, and integrity of the proposed database model were validated, thereby demonstrating that this model can be applied to real systems.
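
    The "link list model" mentioned above, in which network elements hold direct references to their neighbours so that applications can traverse a feeder without table joins, can be illustrated with a tiny sketch; the classes, element names and topology below are invented and do not reproduce the actual KSDMS data model.

    # Tiny sketch of a "link list" style model: each network element keeps direct
    # references to its downstream neighbours, plus a flat index for O(1) lookup.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        kind: str                                       # e.g. "breaker", "section", "load"
        measurements: dict = field(default_factory=dict)
        children: list = field(default_factory=list)    # downstream links

    index = {}                                          # parallel flat index by name

    def add(parent, name, kind):
        node = Node(name, kind)
        index[name] = node
        if parent is not None:
            parent.children.append(node)
        return node

    root = add(None, "CB-1", "breaker")
    sec1 = add(root, "SEC-1", "section")
    add(sec1, "LOAD-1", "load")

    def downstream(name):
        """Collect every element fed from `name` by following the links."""
        stack, found = [index[name]], []
        while stack:
            node = stack.pop()
            found.append(node.name)
            stack.extend(node.children)
        return found

    print(downstream("CB-1"))                           # ['CB-1', 'SEC-1', 'LOAD-1']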

  16. Analysis of Cloud-Based Database Systems

    Science.gov (United States)

    2015-06-01

    deploying the VM, we installed SQL Server 2014 relational database management software (RDBMS) and restored a copy of the PYTHON database onto the server ...management views within SQL Server, we retrieved lists of the most commonly executed queries, the percentage of reads versus writes, as well as...Monitor. This gave us data regarding resource utilization and queueing. The second tool we used was the SQL Server Profiler provided by Microsoft
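
    The dynamic management views mentioned in this excerpt can be queried directly; the sketch below shows one such query (the most frequently executed statements with their logical read/write counts) issued from Python via pyodbc. It assumes a reachable SQL Server instance, the pyodbc driver and VIEW SERVER STATE permission, and the connection string is a placeholder, not taken from the study.

    # Sketch of one dynamic-management-view query: top statements by execution count
    # with their logical read/write volumes. Connection details are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=master;Trusted_Connection=yes;")
    cursor = conn.cursor()
    cursor.execute("""
        SELECT TOP 10
               qs.execution_count,
               qs.total_logical_reads,
               qs.total_logical_writes,
               SUBSTRING(st.text, 1, 200) AS query_text
        FROM   sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER  BY qs.execution_count DESC;
    """)
    for row in cursor.fetchall():
        print(row.execution_count, row.total_logical_reads,
              row.total_logical_writes, row.query_text)
    conn.close()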

  17. Tradeoffs in distributed databases

    OpenAIRE

    Juntunen, R. (Risto)

    2016-01-01

    Abstract In a distributed database, data is spread throughout the network into separate nodes with different DBMS systems (Date, 2000). According to the CAP theorem, the three database properties of consistency, availability and partition tolerance cannot be achieved simultaneously in distributed database systems. Two of these properties can be achieved, but not all three at the same time (Brewer, 2000). Since this theorem there has b...

  18. Field validation of food service listings: a comparison of commercial and online geographic information system databases.

    Science.gov (United States)

    Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian

    2012-08-01

    Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate the information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database; however, when matching criteria were more conservative, there were no observed differences in error between the databases.
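
    The positional-accuracy comparison described above amounts to computing the distance between each database location and its GPS-measured location and summarizing the error distribution. A minimal sketch follows; the coordinates are invented and none of the study's data are reproduced.

    # Sketch of the positional-accuracy check: great-circle distance between each
    # database location and its GPS-measured location, then a summary statistic.
    import math
    import statistics

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two WGS-84 points."""
        r = 6_371_000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    pairs = [   # (db_lat, db_lon, gps_lat, gps_lon) for hypothetical food service places
        (44.2312, -76.4860, 44.2313, -76.4862),
        (44.2255, -76.4951, 44.2259, -76.4940),
        (44.2301, -76.5012, 44.2302, -76.5010),
    ]
    errors = [haversine_m(*p) for p in pairs]
    print([round(e, 1) for e in errors],
          "median error (m):", round(statistics.median(errors), 1))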

  19. Heterogeneous distributed databases: A case study

    Science.gov (United States)

    Stewart, Tracy R.; Mukkamala, Ravi

    1991-01-01

    Alternatives are reviewed for accessing distributed heterogeneous databases and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX 11/780 and has been implemented using VAX DBMS, a CODASYL-based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy. Different customer bases are supported by each database. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.

  20. A Tactical Database for the Low Cost Combat Direction System

    Science.gov (United States)

    1990-12-01

    A Tactical Database for the Low Cost Combat Direction System by Everton G. de Paula, Captain, Brazilian Air Force, B.S., Instituto Tecnologico de...objects as a unit. The AVANCE object management system [Ref. 29] uses the timestamp model (pessimistic approach) for concurrency control. The Vbase...are no longer used). In AVANCE [Ref. 29], garbage collection is performed on user request. In GemStone [Ref. 25], garbage collection is executed in

  1. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
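
    A one-stop search like the one described here typically fans a single term out over several category tables and returns a two-tier result: per-category hit counts first, expandable details second. The sketch below illustrates that pattern with an invented SQLite schema and sample rows; IRMIS itself uses a much larger relational schema and an AJAX front end, neither of which is reproduced.

    # Sketch of a "global search" across several control-system categories, returning
    # a two-tier result: per-category hit counts, then the matching rows.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE ioc       (name TEXT, description TEXT);
        CREATE TABLE pv        (name TEXT, description TEXT);
        CREATE TABLE component (name TEXT, description TEXT);
        INSERT INTO ioc       VALUES ('iocbpm01', 'beam position monitor IOC');
        INSERT INTO pv        VALUES ('S01:BPM:X', 'horizontal beam position');
        INSERT INTO component VALUES ('BPM-electronics', 'beam position monitor chassis');
    """)

    def global_search(term):
        summary, details = {}, {}
        for table in ("ioc", "pv", "component"):
            rows = conn.execute(
                f"SELECT name, description FROM {table} "
                "WHERE name LIKE ? OR description LIKE ?",
                (f"%{term}%", f"%{term}%")).fetchall()
            summary[table] = len(rows)      # tier 1: counts shown first
            details[table] = rows           # tier 2: expanded on demand
        return summary, details

    print(global_search("beam position"))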

  2. National Carbon Sequestration Database and Geographic Information System (NatCarb)

    Energy Technology Data Exchange (ETDEWEB)

    Kenneth Nelson; Timothy Carr

    2009-03-31

    This annual and final report describes the results of the multi-year project entitled 'NATional CARBon Sequestration Database and Geographic Information System (NatCarb)' (http://www.natcarb.org). The original project assembled a consortium of five states (Indiana, Illinois, Kansas, Kentucky and Ohio) in the midcontinent of the United States (MIDCARB) to construct an online distributed Relational Database Management System (RDBMS) and Geographic Information System (GIS) covering aspects of carbon dioxide (CO{sub 2}) geologic sequestration. The NatCarb system built on the technology developed in the initial MIDCARB effort. The NatCarb project linked the GIS information of the Regional Carbon Sequestration Partnerships (RCSPs) into a coordinated regional database system consisting of datasets useful to industry, regulators and the public. The project includes access to national databases and GIS layers maintained by the NatCarb group (e.g., brine geochemistry) and publicly accessible servers (e.g., USGS, and Geography Network) into a single system where data are maintained and enhanced at the local level, but are accessed and assembled through a single Web portal to facilitate query, assembly, analysis and display. This project improves the flow of data across servers and increases the amount and quality of available digital data. The purpose of NatCarb is to provide a national view of the carbon capture and storage potential in the U.S. and Canada. The digital spatial database allows users to estimate the amount of CO{sub 2} emitted by sources (such as power plants, refineries and other fossil-fuel-consuming industries) in relation to geologic formations that can provide safe, secure storage sites over long periods of time. The NatCarb project worked to provide all stakeholders with improved online tools for the display and analysis of CO{sub 2} carbon capture and storage data through a single website portal (http://www.natcarb.org/). While the external

  3. RODOS database adapter

    International Nuclear Information System (INIS)

    Xie Gang

    1995-11-01

    Integrated data management is an essential aspect of many automatic information systems such as RODOS, a real-time on-line decision support system for nuclear emergency management. In particular, the application software must provide access management to different commercial database systems. This report presents the tools necessary for adapting embedded SQL applications to both HP-ALLBASE/SQL and CA-Ingres/SQL databases. The design of the database adapter and the concept of the RODOS embedded SQL syntax are discussed by considering some of the most important features of SQL functions and identifying significant differences between SQL implementations. Finally, the developed software as well as the administrator's and installation guides are described. (orig.)

  4. Analysis/design of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, I.; Lyu, W. S.

    2001-01-01

    Constructing a database from the data produced in tensile experiments can increase the application of test results. The basic data can also easily be obtained from the database when a new experiment is prepared, and high-quality results can be produced by comparison with previous data. To construct the database, the analysis and design must be made more specific, so that the best quality can be offered for customers' various requirements. In this thesis, the analysis and design were performed to develop the database for tensile extension properties.

  5. Nonmaterialized Relations and the Support of Information Retrieval Applications by Relational Database Systems.

    Science.gov (United States)

    Lynch, Clifford A.

    1991-01-01

    Describes several aspects of the problem of supporting information retrieval system query requirements in the relational database management system (RDBMS) environment and proposes an extension to query processing called nonmaterialized relations. User interactions with information retrieval systems are discussed, and nonmaterialized relations are…

  6. Evolution of the use of relational and NoSQL databases in the ATLAS experiment

    CERN Document Server

    Barberis, Dario; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment used for many years a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, run and event metadata. The rapid development of “NoSQL” databases (structured storage services) in the last five years allowed an extended and complementary usage of traditional relational databases and new structured storage tools in order to improve the performance of existing applications and to extend their functionalities using the possibilities offered by the modern storage systems. The trend is towards using the best tool for each kind of data, separating for example the intrinsically relational metadata from payload storage, and records that are frequently updated and benefit from transactions from archived information. Access to all components has to...

  7. Evolution of the use of relational and NoSQL databases in the ATLAS experiment

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00064378; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment used for many years a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, run and event metadata. The rapid development of “NoSQL” databases (structured storage services) in the last five years allowed an extended and complementary usage of traditional relational databases and new structured storage tools in order to improve the performance of existing applications and to extend their functionalities using the possibilities offered by the modern storage systems. The trend is towards using the best tool for each kind of data, separating for example the intrinsically relational metadata from payload storage, and records that are frequently updated and benefit from transactions from archived information. Access to all components has to...

  8. Development of a database system for the management of non-treated radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso, E-mail: ajp@cdtn.br, E-mail: cbf@cdtn.br, E-mail: vc@cdtn.br, E-mail: pos@cdtn.br, E-mail: seless@cdtn.br, E-mail: hauczmj@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. The information about the waste is registered, and the data must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, to achieve this goal, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of the solid and liquid radioactive wastes stored at CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate the retrieval of the waste history and location and the contents of the waste packages. The database will also be useful for grouping waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to changes of operators will be avoided. (author)

  9. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.

    Science.gov (United States)

    May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk

    2009-05-04

    The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.

  10. Development of a database system for the management of non-treated radioactive waste

    International Nuclear Information System (INIS)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso

    2017-01-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. The information about the waste is registered, and the data must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, to achieve this goal, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of the solid and liquid radioactive wastes stored at CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate the retrieval of the waste history and location and the contents of the waste packages. The database will also be useful for grouping waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to changes of operators will be avoided. (author)
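
    The grouping of waste with similar characteristics described in the two records above maps naturally onto an aggregation query. A minimal sketch follows, using SQLite in place of the Access database; the schema and rows are hypothetical.

    # Sketch: group waste packages with similar characteristics (physical form and
    # radionuclide) to help choose a treatment route. Schema and rows are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE waste_package (
        package_id TEXT, physical_form TEXT, radionuclide TEXT,
        activity_mbq REAL, location TEXT)""")
    conn.executemany("INSERT INTO waste_package VALUES (?, ?, ?, ?, ?)", [
        ("PKG-001", "solid",  "Cs-137", 12.0, "bay A"),
        ("PKG-002", "solid",  "Cs-137",  3.5, "bay A"),
        ("PKG-003", "liquid", "H-3",    40.0, "bay B"),
    ])

    for row in conn.execute("""
            SELECT physical_form, radionuclide,
                   COUNT(*)          AS n_packages,
                   SUM(activity_mbq) AS total_activity_mbq
            FROM   waste_package
            GROUP  BY physical_form, radionuclide
            ORDER  BY physical_form, radionuclide"""):
        print(row)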

  11. Database Foundation For The Configuration Management Of The CERN Accelerator Controls Systems

    CERN Document Server

    Zaharieva, Z; Peryt, M

    2011-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years in order to become nowadays the basis for the Configuration Management of the Controls System for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the Controls System. The configuration items are quite heterogeneous, depicting different areas of the Controls System – ranging from 3000 Front-End Computers, 75 000 software devices allowing remote control of the accelerators, to valid states of the Accelerators Timing System. The article will describe the different areas of the CCDB, their interdependencies and the challenges to establish the data model for such a diverse configuration management database, serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering of change management processes as well as providing status accounting and aud...

  12. MetaMetaDB: a database and analytic system for investigating microbial habitability.

    Directory of Open Access Journals (Sweden)

    Ching-chia Yang

    MetaMetaDB (http://mmdb.aori.u-tokyo.ac.jp/) is a database and analytic system for investigating microbial habitability, i.e., how a prokaryotic group can inhabit different environments. The interaction between prokaryotes and the environment is a key issue in microbiology because distinct prokaryotic communities maintain distinct ecosystems. Because 16S ribosomal RNA (rRNA) sequences play pivotal roles in identifying prokaryotic species, a system that comprehensively links diverse environments to 16S rRNA sequences of the inhabitant prokaryotes is necessary for the systematic understanding of the microbial habitability. However, existing databases are biased to culturable prokaryotes and exhibit limitations in the comprehensiveness of the data because most prokaryotes are unculturable. Recently, metagenomic and 16S rRNA amplicon sequencing approaches have generated abundant 16S rRNA sequence data that encompass unculturable prokaryotes across diverse environments; however, these data are usually buried in large databases and are difficult to access. In this study, we developed MetaMetaDB (Meta-Metagenomic DataBase), which comprehensively and compactly covers 16S rRNA sequences retrieved from public datasets. Using MetaMetaDB, users can quickly generate hypotheses regarding the types of environments a prokaryotic group may be adapted to. We anticipate that MetaMetaDB will improve our understanding of the diversity and evolution of prokaryotes.

  13. MetaMetaDB: a database and analytic system for investigating microbial habitability.

    Science.gov (United States)

    Yang, Ching-chia; Iwasaki, Wataru

    2014-01-01

    MetaMetaDB (http://mmdb.aori.u-tokyo.ac.jp/) is a database and analytic system for investigating microbial habitability, i.e., how a prokaryotic group can inhabit different environments. The interaction between prokaryotes and the environment is a key issue in microbiology because distinct prokaryotic communities maintain distinct ecosystems. Because 16S ribosomal RNA (rRNA) sequences play pivotal roles in identifying prokaryotic species, a system that comprehensively links diverse environments to 16S rRNA sequences of the inhabitant prokaryotes is necessary for the systematic understanding of the microbial habitability. However, existing databases are biased to culturable prokaryotes and exhibit limitations in the comprehensiveness of the data because most prokaryotes are unculturable. Recently, metagenomic and 16S rRNA amplicon sequencing approaches have generated abundant 16S rRNA sequence data that encompass unculturable prokaryotes across diverse environments; however, these data are usually buried in large databases and are difficult to access. In this study, we developed MetaMetaDB (Meta-Metagenomic DataBase), which comprehensively and compactly covers 16S rRNA sequences retrieved from public datasets. Using MetaMetaDB, users can quickly generate hypotheses regarding the types of environments a prokaryotic group may be adapted to. We anticipate that MetaMetaDB will improve our understanding of the diversity and evolution of prokaryotes.

  14. Trends in Antipsychotic Drug Use by Very Young, Privately Insured Children

    Science.gov (United States)

    Olfson, Mark; Crystal, Stephen; Huang, Cecilia; Gerhard, Tobias

    2010-01-01

    Objective: This study describes recent trends and patterns in antipsychotic treatment of privately insured children aged 2 through 5 years. Method: A trend analysis is presented of antipsychotic medication use (1999-2001 versus 2007) stratified by patient characteristics. Data are analyzed from a large administrative database of privately insured…

  15. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  16. Nuclear power plant control room crew task analysis database: SEEK system. Users manual

    International Nuclear Information System (INIS)

    Burgy, D.; Schroeder, L.

    1984-05-01

    The Crew Task Analysis SEEK Users Manual was prepared for the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission. It is designed for use with the existing computerized Control Room Crew Task Analysis Database. The SEEK system consists of a PR1ME computer with its associated peripherals and software augmented by General Physics Corporation SEEK database management software. The SEEK software programs provide the Crew Task Database user with rapid access to any number of records desired. The software uses English-like sentences to allow the user to construct logical sorts and outputs of the task data. Given the multiple-associative nature of the database, users can directly access the data at the plant, operating sequence, task or element level - or any combination of these levels. A complete description of the crew task data contained in the database is presented in NUREG/CR-3371, Task Analysis of Nuclear Power Plant Control Room Crews (Volumes 1 and 2)

  17. Evolution of the use of relational and NoSQL databases in the ATLAS experiment

    Science.gov (United States)

    Barberis, D.

    2016-09-01

    The ATLAS experiment used for many years a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, run and event metadata. The rapid development of "NoSQL" databases (structured storage services) in the last five years allowed an extended and complementary usage of traditional relational databases and new structured storage tools in order to improve the performance of existing applications and to extend their functionalities using the possibilities offered by the modern storage systems. The trend is towards using the best tool for each kind of data, separating for example the intrinsically relational metadata from payload storage, and records that are frequently updated and benefit from transactions from archived information. Access to all components has to be orchestrated by specialised services that run on front-end machines and shield the user from the complexity of data storage infrastructure. This paper describes this technology evolution in the ATLAS database infrastructure and presents a few examples of large database applications that benefit from it.
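
    The pattern described above, keeping intrinsically relational metadata in an RDBMS while archiving bulky or rarely-updated payloads in a structured (key-value) store behind a front-end service, can be illustrated with a minimal sketch. The table, field and key names below, and the in-memory dict standing in for the NoSQL store, are illustrative assumptions, not ATLAS code.

```python
# Minimal sketch of the "best tool for each kind of data" pattern:
# relational metadata in SQLite, bulky payloads in a key-value store.
# Names and the dict-based store are illustrative assumptions.
import json
import sqlite3

relational = sqlite3.connect(":memory:")          # metadata: small, transactional
relational.execute(
    "CREATE TABLE run_metadata (run_id INTEGER PRIMARY KEY, detector TEXT, payload_key TEXT)"
)

payload_store = {}                                # stand-in for a NoSQL/structured store

def record_run(run_id, detector, conditions):
    """Store the relational part and the bulky payload separately."""
    key = f"conditions/{run_id}"
    payload_store[key] = json.dumps(conditions)   # archived, rarely updated
    relational.execute(
        "INSERT INTO run_metadata VALUES (?, ?, ?)", (run_id, detector, key)
    )
    relational.commit()

def load_conditions(run_id):
    """Front-end service orchestrating both back ends for the client."""
    row = relational.execute(
        "SELECT payload_key FROM run_metadata WHERE run_id = ?", (run_id,)
    ).fetchone()
    return json.loads(payload_store[row[0]]) if row else None

record_run(42, "pixel", {"hv": 150.0, "temperature_c": -10.0})
print(load_conditions(42))
```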

  18. Database Systems and Oracle: Experiences and Lessons Learned

    Science.gov (United States)

    Dunn, Deborah

    2005-01-01

    In a tight job market, IT professionals with database experience are likely to be in great demand. Companies need database personnel who can help improve access to and security of data. The events of September 11 have increased business' awareness of the need for database security, backup, and recovery procedures. It is our responsibility to…

  19. Pneumonia mortality trends in all Brazilian geographical regions between 1996 and 2012

    Directory of Open Access Journals (Sweden)

    Rosemeire de Olanda Ferraz

    Full Text Available ABSTRACT Objective: To analyze the temporal trends in pneumonia mortality rates (standardized by age, using the 2010 population of Brazil as the standard) in all Brazilian geographical regions between 1996 and 2012. Methods: This was an ecological time-series study examining secondary data from the Mortality Database maintained by the Information Technology Department of the Brazilian Unified Health Care System. Polynomial and joinpoint regression models, and corresponding 95% CIs, were used for trend analysis. Results: The pneumonia mortality rates in the South, Southeast, and Central-West showed a decreasing behavior until 2000, followed by increases, whereas, in the North and Northeast, they showed increasing trends virtually throughout the period studied. There was variation in annual percent change in pneumonia mortality rates in all regions except the North. The Central-West had the greatest decrease in annual percent change between 1996 and 2000, followed by an increase of the same magnitude until 2005. The 80 years and over age group was the one most influencing the trend behavior of pneumonia mortality rates in all regions. Conclusions: In general, pneumonia mortality trends reversed, with an important increase occurring in the years after 2000.
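
    The annual percent change reported by joinpoint analyses such as the one above is conventionally obtained from the slope of a log-linear regression of the rate on calendar year. The snippet below shows that calculation on made-up rates; it illustrates the standard formula, not the study's own model or data.

```python
# Illustrative computation of annual percent change (APC) from a log-linear trend.
# The rates below are invented, not data from the study.
import math

years = [1996, 1997, 1998, 1999, 2000]
rates = [30.0, 28.5, 27.4, 26.0, 24.8]          # hypothetical deaths per 100,000

# Ordinary least squares slope of ln(rate) on year
n = len(years)
x_mean = sum(years) / n
y = [math.log(r) for r in rates]
y_mean = sum(y) / n
slope = sum((x - x_mean) * (yi - y_mean) for x, yi in zip(years, y)) / \
        sum((x - x_mean) ** 2 for x in years)

apc = (math.exp(slope) - 1.0) * 100.0           # percent change per year
print(f"APC = {apc:.1f}% per year")             # negative value = decreasing trend
```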

  20. The use of intelligent database systems in acute pancreatitis--a systematic review.

    Science.gov (United States)

    van den Heever, Marc; Mittal, Anubhav; Haydock, Matthew; Windsor, John

    2014-01-01

    Acute pancreatitis (AP) is a complex disease with multiple aetiological factors, wide ranging severity, and multiple challenges to effective triage and management. Databases, data mining and machine learning algorithms (MLAs), including artificial neural networks (ANNs), may assist by storing and interpreting data from multiple sources, potentially improving clinical decision-making. 1) Identify database technologies used to store AP data, 2) collate and categorise variables stored in AP databases, 3) identify the MLA technologies, including ANNs, used to analyse AP data, and 4) identify clinical and non-clinical benefits and obstacles in establishing a national or international AP database. Comprehensive systematic search of online reference databases. The predetermined inclusion criteria were all papers discussing 1) databases, 2) data mining or 3) MLAs, pertaining to AP, independently assessed by two reviewers with conflicts resolved by a third author. Forty-three papers were included. Three data mining technologies and five ANN methodologies were reported in the literature. There were 187 collected variables identified. ANNs increase accuracy of severity prediction, one study showed ANNs had a sensitivity of 0.89 and specificity of 0.96 six hours after admission--compare APACHE II (cutoff score ≥8) with 0.80 and 0.85 respectively. Problems with databases were incomplete data, lack of clinical data, diagnostic reliability and missing clinical data. This is the first systematic review examining the use of databases, MLAs and ANNs in the management of AP. The clinical benefits these technologies have over current systems and other advantages to adopting them are identified. Copyright © 2013 IAP and EPC. Published by Elsevier B.V. All rights reserved.
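
    To make the reported accuracy figures concrete, the short sketch below computes sensitivity and specificity from a confusion matrix; the counts are invented for illustration and are not taken from any of the reviewed studies.

```python
# Sensitivity/specificity as compared in the review above (ANN vs APACHE II),
# computed from a confusion matrix. The counts are invented for illustration only.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # proportion of severe cases correctly flagged
    specificity = tn / (tn + fp)   # proportion of mild cases correctly cleared
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=89, fn=11, tn=96, fp=4)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```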

  1. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T W; Sutton, M

    2011-09-19

    Geochemical modeling codes use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 °C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).
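
    As a small illustration of the saturation indices mentioned above, the sketch below evaluates SI = log10(IAP/Ksp) for calcite, approximating activities by concentrations (i.e., ignoring the activity coefficients that real geochemical codes and their supporting databases handle). The numerical inputs are illustrative.

```python
# Minimal saturation-index sketch: SI = log10(IAP / Ksp).
# Activities are approximated by molalities here (activity coefficients ignored);
# real geochemical modeling codes do much more. Input values are illustrative.
import math

def saturation_index(ion_activities, stoichiometry, log_ksp):
    """SI > 0: supersaturated; SI < 0: undersaturated; SI = 0: equilibrium."""
    log_iap = sum(nu * math.log10(ion_activities[ion])
                  for ion, nu in stoichiometry.items())
    return log_iap - log_ksp

# Calcite, CaCO3 <-> Ca2+ + CO3^2-, log Ksp ~ -8.48 at 25 degC
si = saturation_index(
    ion_activities={"Ca2+": 1.2e-3, "CO3^2-": 8.0e-6},
    stoichiometry={"Ca2+": 1, "CO3^2-": 1},
    log_ksp=-8.48,
)
print(f"SI(calcite) = {si:.2f}")
```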

  2. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    International Nuclear Information System (INIS)

    Wolery, T.W.; Sutton, M.

    2011-01-01

    Geochemical modeling codes use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 °C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).

  3. High-precision positioning system of four-quadrant detector based on the database query

    Science.gov (United States)

    Zhang, Xin; Deng, Xiao-guo; Su, Xiu-qin; Zheng, Xiao-qiang

    2015-02-01

    The fine pointing mechanism of the Acquisition, Pointing and Tracking (APT) system in free-space laser communication usually uses a four-quadrant detector (QD) to point and track the laser beam accurately. The positioning precision of the QD is one of the key factors in the pointing accuracy of the APT system. In this paper, a positioning system based on an FPGA and a DSP is designed, which performs A/D sampling, runs the positioning algorithm and controls the fast swing mirror. Starting from the working principle of the QD, we analyze the positioning error of the facular center calculated by the universal algorithm when the facular energy obeys a Gaussian distribution. A database is built by calculation and simulation with MatLab software, in which the facular center calculated by the universal algorithm is mapped to the facular center of the Gaussian beam, and the database is stored in two pieces of E2PROM serving as the external memory of the DSP. The facular center of the Gaussian beam is then looked up in the database by the DSP on the basis of the facular center calculated by the universal algorithm, as sketched below. The experimental results show that the positioning accuracy of the high-precision positioning system is much better than that obtained with the universal algorithm alone.
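
    A minimal sketch of the lookup step referenced above follows; the tiny one-dimensional table and the nearest-neighbour rule are simplifications assumed for illustration, whereas the real system stores a dense table in external memory and runs on a DSP.

```python
# Sketch of the database-query correction: the spot position returned by the
# universal algorithm is used as a key to look up the nearest precomputed
# true center of a Gaussian spot. Table values are invented for illustration.
from bisect import bisect_left

# (computed_position, true_gaussian_center) pairs, precomputed by simulation
correction_table = [(-0.60, -1.00), (-0.35, -0.50), (0.00, 0.00),
                    (0.35, 0.50), (0.60, 1.00)]
keys = [k for k, _ in correction_table]

def corrected_center(computed):
    """Return the Gaussian-beam center whose key is nearest to the computed value."""
    i = bisect_left(keys, computed)
    candidates = correction_table[max(0, i - 1):i + 1]
    return min(candidates, key=lambda kv: abs(kv[0] - computed))[1]

print(corrected_center(0.33))   # -> 0.5, corrected estimate of the spot center
```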

  4. Making a search engine for Indocean - A database of abstracts: An experience

    Digital Repository Service at National Institute of Oceanography (India)

    Tapaswi, M.P.; Haravu, L.J.

    Information Management: Trends and Issues (Festschrift in honour of Prof. S. Seetharama). Making a Search Engine for Indocean - A Database of Abstracts: An Experience. Murari P. Tapaswi and L. J. Haravu. Documentation Officer, National Information...

  5. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Science.gov (United States)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of these massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Based on the key-value database, the ND technique makes complete utilization of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of a relational database management system (RDBMS). The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
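
    The abstract does not spell out the negative-database scheme in detail; one possible reading, sketched below under that assumption, is that over a known, regular observation grid only the missing records are stored in the key-value store, and the present ones are derived as the complement. The grid, keys and API are invented for illustration.

```python
# One possible reading of the negative-database idea: over a known, regular
# observation grid (one record per time slot per channel), store only the
# *missing* records and derive the present ones as the complement.
# Grid sizes and keys are assumptions for illustration only.
from itertools import product

channels = range(4)                       # hypothetical frequency channels
seconds = range(10)                       # hypothetical one-second time slots
expected = set(product(seconds, channels))

negative_db = {(3, 1), (7, 0), (7, 2)}    # only the absent observations are stored

def present_records():
    """Derive existing observations as the complement of the negative database."""
    return expected - negative_db

print(len(present_records()), "of", len(expected), "records available")
```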

  6. Software configuration management plan for the TWRS controlled baseline database system [TCBD

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and the Financial Control Integration and Reporting. The interfaces between these organizations are through normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCW. LHMC, TWRS TCBD Users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the user's requests for system requirements (new/existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO. Once this occurs, the TCBD will be completed and

  7. Development of a database system for operational use in the selection of titanium alloys

    Science.gov (United States)

    Han, Yuan-Fei; Zeng, Wei-Dong; Sun, Yu; Zhao, Yong-Qing

    2011-08-01

    The selection of titanium alloys has become a complex decision-making task due to the growing number of titanium alloys being created and utilized, with each having its own characteristics, advantages, and limitations. In choosing the most appropriate titanium alloy, it is essential to offer a reasonable and intelligent service to technical engineers. One possible solution to this problem is to develop a database system (DS) to help retrieve rational proposals from different databases and information sources and analyze them to provide useful and explicit information. For this purpose, a design strategy based on fuzzy set theory is proposed, and a distributed database system is developed. Through ranking of the candidate titanium alloys, the most suitable material is determined; a simplified sketch of such a ranking is given below. It is found that the selection results are in good agreement with the practical situation.
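
    The following sketch ranks candidate alloys by a weighted, normalised score, a simplified stand-in for the fuzzy-set ranking strategy described above; the alloy names, property values and weights are illustrative assumptions, not data from the paper.

```python
# Simplified ranking of candidate alloys against weighted requirements.
# Membership here is a plain linear normalisation; values and weights are
# illustrative assumptions only.
candidates = {
    "Ti-6Al-4V":  {"strength_mpa": 950,  "density_g_cm3": 4.43, "cost_index": 1.0},
    "Ti-5553":    {"strength_mpa": 1250, "density_g_cm3": 4.65, "cost_index": 1.6},
    "CP-Ti Gr2":  {"strength_mpa": 485,  "density_g_cm3": 4.51, "cost_index": 0.6},
}
weights = {"strength_mpa": 0.5, "density_g_cm3": 0.2, "cost_index": 0.3}
maximise = {"strength_mpa": True, "density_g_cm3": False, "cost_index": False}

def membership(value, lo, hi, prefer_high):
    """Linear membership degree in [0, 1] for one property."""
    if hi == lo:
        return 1.0
    m = (value - lo) / (hi - lo)
    return m if prefer_high else 1.0 - m

scores = {}
for name, props in candidates.items():
    score = 0.0
    for prop, w in weights.items():
        values = [c[prop] for c in candidates.values()]
        score += w * membership(props[prop], min(values), max(values), maximise[prop])
    scores[name] = score

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {score:.2f}")
```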

  8. The plasma movie database system for JT-60

    International Nuclear Information System (INIS)

    Sueoka, Michiharu; Kawamata, Yoichi; Kurihara, Kenichi; Seki, Akiyuki

    2007-01-01

    The real-time plasma movie with computer graphics (CG) of the plasma shape is one of the most effective ways to know what discharge has been made in the experiment. For easy use of the movies in data analysis, we have developed the plasma movie database system (PMDS), which automatically records plasma movies according to the JT-60 discharge sequence and transfers the movie files on request from the web site. Each file is compressed to about 8 MB/shot, small enough to be transferred within a few seconds over the local area network (LAN). In this report, we describe the developed system from the technical point of view and discuss a future plan on the basis of advancing video technology

  9. Database system for management of health physics and industrial hygiene records

    International Nuclear Information System (INIS)

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection

  10. Acceptance test procedure for the master equipment list (MEL)database system -- phase I

    International Nuclear Information System (INIS)

    Jech, J.B.

    1997-01-01

    The Waste Remediation System/.../Facilities Configuration Management Integration group has requested development of a system to help resolve many of the difficulties associated with management of master equipment list information. This project has been identified as Master Equipment List (MEL) database system. Further definition is contained in the system requirements specification (SRS), reference 7

  11. Large-scale Health Information Database and Privacy Protection.

    Science.gov (United States)

    Yamamoto, Ryuichi

    2016-09-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients' medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy

  12. Database Replication

    CERN Document Server

    Kemme, Bettina

    2010-01-01

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even if clients are geographically distributed clients, if data copies are located close to clients. Despite its advantages, replication is not a straightforward technique to apply, and

  13. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. The organization's departments can be located at different sites and can be connected by an intranet environment. In such an environment, maintenance of database records becomes a complex task that needs to be resolved. In this paper an intranet application is designed an...

  14. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on t...

  15. Draft secure medical database standard.

    Science.gov (United States)

    Pangalos, George

    2002-01-01

    Medical database security is a particularly important issue for all Healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assure the quality of care, support effective management of the health services institutions, monitor and contain the cost of care, implement technology into care without violating social values, ensure the equity and availability of care, preserve humanity despite the proliferation of technology etc. In this context, medical database security aims primarily to support: high availability, accuracy and consistency of the stored data, the medical professional secrecy and confidentiality, and the protection of the privacy of the patient. These properties, though of technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but instead, are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security. It addresses the problems of medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security. The issue of medical database security guidelines is also examined in detail. The current national and international efforts in the area are studied. It also gives an overview of the research work in the area. The document also presents in detail the most complete set, to our knowledge, of security guidelines for the development and operation of medical database systems.

  16. Site initialization, recovery, and back-up in a distributed database system

    International Nuclear Information System (INIS)

    Attar, R.; Bernstein, P.A.; Goodman, N.

    1982-01-01

    Site initialization is the problem of integrating a new site into a running distributed database system (DDBS). Site recovery is the problem of integrating an old site into a DDBS when the site recovers from failure. Site backup is the problem of creating a static backup copy of a database for archival or query purposes. We present an algorithm that solves the site initialization problem. By modifying the algorithm slightly, we get solutions to the other two problems as well. Our algorithm exploits the fact that a correct DDBS must run a serializable concurrency control algorithm. Our algorithm relies on the concurrency control algorithm to handle all inter-site synchronization

  17. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    Science.gov (United States)

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  18. Brief Report: Rheumatoid Arthritis as the Underlying Cause of Death in Thirty-One Countries, 1987-2011: Trend Analysis of World Health Organization Mortality Database.

    Science.gov (United States)

    Kiadaliri, Aliasghar A; Felson, David T; Neogi, Tuhina; Englund, Martin

    2017-08-01

    To examine trends in rheumatoid arthritis (RA) as an underlying cause of death (UCD) in 31 countries across the world from 1987 to 2011. Data on mortality and population were collected from the World Health Organization mortality database and from the United Nations Population Prospects database. Age-standardized mortality rates (ASMRs) were calculated by means of direct standardization. We applied joinpoint regression analysis to identify trends. Between-country disparities were examined using between-country variance and the Gini coefficient. Due to low numbers of deaths, we smoothed the ASMRs using a 3-year moving average. Changes in the number of RA deaths between 1987 and 2011 were decomposed using 2 counterfactual scenarios. The absolute number of deaths with RA registered as the UCD decreased from 9,281 (0.12% of all-cause deaths) in 1987 to 8,428 (0.09% of all-cause deaths) in 2011. The mean ASMR decreased from 7.1 per million person-years in 1987-1989 to 3.7 per million person-years in 2009-2011 (48.2% reduction). A reduction of ≥25% in the ASMR occurred in 21 countries, while a corresponding increase was observed in 3 countries. There was a persistent reduction in RA mortality, and on average, the ASMR declined by 3.0% per year. The absolute and relative between-country disparities decreased during the study period. The rates of mortality attributable to RA have declined globally. However, we observed substantial between-country disparities in RA mortality, although these disparities decreased over time. Population aging combined with a decline in RA mortality may lead to an increase in the economic burden of disease that should be taken into consideration in policy-making. © 2017, American College of Rheumatology.
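
    Direct standardization, as used for the ASMRs above, weights each age-specific death rate by a fixed standard population. The sketch below shows the arithmetic with invented age bands, rates and standard weights; the study itself used WHO mortality and UN population data.

```python
# Direct age standardisation: weight each age-specific death rate by a fixed
# standard population. All numbers below are invented for illustration.
age_specific_rates = {          # deaths per 1,000,000 person-years, by age band
    "0-44":  0.2,
    "45-64": 3.5,
    "65-79": 12.0,
    "80+":   25.0,
}
standard_population = {          # share of the standard population in each band
    "0-44":  0.60,
    "45-64": 0.25,
    "65-79": 0.11,
    "80+":   0.04,
}

asmr = sum(age_specific_rates[band] * standard_population[band]
           for band in age_specific_rates)
print(f"ASMR = {asmr:.2f} deaths per million person-years")
```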

  19. ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii

    Directory of Open Access Journals (Sweden)

    Kempa Stefan

    2009-05-01

    Full Text Available Abstract Background The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. Results In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. Conclusion ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.

  20. Efficient Incremental Garbage Collection for Workstation/Server Database Systems

    OpenAIRE

    Amsaleg , Laurent; Gruber , Olivier; Franklin , Michael

    1994-01-01

    Projet RODIN; We describe an efficient server-based algorithm for garbage collecting object-oriented databases in a workstation/server environment. The algorithm is incremental and runs concurrently with client transactions, however, it does not hold any locks on data and does not require callbacks to clients. It is fault tolerant, but performs very little logging. The algorithm has been designed to be integrated into existing OODB systems, and therefore it works with standard implementation ...

  1. Brasilia’s Database Administrators

    Directory of Open Access Journals (Sweden)

    Jane Adriana

    2016-06-01

    Full Text Available Database administration has gained an essential role in the management of new database technologies. Different data models, beyond the traditional relational database, are being created to support enormous data volumes; these new models are called NoSQL (Not only SQL) databases. The adoption of best practices and procedures has become essential for the operation of database management systems. Thus, this paper investigates some of the techniques and tools used by database administrators. The study highlights features and particularities of databases within the area of Brasilia, the capital of Brazil. The results point to which new database management technologies are currently the most relevant, as well as the central issues in this area.

  2. Applying cognitive load theory to the redesign of a conventional database systems course

    Science.gov (United States)

    Mason, Raina; Seton, Carolyn; Cooper, Graham

    2016-01-01

    Cognitive load theory (CLT) was used to redesign a Database Systems course for Information Technology students. The redesign was intended to address poor student performance and low satisfaction, and to provide a more relevant foundation in database design and use for subsequent studies and industry. The original course followed the conventional structure for a database course, covering database design first, then database development. Analysis showed the conventional course content was appropriate but the instructional materials used were too complex, especially for novice students. The redesign of instructional materials applied CLT to remove split attention and redundancy effects, to provide suitable worked examples and sub-goals, and included an extensive re-sequencing of content. The approach was primarily directed towards mid- to lower performing students and results showed a significant improvement for this cohort with the exam failure rate reducing by 34% after the redesign on identical final exams. Student satisfaction also increased and feedback from subsequent study was very positive. The application of CLT to the design of instructional materials is discussed for delivery of technical courses.

  3. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  4. Rapid storage and retrieval of genomic intervals from a relational database system using nested containment lists.

    Science.gov (United States)

    Wiley, Laura K; Sivley, R Michael; Bush, William S

    2013-01-01

    Efficient storage and retrieval of genomic annotations based on range intervals is necessary, given the amount of data produced by next-generation sequencing studies. The indexing strategies of relational database systems (such as MySQL) greatly inhibit their use in genomic annotation tasks. This has led to the development of stand-alone applications that are dependent on flat-file libraries. In this work, we introduce MyNCList, an implementation of the NCList data structure within a MySQL database. MyNCList enables the storage, update and rapid retrieval of genomic annotations from the convenience of a relational database system. Range-based annotations of 1 million variants are retrieved in under a minute, making this approach feasible for whole-genome annotation tasks. Database URL: https://github.com/bushlab/mynclist.
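
    A nested containment list stores intervals so that, within any one list, no interval contains another; contained intervals are pushed into sublists, which keeps both starts and ends sorted and allows binary search at every level. The compact in-memory Python version below is a sketch of that idea only, not the authors' MySQL-embedded implementation, and the gene/exon coordinates are invented.

```python
# Compact nested containment list (NCList) sketch for half-open integer intervals.
from bisect import bisect_left

def build_nclist(intervals):
    """intervals: list of (start, end, label). Returns top-level NCList nodes."""
    # Sort by start ascending, end descending, so containers precede contained.
    ordered = sorted(intervals, key=lambda iv: (iv[0], -iv[1]))
    top, stack = [], []                      # node = [start, end, label, sublist]
    for start, end, label in ordered:
        node = [start, end, label, []]
        while stack and not (stack[-1][0] <= start and end <= stack[-1][1]):
            stack.pop()                      # not contained in current container
        (stack[-1][3] if stack else top).append(node)
        stack.append(node)
    return top

def query(nodes, qstart, qend, hits):
    """Collect labels of intervals overlapping [qstart, qend)."""
    # Within one list no interval contains another, so ends are sorted too:
    ends = [n[1] for n in nodes]
    i = bisect_left(ends, qstart + 1)        # first node whose end > qstart
    while i < len(nodes) and nodes[i][0] < qend:
        hits.append(nodes[i][2])
        query(nodes[i][3], qstart, qend, hits)
        i += 1
    return hits

genes = [(100, 500, "geneA"), (150, 300, "exonA1"), (350, 480, "exonA2"),
         (600, 900, "geneB")]
nclist = build_nclist(genes)
print(query(nclist, 320, 360, []))           # -> ['geneA', 'exonA2']
```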

  5. Emerging Trends in Computing, Informatics, Systems Sciences, and Engineering

    CERN Document Server

    Elleithy, Khaled

    2013-01-01

    Emerging Trends in Computing, Informatics, Systems Sciences, and Engineering includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of  Industrial Electronics, Technology & Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning. This book includes the proceedings of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2010). The proceedings are a set of rigorously reviewed world-class manuscripts presenting the state of international practice in Innovative Algorithms and Techniques in Automation, Industrial Electronics and Telecommunications.

  6. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relation database, spatial analysis, definition and visualization of the database content in the form of a map on the Internet. 

  7. QED's School Market Trends: Teacher Buying Behavior & Attitudes, 2001-2002. Research Report.

    Science.gov (United States)

    Quality Education Data, Inc., Denver, CO.

    This study examined teachers' classroom material buying behaviors and trends. Data came from Quality Education Data's National Education Database, which includes U.S. K-12 public, private, and Catholic schools and districts. Researchers surveyed K-8 teachers randomly selected from QED's National Education Database. Results show that teachers spend…

  8. An integrated data-analysis and database system for AMS ¹⁴C

    Energy Technology Data Exchange (ETDEWEB)

    Kjeldsen, Henrik, E-mail: kjeldsen@phys.au.d [AMS 14C Dating Centre, Department of Physics and Astronomy, Aarhus University, Aarhus (Denmark); Olsen, Jesper [Department of Earth Sciences, Aarhus University, Aarhus (Denmark); Heinemeier, Jan [AMS 14C Dating Centre, Department of Physics and Astronomy, Aarhus University, Aarhus (Denmark)

    2010-04-15

    AMSdata is the name of a combined database and data-analysis system for AMS ¹⁴C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS ¹⁴C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  9. Acromegaly at diagnosis in 3173 patients from the Liège Acromegaly Survey (LAS) Database

    Science.gov (United States)

    Petrossians, Patrick; Daly, Adrian F; Natchev, Emil; Maione, Luigi; Blijdorp, Karin; Sahnoun-Fathallah, Mona; Auriemma, Renata; Diallo, Alpha M; Hulting, Anna-Lena; Ferone, Diego; Hana, Vaclav; Filipponi, Silvia; Sievers, Caroline; Nogueira, Claudia; Fajardo-Montañana, Carmen; Carvalho, Davide; Hana, Vaclav; Stalla, Günter K; Jaffrain-Réa, Marie-Lise; Delemer, Brigitte; Colao, Annamaria; Brue, Thierry; Neggers, Sebastian J C M M; Zacharieva, Sabina; Chanson, Philippe

    2017-01-01

    Acromegaly is a rare disorder caused by chronic growth hormone (GH) hypersecretion. While diagnostic and therapeutic methods have advanced, little information exists on trends in acromegaly characteristics over time. The Liège Acromegaly Survey (LAS) Database, a relational database, is designed to assess the profile of acromegaly patients at diagnosis and during long-term follow-up at multiple treatment centers. The following results were obtained at diagnosis. The study population consisted of 3173 acromegaly patients from ten countries; 54.5% were female. Males were significantly younger at diagnosis than females (43.5 vs 46.4 years; P 3100 patients is the largest international acromegaly database and shows clinically relevant trends in the characteristics of acromegaly at diagnosis. PMID:28733467

  10. Acromegaly at diagnosis in 3173 patients from the Liège Acromegaly Survey (LAS) Database.

    Science.gov (United States)

    Petrossians, Patrick; Daly, Adrian F; Natchev, Emil; Maione, Luigi; Blijdorp, Karin; Sahnoun-Fathallah, Mona; Auriemma, Renata; Diallo, Alpha M; Hulting, Anna-Lena; Ferone, Diego; Hana, Vaclav; Filipponi, Silvia; Sievers, Caroline; Nogueira, Claudia; Fajardo-Montañana, Carmen; Carvalho, Davide; Hana, Vaclav; Stalla, Günter K; Jaffrain-Réa, Marie-Lise; Delemer, Brigitte; Colao, Annamaria; Brue, Thierry; Neggers, Sebastian J C M M; Zacharieva, Sabina; Chanson, Philippe; Beckers, Albert

    2017-10-01

    Acromegaly is a rare disorder caused by chronic growth hormone (GH) hypersecretion. While diagnostic and therapeutic methods have advanced, little information exists on trends in acromegaly characteristics over time. The Liège Acromegaly Survey (LAS) Database , a relational database, is designed to assess the profile of acromegaly patients at diagnosis and during long-term follow-up at multiple treatment centers. The following results were obtained at diagnosis. The study population consisted of 3173 acromegaly patients from ten countries; 54.5% were female. Males were significantly younger at diagnosis than females (43.5 vs 46.4 years; P  3100 patients is the largest international acromegaly database and shows clinically relevant trends in the characteristics of acromegaly at diagnosis. © 2017 The authors.

  11. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  12. Database Optimizing Services

    Directory of Open Access Journals (Sweden)

    Adrian GHENCEA

    2010-12-01

    Full Text Available Almost every organization has a database at its centre. The database supports different activities, whether production, sales and marketing or internal operations. Every day, a database is accessed for help in strategic decisions. Meeting such needs therefore requires high-quality security and availability. Those needs can be realised using a DBMS (Database Management System), which is, in fact, the software for a database. Technically speaking, it is software that uses a standard method of cataloguing, recovering, and running different data queries. A DBMS manages the input data, organizes it, and provides ways for its users or other programs to modify or extract the data. Managing the database is an operation that requires periodic updates, optimization and monitoring.

  13. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real time access to both temporally accurate and multimedia data...

  14. Development of the Lymphoma Enterprise Architecture Database: a caBIG Silver level compliant system.

    Science.gov (United States)

    Huang, Taoying; Shenoy, Pareen J; Sinha, Rajni; Graiser, Michael; Bumpers, Kevin W; Flowers, Christopher R

    2009-04-03

    Lymphomas are the fifth most common cancer in United States with numerous histological subtypes. Integrating existing clinical information on lymphoma patients provides a platform for understanding biological variability in presentation and treatment response and aids development of novel therapies. We developed a cancer Biomedical Informatics Grid (caBIG) Silver level compliant lymphoma database, called the Lymphoma Enterprise Architecture Data-system (LEAD), which integrates the pathology, pharmacy, laboratory, cancer registry, clinical trials, and clinical data from institutional databases. We utilized the Cancer Common Ontological Representation Environment Software Development Kit (caCORE SDK) provided by National Cancer Institute's Center for Bioinformatics to establish the LEAD platform for data management. The caCORE SDK generated system utilizes an n-tier architecture with open Application Programming Interfaces, controlled vocabularies, and registered metadata to achieve semantic integration across multiple cancer databases. We demonstrated that the data elements and structures within LEAD could be used to manage clinical research data from phase 1 clinical trials, cohort studies, and registry data from the Surveillance Epidemiology and End Results database. This work provides a clear example of how semantic technologies from caBIG can be applied to support a wide range of clinical and research tasks, and integrate data from disparate systems into a single architecture. This illustrates the central importance of caBIG to the management of clinical and biological data.

  15. Global ocean carbon uptake: magnitude, variability and trends

    Directory of Open Access Journals (Sweden)

    R. Wanninkhof

    2013-03-01

    Full Text Available The globally integrated sea–air anthropogenic carbon dioxide (CO2) flux from 1990 to 2009 is determined from models and data-based approaches as part of the Regional Carbon Cycle Assessment and Processes (RECCAP) project. Numerical methods include ocean inverse models, atmospheric inverse models, and ocean general circulation models with parameterized biogeochemistry (OBGCMs). The median value of different approaches shows good agreement in average uptake. The best estimate of anthropogenic CO2 uptake for the time period based on a compilation of approaches is −2.0 Pg C yr−1. The interannual variability in the sea–air flux is largely driven by large-scale climate re-organizations and is estimated at 0.2 Pg C yr−1 for the two decades, with some systematic differences between approaches. The largest differences between approaches are seen in the decadal trends. The trends range from −0.13 Pg C yr−1 decade−1 to −0.50 Pg C yr−1 decade−1 for the two decades under investigation. The OBGCMs and the data-based sea–air CO2 flux estimates show appreciably smaller decadal trends than estimates based on changes in carbon inventory, suggesting that methods capable of resolving shorter timescales are showing a slowing of the rate of ocean CO2 uptake. RECCAP model outputs for five decades show similar differences in trends between approaches.

  16. Generic Database Cost Models for Hierarchical Memory Systems

    OpenAIRE

    Manegold, Stefan; Boncz, Peter; Kersten, Martin

    2002-01-01

    Accurate prediction of operator execution time is a prerequisite for database query optimization. Although extensively studied for conventional disk-based DBMSs, cost modeling in main-memory DBMSs is still an open issue. Recent database research has demonstrated that memory access is more and more becoming a significant---if not the major---cost component of database operations. If used properly, fast but small cache memories---usually organized in cascading hierarchy between CPU ...
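
    In the spirit of the cost models discussed above, a toy hierarchical-memory cost estimate charges each cache level its miss count times its miss latency. The latencies, line sizes and the sequential cold-scan miss model below are illustrative assumptions, not the paper's calibrated parameters.

```python
# Toy hierarchical-memory cost model: cost of a scan = misses per level times
# that level's miss latency. All parameters are illustrative assumptions.
def scan_cost(n_tuples, tuple_bytes, levels):
    """levels: list of (line_size_bytes, miss_latency_cycles), CPU-nearest first."""
    total_bytes = n_tuples * tuple_bytes
    cost = 0.0
    for line_size, miss_latency in levels:
        misses = total_bytes / line_size      # cold, sequential access pattern
        cost += misses * miss_latency
    return cost

memory_hierarchy = [(64, 10),      # cache line: 64 B, ~10 cycles per miss
                    (64, 200)]     # DRAM:       64 B, ~200 cycles per miss
print(f"{scan_cost(1_000_000, 16, memory_hierarchy):,.0f} cycles")
```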

  17. Report on comprehensive surveys of nationwide geothermal resources in fiscal 1979. Conceptual design of a database system; 1979 nendo zenkoku chinetsu shigen sogo chosa hokokusho. Database system gainen sekkei

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1980-03-31

    Conceptual design was made on a database system as part of the comprehensive surveys of nationwide geothermal resources. Underground hot water at depths of up to several kilometers below the ground surface is a utilizable form of geothermal energy. Exploration using ground surface surveys is much less expensive than test drilling surveys, but has greater estimation error because it is an indirect method. However, integrating data by freely using a number of exploration methods can improve the accuracy of estimation on the whole. In performing the conceptual design of a geothermal resource information system, the functions of this large-scale database were used as the framework. Data collection, distribution, interactive man-machine communication, modeling, and environment surveillance functions were also incorporated. Considerations were also given to further diversified utilization patterns and to support for users in remote areas and end users. What is important in designing the system is that the constituent hardware and software elements should function while being combined organically as one system, rather than working independently. In addition, sufficient expandability and flexibility are indispensable. (NEDO)

  18. Moving Observer Support for Databases

    DEFF Research Database (Denmark)

    Bukauskas, Linas

    Interactive visual data explorations impose rigid requirements on database and visualization systems. Systems that visualize huge amounts of data tend to request large amounts of memory resources and heavily use the CPU to process and visualize data. Current systems employ a loosely coupled...... architecture to exchange data between database and visualization. Thus, the interaction of the visualizer and the database is kept to the minimum, which most often leads to superfluous data being passed from database to visualizer. This Ph.D. thesis presents a novel tight coupling of database and visualizer....... The thesis discusses the VR-tree, an extension of the R-tree that enables observer relative data extraction. To support incremental observer position relative data extraction the thesis proposes the Volatile Access Structure (VAST). VAST is a main memory structure that caches nodes of the VR-tree. VAST...

  19. Computerized nuclear material database management system for power reactors

    International Nuclear Information System (INIS)

    Cheng Binghao; Zhu Rongbao; Liu Daming; Cao Bin; Liu Ling; Tan Yajun; Jiang Jincai

    1994-01-01

    The software packages for nuclear material database management for power reactors are described. The database structure, data flow and model for management of the database are analysed. Also mentioned are the main functions and characteristics of the software packages, which have been successfully installed and used at both the Daya Bay Nuclear Power Plant and the Qinshan Nuclear Power Plant for the purpose of handling the nuclear material databases automatically

  20. Semantic-Based Concurrency Control for Object-Oriented Database Systems Supporting Real-Time Applications

    National Research Council Canada - National Science Library

    Lee, Juhnyoung; Son, Sang H

    1994-01-01

    .... This paper investigates major issues in designing semantic-based concurrency control for object-oriented database systems supporting real-time applications, and it describes approaches to solving...