WorldWideScience

Sample records for database management research

  1. Development of operation management database for research reactors

    Zhang Xinjun; Chen Wei; Yang Jun

    2005-01-01

An operation database for a pulsed reactor has been developed on the Microsoft Visual C++ 6.0 platform. The database includes four function modules: fuel element management, incident management, experiment management and file management. It is essential for reactor security and information management. (authors)

  2. Concierge: Personal database software for managing digital research resources

    Hiroyuki Sakai

    2007-11-01

Full Text Available This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).

  3. DOG-SPOT database for comprehensive management of dog genetic research data

    Sutter Nathan B

    2010-12-01

    Full Text Available Abstract Research laboratories studying the genetics of companion animals have no database tools specifically designed to aid in the management of the many kinds of data that are generated, stored and analyzed. We have developed a relational database, "DOG-SPOT," to provide such a tool. Implemented in MS-Access, the database is easy to extend or customize to suit a lab's particular needs. With DOG-SPOT a lab can manage data relating to dogs, breeds, samples, biomaterials, phenotypes, owners, communications, amplicons, sequences, markers, genotypes and personnel. Such an integrated data structure helps ensure high quality data entry and makes it easy to track physical stocks of biomaterials and oligonucleotides.
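    The abstract above describes an integrated relational structure linking dogs, breeds, samples and genotypes. As a rough illustration of that kind of structure (this is a hedged sketch in SQLite, not the actual DOG-SPOT schema, which is implemented in MS Access; table and column names are assumptions), a few related tables and a cross-entity query might look like this:

```python
# Minimal sketch of an integrated relational structure for dog genetics data,
# using Python's built-in sqlite3 module. Names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE breed  (breed_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE dog    (dog_id   INTEGER PRIMARY KEY, name TEXT,
                     breed_id INTEGER REFERENCES breed(breed_id));
CREATE TABLE sample (sample_id INTEGER PRIMARY KEY,
                     dog_id    INTEGER REFERENCES dog(dog_id),
                     material  TEXT, collected_on TEXT);
CREATE TABLE genotype (sample_id INTEGER REFERENCES sample(sample_id),
                       marker TEXT, call TEXT);
""")
conn.execute("INSERT INTO breed VALUES (1, 'Border Collie')")
conn.execute("INSERT INTO dog VALUES (1, 'Rex', 1)")
conn.execute("INSERT INTO sample VALUES (1, 1, 'blood', '2010-06-01')")
conn.execute("INSERT INTO genotype VALUES (1, 'FGF4_marker', 'A/A')")

# The integrated structure makes cross-entity queries straightforward,
# e.g. listing every genotype call together with dog and breed.
rows = conn.execute("""
SELECT breed.name, dog.name, genotype.marker, genotype.call
FROM genotype
JOIN sample ON sample.sample_id = genotype.sample_id
JOIN dog    ON dog.dog_id       = sample.dog_id
JOIN breed  ON breed.breed_id   = dog.breed_id
""").fetchall()
print(rows)
```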

  4. Database development and management

    Chao, Lee

    2006-01-01

Introduction to Database Systems: Functions of a Database; Database Management System; Database Components; Database Development Process. Conceptual Design and Data Modeling: Introduction to the Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with the Entity-Relationship Model. Table Structure and Normalization: Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases. DBMS Selection: Transforming Data Models to Relational Databases; Enforcing Constraints; Creating Database for Business Process. Physical Design and Database

  5. Ageing Management Program Database

    Basic, I.; Vrbanic, I.; Zabric, I.; Savli, S.

    2008-01-01

The aspects of plant ageing management (AM) have gained increasing attention over the last ten years. Numerous technical studies have been performed to study the impact of ageing mechanisms on the safe and reliable operation of nuclear power plants. National research activities have been initiated or are in progress to provide the technical basis for decision-making processes. The long-term operation of nuclear power plants is influenced by economic considerations, the socio-economic environment including public acceptance, developments in research and the regulatory framework, and the availability of technical infrastructure to maintain and service the systems, structures and components, as well as of qualified personnel. Besides national activities there are a number of international activities, in particular under the umbrella of the IAEA, the OECD and the EU. The paper discusses the process, procedure and database developed for the Slovenian Nuclear Safety Administration (SNSA) surveillance of the ageing process of the Krsko Nuclear Power Plant. (author)

  6. Records Management Database

US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  7. PlantDB – a versatile database for managing plant research

    Gruissem Wilhelm

    2008-01-01

Full Text Available Abstract Background Research in plant science laboratories often involves usage of many different species, cultivars, ecotypes, mutants, alleles or transgenic lines. This creates a great challenge to keep track of the identity of experimental plants and stored samples or seeds. Results Here, we describe PlantDB – a Microsoft® Office Access database – with a user-friendly front-end for managing information relevant for experimental plants. PlantDB can hold information about plants of different species, cultivars or genetic composition. Introduction of a concise identifier system allows easy generation of pedigree trees. In addition, all information about any experimental plant – from growth conditions and dates, through extracted samples such as RNA, to files containing images of the plants – can be linked unequivocally. Conclusion We have been using PlantDB for several years in our laboratory and found that it greatly facilitates access to relevant information.
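    The abstract notes that a concise identifier system allows easy generation of pedigree trees. A hedged sketch of one way parent links on plant identifiers can support such queries (this is an illustration in SQLite, not the PlantDB schema; identifiers and column names are assumptions):

```python
# Toy pedigree lookup: each plant record carries its own identifier and a
# reference to its parent, and a recursive query walks the lineage.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE plant (plant_id TEXT PRIMARY KEY,
                    genotype TEXT,
                    parent_id TEXT REFERENCES plant(plant_id));
INSERT INTO plant VALUES ('P-0001', 'Col-0 wild type',   NULL);
INSERT INTO plant VALUES ('P-0102', 'mutant x Col-0 F1', 'P-0001');
INSERT INTO plant VALUES ('P-0230', 'F2 segregant',      'P-0102');
""")

pedigree = conn.execute("""
WITH RECURSIVE lineage(plant_id, genotype, parent_id) AS (
    SELECT plant_id, genotype, parent_id FROM plant WHERE plant_id = 'P-0230'
    UNION ALL
    SELECT p.plant_id, p.genotype, p.parent_id
    FROM plant p JOIN lineage l ON p.plant_id = l.parent_id
)
SELECT plant_id, genotype FROM lineage
""").fetchall()
print(pedigree)  # walks from P-0230 back to the founder P-0001
```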

  8. Generalized Database Management System Support for Numeric Database Environments.

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  9. NIRS database of the original research database

    Morita, Kyoko

    1991-01-01

Recently, library staff arranged and compiled the original research papers written by researchers during the 33 years since the National Institute of Radiological Sciences (NIRS) was established. This paper describes how the internal database of original research papers has been created. It is a small, hand-made database, built up by staff members using their own knowledge of computers and programming. (author)

  10. Design and utilization of a Flight Test Engineering Database Management System at the NASA Dryden Flight Research Facility

    Knighton, Donna L.

    1992-01-01

    A Flight Test Engineering Database Management System (FTE DBMS) was designed and implemented at the NASA Dryden Flight Research Facility. The X-29 Forward Swept Wing Advanced Technology Demonstrator flight research program was chosen for the initial system development and implementation. The FTE DBMS greatly assisted in planning and 'mass production' card preparation for an accelerated X-29 research program. Improved Test Plan tracking and maneuver management for a high flight-rate program were proven, and flight rates of up to three flights per day, two times per week were maintained.

  11. Nuclear database management systems

    Stone, C.; Sutton, R.

    1996-01-01

    The authors are developing software tools for accessing and visualizing nuclear data. MacNuclide was the first software application produced by their group. This application incorporates novel database management and visualization tools into an intuitive interface. The nuclide chart is used to access properties and to display results of searches. Selecting a nuclide in the chart displays a level scheme with tables of basic, radioactive decay, and other properties. All level schemes are interactive, allowing the user to modify the display, move between nuclides, and display entire daughter decay chains

  12. Database Quality and Access Issues Relevant to Research Using Anesthesia Information Management System Data.

    Epstein, Richard H; Dexter, Franklin

    2018-07-01

For this special article, we reviewed the computer code used to extract the data and the text of all 47 studies published between January 2006 and August 2017 using anesthesia information management system (AIMS) data from Thomas Jefferson University Hospital (TJUH). Data from this institution were used in the largest number (P = .0007) of papers describing the use of AIMS published in this time frame. The AIMS was replaced in April 2017, making this a finite sample. The objective of the current article was to identify factors that made TJUH successful in publishing anesthesia informatics studies. We examined the structured query language used for each study to examine the extent to which databases outside of the AIMS were used. We examined data quality from the perspectives of completeness, correctness, concordance, plausibility, and currency. Our results were that most studies could not have been completed without external database sources (36/47, 76.6%; P = .0003 compared with 50%). The operating room management system was linked to the AIMS and was used significantly more frequently (26/36, 72%) than other external sources. Access to these external data sources was provided, allowing exploration of data quality. The TJUH AIMS used high-resolution timestamps (to the nearest 3 milliseconds) and created audit tables to track changes to clinical documentation. Automatically captured data were recorded at 1-minute intervals and were not editable; data cleaning occurred during analysis. Few paired events with an expected order were out of sequence. Although most data elements were of high quality, there were notable exceptions, such as frequent missing values for estimated blood loss, height, and weight. Some values were duplicated with different units, and others were stored in varying locations. Our conclusions are that linking the TJUH AIMS to the operating room management system was a critical step in enabling publication of multiple studies using AIMS data. Access to this and
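    One of the plausibility checks mentioned above is verifying that paired events with an expected order are not recorded out of sequence. The following is an illustrative sketch, not the authors' code; the event names and data layout are assumptions:

```python
# Flag cases whose paired events violate the expected order,
# e.g. an anesthesia start timestamp that is not before the end timestamp.
from datetime import datetime

def out_of_sequence(cases, first="anesthesia_start", second="anesthesia_end"):
    """Return case IDs whose `first` timestamp is not strictly before `second`."""
    bad = []
    for case_id, events in cases.items():
        t1, t2 = events.get(first), events.get(second)
        if t1 is not None and t2 is not None and t1 >= t2:
            bad.append(case_id)
    return bad

cases = {
    "case_001": {"anesthesia_start": datetime(2017, 3, 1, 7, 45),
                 "anesthesia_end":   datetime(2017, 3, 1, 9, 10)},
    "case_002": {"anesthesia_start": datetime(2017, 3, 1, 9, 30),
                 "anesthesia_end":   datetime(2017, 3, 1, 9, 30)},  # suspicious pair
}
print(out_of_sequence(cases))  # ['case_002']
```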

  13. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed in Visual C++ 6.0, was used to implement medical case and biospecimen management and was based on a client/server model. This system can perform input, browsing, querying and summarizing of cases and related biospecimen information, and can automatically synthesize case records based on the database. The system supports not only long-term follow-up of individual patients but also management of grouped cases organized according to the aim of the research. This system can improve the efficiency and quality of clinical research when biospecimens are used in a coordinated way. It realizes synthesized and dynamic management of medical cases and biospecimens, which may be considered a new management platform.

  14. Native Health Research Database


  15. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing of heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.

  16. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    Selbig Joachim

    2008-05-01

...names generated by the system and barcode labels facilitate identification and management of the material. Web pages are provided as user interfaces to facilitate maintaining the system in an environment with many desktop computers and a rapidly changing user community. Web-based search tools are the basis for joint use of the material by all researchers of the institute. Conclusion The Golm Plant Database system, which is based on a relational database, collects the genetic and environmental information on plant material during its production or experimental use at the Max-Planck-Institute of Molecular Plant Physiology. It thus provides information according to the MIAME standard for the component 'Sample' in a highly standardised format. The Plant Database system thus facilitates collaborative work and allows efficient queries in data analysis for systems biology research.

  17. Radiation safety research information database

    Yukawa, Masae; Miyamoto, Kiriko; Takeda, Hiroshi; Kuroda, Noriko; Yamamoto, Kazuhiko

    2004-01-01

The National Institute of Radiological Sciences (NIRS) in Japan began to construct the 'Radiation Safety Research Information Database' in 2001. The research information database is of great value for evaluating the effects of radiation on people, by estimating exposure doses from measurements of radiation and radioactive materials in the environment. The above database (DB) consists of seven DBs: the NIRS Air Borne Dust Survey DB, NIRS Environmental Tritium Survey DB, NIRS Environmental Carbon Survey DB, Environmental Radiation Levels (Abe), Metabolic Database for Assessment of Internal Dose, Graphs of Predicted Monitoring Data, and NIRS nuclear installation environment water tritium survey DB. An outline of the overall database and of each DB is given. (S.Y.)

  18. Aging management database

    Vidican, Dan

    2003-01-01

As operation time accumulates, the overall safety and performance of an NPP tend to decrease. The reasons for potential non-availability of the structures, systems and components (SSC) in operation are various, but in different ways they all represent the end result of ageing phenomena. In order to understand the ageing phenomena and to be able to take adequate countermeasures, it is necessary to accumulate a large amount of information, both from worldwide experience and from one's own plant. These data have to be organized in a systematic form that is easy to retrieve and use. General requirements and structure of an ageing database: activities related to ageing evaluation have to allow: - identification and evaluation of degradation phenomena, potential malfunctions and failure modes of typical plant components; - trend analyses (on selected critical components), prediction of future performance and of the remaining service life. To perform these activities, it is necessary to have information on the behaviour of similar components in different NPPs (in different environments and under different operating conditions) as well as the results of various pilot studies. Knowledge of worldwide experience is worthwhile. It is also necessary to know very well the operating and environmental conditions in one's own NPP and to analyze in detail the failure modes and root causes of components removed from the plant due to extended degradation. Based on the above aspects, a proposal for the structure of an ageing database is presented. It has three main sections: - Section A: General knowledge about ageing phenomena. It contains all the information collected from worldwide experience. It could have a general part with raw information and a synthetic one, structured by typical components (if possible by manufacturer). The synthetic part has to consider different ageing aspects and different monitoring and evaluation methods (e. g. component, function, environment condition, specific

  19. Database searches for qualitative research

    Evans, David

    2002-01-01

    Interest in the role of qualitative research in evidence-based health care is growing. However, the methods currently used to identify quantitative research do not translate easily to qualitative research. This paper highlights some of the difficulties during searches of electronic databases for qualitative research. These difficulties relate to the descriptive nature of the titles used in some qualitative studies, the variable information provided in abstracts, and the differences in the ind...

  20. Distributed Database Management Systems A Practical Approach

    Rahimi, Saeed K

    2010-01-01

This book addresses issues related to managing data across a distributed database system. It is unique because it covers traditional database theory and current research, explaining the difficulties in providing a unified user interface and global data dictionary. The book gives implementers guidance on hiding discrepancies across systems and creating the illusion of a single repository for users. It also includes three sample frameworks, implemented using J2SE with JMS, J2EE, and Microsoft .NET, that readers can use to learn how to implement a distributed database management system. IT and

  1. Database management systems understanding and applying database technology

    Gorman, Michael M

    1991-01-01

Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. Discussions focus on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  2. Research reactor records in the INIS database

    Marinkovic, N.

    2001-01-01

This report presents a statistical analysis of more than 13,000 records of publications concerned with research and technology in the field of research and experimental reactors which are included in the INIS Bibliographic Database for the period from 1970 to 2001. The main objectives of this bibliometric study were: to make an inventory of research reactor related records in the INIS Database; to provide statistics and scientific indicators for INIS users, namely science managers, researchers, engineers, operators, scientific editors and publishers, and decision-makers in fields related to research reactors; and to extract other useful information from the INIS Bibliographic Database about articles published on research reactor research and technology. (author)

  3. Management system of instrument database

    Zhang Xin

    1997-01-01

The author introduces a management system for an instrument database. The system has been developed using FoxPro on a network. It has features such as a clear structure, easy operation, and flexible and convenient querying, as well as data safety and reliability.

  4. A Database Management Assessment Instrument

    Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr.

    2013-01-01

    This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…

  5. Interconnecting heterogeneous database management systems

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  6. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. This system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases considering the individual clinical context. It can therefore make an important contribution to an efficient validation of outcome assessment in drug safety database studies.
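    Two of the display features described above are chronological ordering of patient information and blinding of selected fields such as drug exposure. The following is a minimal sketch of those two ideas under assumed field names; it is not the Phynx data model or code:

```python
# Build a chronologically ordered patient profile, masking entries marked as blinded.
from datetime import date

events = [
    {"date": date(2008, 11, 2), "type": "lab",  "detail": "ALT 210 U/l"},
    {"date": date(2008, 9, 14), "type": "drug", "detail": "drug A 50 mg od", "blind": True},
    {"date": date(2008, 12, 1), "type": "note", "detail": "referred to hepatology"},
]

def profile_view(events, blinded=True):
    """Return events sorted by date, with blinded details replaced by '***'."""
    ordered = sorted(events, key=lambda e: e["date"])
    return [
        {**e, "detail": "***" if blinded and e.get("blind") else e["detail"]}
        for e in ordered
    ]

for row in profile_view(events):
    print(row["date"], row["type"], row["detail"])
```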

  7. Design and Implementation of a Research Data Management System: The CRC/TR32 Project Database (TR32DB)

    Curdt, Constanze

    2014-01-01

    Research data management (RDM) includes all processes and measures which ensure that research data are well-organised, documented, preserved, stored, backed up, accessible, available, and re-usable. Corresponding RDM systems or repositories form the technical framework to support the collection, accurate documentation, storage, back-up, sharing, and provision of research data, which are created in a specific environment, like a research group or institution. The required measures for the impl...

  8. Design research of uranium mine borehole database

    Xie Huaming; Hu Guangdao; Zhu Xianglin; Chen Dehua; Chen Miaoshun

    2008-01-01

With energy sources in short supply, uranium exploration has been intensified, but the storage, analysis and use of uranium exploration data are currently not highly computerized in China; the data are poorly shared and used, and cannot meet the needs of production and research. This would be much improved if the data were stored and managed in a database system. The conceptual structure design, logical structure design and data integrity checks are discussed according to the application requirements and an analysis of uranium exploration data. An application of the database is illustrated finally. (authors)

  9. Security Research on Engineering Database System

    2002-01-01

The engine engineering database system is a CAD-oriented applied database management system with the capability of managing distributed data. The paper discusses the security issues of the engine engineering database management system (EDBMS). Through studying and analyzing database security, a series of security rules is drawn up which reach the B1-level security standard, including discretionary access control (DAC), mandatory access control (MAC) and audit. The EDBMS implements functions of DAC, ...

  10. A lake-centric geospatial database to guide research and inform management decisions in an Arctic watershed in northern Alaska experiencing climate and land-use changes

    Jones, Benjamin M.; Arp, Christopher D.; Whitman, Matthew S.; Nigro, Debora A.; Nitze, Ingmar; Beaver, John; Gadeke, Anne; Zuck, Callie; Liljedahl, Anna K.; Daanen, Ronald; Torvinen, Eric; Fritz, Stacey; Grosse, Guido

    2017-01-01

    Lakes are dominant and diverse landscape features in the Arctic, but conventional land cover classification schemes typically map them as a single uniform class. Here, we present a detailed lake-centric geospatial database for an Arctic watershed in northern Alaska. We developed a GIS dataset consisting of 4362 lakes that provides information on lake morphometry, hydrologic connectivity, surface area dynamics, surrounding terrestrial ecotypes, and other important conditions describing Arctic lakes. Analyzing the geospatial database relative to fish and bird survey data shows relations to lake depth and hydrologic connectivity, which are being used to guide research and aid in the management of aquatic resources in the National Petroleum Reserve in Alaska. Further development of similar geospatial databases is needed to better understand and plan for the impacts of ongoing climate and land-use changes occurring across lake-rich landscapes in the Arctic.

  11. Column-oriented database management systems

    Možina, David

    2013-01-01

In the following thesis I will present column-oriented databases. Among other things, I will answer the question of why there is a need for a column-oriented database. In recent years there has been a lot of attention on column-oriented databases, even though the existence of columnar database management systems dates back to the early seventies of the last century. I will compare both systems for database management – a column-oriented database system and a row-oriented database system ...

  12. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    2013-01-01

Background With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing of heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  13. Management research

    Berry, M.

    1988-01-01

The 1988 progress report of the Management Research Center (Polytechnic School, France) is presented. The Center's research programs include the management of different organizations, such as industry, administrative systems, hospitals and cultural systems. The investigations performed concern the improvement and better understanding of new methods of analysis: the role of discourse, logical conflicts; crisis development, symptoms and effects; and the relationship between management practices and the prevailing ideas or theories. The approach adopted by the scientists involves the accurate analysis of the essential management activities. The investigations carried out in 1988 are summarized. The published papers, the congress communications and the theses are listed [fr

  14. Microcomputer Database Management Systems for Bibliographic Data.

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  15. TWRS technical baseline database manager definition document

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  16. Cloud database development and management

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  17. Content And Multimedia Database Management Systems

    de Vries, A.P.

    1999-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. The main characteristic of the ‘database approach’ is that it increases the value of data by its emphasis on data

  18. Reexamining Operating System Support for Database Management

    Vasil, Tim

    2003-01-01

    In 1981, Michael Stonebraker [21] observed that database management systems written for commodity operating systems could not effectively take advantage of key operating system services, such as buffer pool management and process scheduling, due to expensive overhead and lack of customizability. The “not quite right” fit between these kernel services and the demands of database systems forced database designers to work around such limitations or re-implement some kernel functionality in user ...

  19. Performance Enhancements for Advanced Database Management Systems

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  20. Database on veterinary clinical research in homeopathy.

    Clausen, Jürgen; Albrecht, Henning

    2010-07-01

The aim of the present report is to provide an overview of the first database on clinical research in veterinary homeopathy. The method comprised detailed searches in the database 'Veterinary Clinical Research-Database in Homeopathy' (http://www.carstens-stiftung.de/clinresvet/index.php). The database contains about 200 entries of randomised clinical trials, non-randomised clinical trials, observational studies, drug provings, case reports and case series. Twenty-two clinical fields are covered and eight different groups of species are included. The database is free of charge and open to all interested veterinarians and researchers. The database enables researchers and veterinarians, sceptics and supporters, to get a quick overview of the status of veterinary clinical research in homeopathy, and it facilitates the preparation of systematic reviews and may stimulate replications or even new studies. 2010 Elsevier Ltd. All rights reserved.

  1. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  2. Temporal Databases in Network Management

    Gupta, Ajay

    1998-01-01

.... This thesis discusses issues involved with performing network management, specifically with means of reducing and storing the large quantity of data that network management tools and systems generate...

  3. Applications of GIS and database technologies to manage a Karst Feature Database

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
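    The abstract above mentions using SQL to manipulate database transactions and maintaining regular backups for recovery. As a rough sketch of those two practices (using SQLite as a stand-in, not the Minnesota KFD's actual DBMS; table and file names are assumptions):

```python
# Transactional edits keep the feature table consistent; a periodic backup
# copy supports recovery, as described for the Karst Feature Database.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE feature (id INTEGER PRIMARY KEY, type TEXT, county TEXT)")

try:
    with db:  # opens a transaction; commits on success, rolls back on error
        db.execute("INSERT INTO feature (type, county) VALUES (?, ?)",
                   ("sinkhole", "Fillmore"))
        db.execute("UPDATE feature SET county = ? WHERE id = ?", ("Winona", 1))
except sqlite3.Error as exc:
    print("transaction rolled back:", exc)

# Write a backup copy of the working database to a separate file.
backup = sqlite3.connect("karst_features_backup.db")
db.backup(backup)
backup.close()
```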

  4. Correlates of Access to Business Research Databases

    Gottfried, John C.

    2010-01-01

    This study examines potential correlates of business research database access through academic libraries serving top business programs in the United States. Results indicate that greater access to research databases is related to enrollment in graduate business programs, but not to overall enrollment or status as a public or private institution.…

  5. Management of virtualized infrastructure for physics databases

    Topurov, Anton; Gallerani, Luigi; Chatal, Francois; Piorkowski, Mariusz

    2012-01-01

Demands for information storage of physics metadata are rapidly increasing together with the requirements for its high availability. Most of the HEP laboratories are struggling to squeeze more from their computer centers, and thus focus on virtualizing available resources. CERN started investigating database virtualization in early 2006, first by testing database performance and stability on native Xen. Since then we have been closely evaluating the constantly evolving functionality of virtualisation solutions for the database and middle tier together with the associated management applications – Oracle's Enterprise Manager and VM Manager. This session will detail our long experience in dealing with virtualized environments, focusing on the newest Oracle OVM 3.0 for x86 and Oracle Enterprise Manager functionality for efficiently managing your virtualized database infrastructure.

  6. Study on managing EPICS database using ORACLE

    Liu Shu; Wang Chunhong; Zhao Jijiu

    2007-01-01

EPICS is used as a development toolkit of the BEPCII control system. The core of EPICS is a distributed database residing in front-end machines. The distributed database is usually created by tools such as VDCT or a text editor on the host, then loaded to the front-end target IOCs through the network. In the BEPCII control system there are about 20,000 signals, which are distributed over more than 20 IOCs. All the databases are developed by device control engineers using VDCT or a text editor. There are no uniform tools providing transparent management. The paper first presents the current status of EPICS database management issues in many labs. Second, it studies the EPICS database and the interface between ORACLE and the EPICS database. Finally, it introduces the software development and its application in the BEPCII control system. (authors)
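    The abstract describes keeping EPICS signal definitions in ORACLE rather than hand-editing database files. As a toy illustration of the general idea (an assumption for illustration only, not the BEPCII tooling), rows from a relational signal table could be rendered into EPICS record syntax for loading onto an IOC:

```python
# Render relational signal rows into EPICS database record syntax.
# Signal names, record types and fields here are illustrative assumptions.
signals = [
    {"name": "BPM01:X", "rtype": "ai", "desc": "beam position x", "egu": "mm"},
    {"name": "BPM01:Y", "rtype": "ai", "desc": "beam position y", "egu": "mm"},
]

def render_record(sig):
    return (
        f'record({sig["rtype"]}, "{sig["name"]}") {{\n'
        f'    field(DESC, "{sig["desc"]}")\n'
        f'    field(EGU,  "{sig["egu"]}")\n'
        f'}}\n'
    )

with open("ioc_signals.db", "w") as f:
    for sig in signals:
        f.write(render_record(sig))
```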

  7. Using Online Databases in Corporate Issues Management.

    Thomsen, Steven R.

    1995-01-01

Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)

  8. Database Support for Research in Public Administration

    Tucker, James Cory

    2005-01-01

    This study examines the extent to which databases support student and faculty research in the area of public administration. A list of journals in public administration, public policy, political science, public budgeting and finance, and other related areas was compared to the journal content list of six business databases. These databases…

  9. SIRSALE: integrated video database management tools

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

Video databases became an active field of research during the last decade. The main objective in such systems is to provide users with capabilities to friendly search, access and playback distributed stored video data in the same way as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), (b) contents of video data are very hard to be automatically extracted and need to be humanly annotated. To cope with these issues, many approaches have been proposed in the literature including data models, query languages, video indexing etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad-hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structures (sequences, scenes, shots) and (b) query the video database content by using a graphical tool, adapted to the nature of the target video documents. This paper also presents an annotating interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. So, we present how dedicated active services allow an optimized video transport for video streams (with Tamanoir active nodes). We then describe experiments of using SIRSALE on an archive of news video and soccer matches. The system has been demonstrated to professionals with a positive feedback. Finally, we discuss open issues and present some perspectives.

  10. Managing the BABAR Object Oriented Database

    Hasan, Adil

    2002-01-01

    The BaBar experiment stores its data in an Object Oriented federated database supplied by Objectivity/DB(tm). This database is currently 350TB in size and is expected to increase considerably as the experiment matures. Management of this database requires careful planning and specialized tools in order to make the data available to physicists in an efficient and timely manner. We discuss the operational issues and management tools that were developed during the previous run to deal with this vast quantity of data at SLAC

  11. Guide to Research Databases at IDRC

    Mélanie Brunet

...responsibility of each user to ensure that he or she uses ... a collection of documents and research outputs generated as part of projects ... Although the commercial databases also have a French or Spanish interface, the content is mainly in English.

  12. National Database for Autism Research (NDAR)

    U.S. Department of Health & Human Services — National Database for Autism Research (NDAR) is an extensible, scalable informatics platform for austism spectrum disorder-relevant data at all levels of biological...

  13. Using Large Diabetes Databases for Research.

    Wild, Sarah; Fischbacher, Colin; McKnight, John

    2016-09-01

    There are an increasing number of clinical, administrative and trial databases that can be used for research. These are particularly valuable if there are opportunities for linkage to other databases. This paper describes examples of the use of large diabetes databases for research. It reviews the advantages and disadvantages of using large diabetes databases for research and suggests solutions for some challenges. Large, high-quality databases offer potential sources of information for research at relatively low cost. Fundamental issues for using databases for research are the completeness of capture of cases within the population and time period of interest and accuracy of the diagnosis of diabetes and outcomes of interest. The extent to which people included in the database are representative should be considered if the database is not population based and there is the intention to extrapolate findings to the wider diabetes population. Information on key variables such as date of diagnosis or duration of diabetes may not be available at all, may be inaccurate or may contain a large amount of missing data. Information on key confounding factors is rarely available for the nondiabetic or general population limiting comparisons with the population of people with diabetes. However comparisons that allow for differences in distribution of important demographic factors may be feasible using data for the whole population or a matched cohort study design. In summary, diabetes databases can be used to address important research questions. Understanding the strengths and limitations of this approach is crucial to interpret the findings appropriately. © 2016 Diabetes Technology Society.

  14. The ATLAS Distributed Data Management System & Databases

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...

  15. Wireless Sensor Networks Database: Data Management and Implementation

    Ping Liu

    2014-04-01

Full Text Available As the core application of wireless sensor network technology, data management and processing have become a research hotspot for new databases. This article mainly studied data management in wireless sensor networks: in connection with the characteristics of data in wireless sensor networks, it discussed wireless sensor network data query and data integration technology in depth, proposed a mobile database structure based on a wireless sensor network, and carried out the overall design and implementation of the data management system. In order to implement the communication rules of the routing trees, the network manager uses a simple routing-tree maintenance algorithm. The system is implemented through the ordinary-node end, the server end of the mobile database at the gathering nodes, and the mobile client end, with the focus on the design of the query manager, storage module and synchronization module at the server end of the mobile database at the gathering nodes.
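    The abstract refers to a simple routing-tree maintenance algorithm but does not spell it out. The following is a toy illustration of the general idea only (parent pointers toward the sink, with re-attachment when a parent is lost); it is an assumption for illustration, not the paper's algorithm:

```python
# Minimal routing-tree bookkeeping: each node records a parent toward the sink,
# and re-attaches to a reachable neighbour if its parent stops responding.
class RoutingTree:
    def __init__(self, sink):
        self.parent = {sink: None}  # node -> parent toward the sink

    def attach(self, node, parent):
        if parent in self.parent:
            self.parent[node] = parent

    def handle_parent_loss(self, node, reachable_neighbours):
        """Re-attach `node` to the first reachable neighbour already in the tree."""
        for candidate in reachable_neighbours:
            if candidate in self.parent and candidate != node:
                self.parent[node] = candidate
                return candidate
        self.parent.pop(node, None)  # orphaned until a new beacon is heard
        return None

tree = RoutingTree(sink="gateway")
tree.attach("n1", "gateway")
tree.attach("n2", "n1")
print(tree.handle_parent_loss("n2", reachable_neighbours=["gateway"]))  # gateway
```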

  16. Scheme of database structure on decommissioning of the research reactor

    Park, H. S.; Park, S. K.; Kim, H. R.; Lee, D. K.; Jung, K. J.

    2001-01-01

ISP (Information Strategy Planning), the first step of overall database development, has been studied in order to effectively manage the information and data related to the decommissioning activities of the Korea Research Reactors 1 and 2 (KRR-1 and 2). Since Korea has not yet acquired the technology for a decommissioning database management system, the record management systems (RMS) of large nuclear facilities in countries with relevant national experience, such as the U.S.A., Japan, Belgium and Russia, were reviewed. In order to construct the database structure, information on the whole range of decommissioning activities, such as working information, radioactive waste treatment, and radiological surveying and analysis, has been extracted from the overall dismantling process. This information and data will be used as the basic data to analyze the matrix and derive the entity-relationship diagram, and will contribute to the establishment of a business system design and the development of a decommissioning database system as well.

  17. From document to database: modernizing requirements management

    Giajnorio, J.; Hamilton, S.

    2007-01-01

The creation, communication, and management of design requirements are central to the successful completion of any large engineering project, both technically and commercially. Design requirements in the Canadian nuclear industry are typically numbered lists in multiple documents created using word processing software. As an alternative, GE Nuclear Products implemented a central requirements management database for a major project at Bruce Power. The database was implemented by configuring the off-the-shelf software product Telelogic DOORS to GE's requirements structure. This paper describes the advantages realized by this scheme. Examples include traceability from customer requirements through to test procedures, concurrent engineering, and automated change history. (author)

  18. The Cocoa Shop: A Database Management Case

    Pratt, Renée M. E.; Smatt, Cindi T.

    2015-01-01

    This is an example of a real-world applicable case study, which includes background information on a small local business (i.e., TCS), description of functional business requirements, and sample data. Students are asked to design and develop a database to improve the management of the company's customers, products, and purchases by emphasizing…

  19. Biomedical databases: protecting privacy and promoting research.

    Wylie, Jean E; Mineau, Geraldine P

    2003-03-01

    When combined with medical information, large electronic databases of information that identify individuals provide superlative resources for genetic, epidemiology and other biomedical research. Such research resources increasingly need to balance the protection of privacy and confidentiality with the promotion of research. Models that do not allow the use of such individual-identifying information constrain research; models that involve commercial interests raise concerns about what type of access is acceptable. Researchers, individuals representing the public interest and those developing regulatory guidelines must be involved in an ongoing dialogue to identify practical models.

  20. Establishment of database system for management of KAERI wastes

    Shon, J. S.; Kim, K. J.; Ahn, S. J.

    2004-07-01

Radioactive wastes generated by KAERI have various types, nuclides and characteristics. Managing and controlling these radioactive wastes requires systematic record management, efficient searching and quick statistics. Information about the radioactive waste generated and stored by KAERI is the basic element in constructing a rapid information system for the national cooperative management of radioactive waste. In this study, the Radioactive Waste Management Integration System (RAWMIS) was developed. It is aimed at managing records of radioactive wastes, improving the efficiency of management and supporting WACID (Waste Comprehensive Integration Database System), the national radioactive waste integrated safety management system of Korea. The major information in RAWMIS, driven by user requirements, covers generation, gathering, transfer, treatment and storage information for solid waste, liquid waste, gaseous waste and waste related to spent fuel. RAWMIS is composed of a database, software (the interface between users and the database) and software for a manager, and it was designed with a client/server structure. RAWMIS will be a useful tool for analyzing radioactive waste management and radiation safety management. The system is also developed to share information with associated companies and can be expected to support research and development on radioactive waste treatment technology.

  1. Plant operation data collection and database management using NIC system

    Inase, S.

    1990-01-01

The Nuclear Information Center (NIC), a division of the Central Research Institute of Electric Power Industry, collects nuclear power plant operation and maintenance information both in Japan and abroad and transmits the information to all domestic utilities so that it can be effectively utilized for safe plant operation and reliability enhancement. The collected information is entered into the database system after being key-worded by NIC. The database system, the Nuclear Information database/Communication System (NICS), has been developed by NIC for the storage and management of the collected information. The objectives of keywording are retrieval and classification by keyword categories.

  2. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    Tyupikova, T.V.; Samoilov, V.N.

    2003-01-01

Modern information technologies push the natural sciences toward further development. But this comes together with an evolution of infrastructures, which must provide favourable conditions for the development of science and a financial base in order to justify and legally protect new research. Any scientific development entails accounting and legal protection. In the report, we consider a new direction in the software, organization and control of common databases, using as an example the electronic document handling system that functions in several departments of the Joint Institute for Nuclear Research.

  3. An evaluation of Birmingham Own Health® telephone care management service among patients with poorly controlled diabetes. a retrospective comparison with the General Practice Research Database

    Adab Peymané

    2011-09-01

Full Text Available Abstract Background Telephone-based care management programmes have been shown to improve health outcomes in some chronic diseases. Birmingham Own Health® is a telephone-based care service (nurse-delivered motivational coaching and support for self-management and lifestyle change) for patients with poorly controlled diabetes, delivered in Birmingham, UK. We used a novel method to evaluate its effectiveness in a real-life setting. Methods Retrospective cohort study in the UK. 473 patients aged ≥ 18 years with diabetes, enrolled onto Birmingham Own Health® (intervention cohort) and with > 90 days follow-up, were each matched by age and sex to up to 50 patients with diabetes registered with the General Practice Research Database (GPRD) to create a pool of 21,052 controls (control cohort). Controls were further selected from the main control cohort, matching as closely as possible to the cases for baseline test levels, followed by as close as possible a length of follow-up (within +/-30 day limits) and within +/-90 days of the baseline test date. The aim was to identify a control group with a distribution of prognostic factors as similar to the cases as possible. Effect sizes were computed using linear regression analysis adjusting for age, sex, deprivation quintile, length of follow-up and baseline test levels. Results After adjusting for baseline values and other potential confounders, the intervention showed significant mean reductions among people with diabetes of 0.3% (95% CI 0.1, 0.4%) in HbA1c; 3.5 mmHg (1.5, 5.5) in systolic blood pressure; 1.6 mmHg (0.4, 2.7) in diastolic blood pressure; and a 0.7 unit reduction (0.3, 1.0) in BMI, over a mean follow-up of around 10 months. Only small effects were seen on average on serum cholesterol levels (0.1 mmol/l reduction (0.1, 0.2)). More marked effects were seen for each clinical outcome among patients with worse baseline levels. Conclusions Despite the limitations of the study design, the results are consistent with the
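    The effect sizes above come from linear regression adjusting for baseline level, age, sex, deprivation and follow-up. A minimal sketch of that kind of adjusted comparison, under assumed variable names and made-up toy data (this is not the study's code or dataset, and it requires pandas and statsmodels):

```python
# Adjusted OLS comparison of follow-up HbA1c between enrolled ("treated") and
# matched control patients, controlling for baseline level and covariates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "hba1c_followup": [7.9, 8.4, 8.1, 8.8, 7.6, 8.5, 8.0, 8.9],
    "hba1c_baseline": [8.6, 8.5, 8.7, 8.9, 8.4, 8.6, 8.8, 9.0],
    "age":            [61, 58, 64, 70, 55, 66, 59, 72],
    "sex":            ["F", "M", "F", "M", "F", "M", "M", "F"],
    "deprivation":    [2, 4, 3, 5, 1, 4, 2, 5],
    "treated":        [1, 0, 1, 0, 1, 0, 1, 0],
})

model = smf.ols(
    "hba1c_followup ~ hba1c_baseline + age + C(sex) + deprivation + treated",
    data=df,
).fit()
# Coefficient on `treated` is the adjusted mean difference attributable to enrolment.
print(model.params["treated"])
```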

  4. Selection of nuclear power information database management system

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    Given the current state of database technology, an important task in building the Chinese nuclear power information database (NPIDB) for the nuclear industry efficiently and from a high starting point is to select a proper database management system (DBMS), which is pivotal to building the database successfully. Therefore, this article explains how to build a practical nuclear power information database, the functions of different database management systems, the reasons for selecting a relational database management system (RDBMS), the principles of selecting an RDBMS, the recommendation of ORACLE as the software with which to build the database, and so on

  5. Storage and Database Management for Big Data

    2015-07-27

    Discusses cloud models for enterprise big data (interactive, on-demand, virtualized) and the big data challenge. With Hadoop's replication scheme, data loss can only occur if three drives fail prior to any one of the failures being corrected; Hadoop is written in Java. There are many popular database management systems such as MySQL [4], PostgreSQL [63], and Oracle [5]. Most commonly

  6. SPIRE Data-Base Management System

    Fuechsel, C. F.

    1984-01-01

    The Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. The data bases are typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. The SPIRE DBMS is designed to support data requests from interactive users as well as applications programs.

  7. Database Support for Workflow Management: The WIDE Project

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  8. GiSAO.db: a database for ageing research

    Grillari Johannes

    2011-05-01

    Full Text Available Abstract Background Age-related gene expression patterns of Homo sapiens as well as of model organisms such as Mus musculus, Saccharomyces cerevisiae, Caenorhabditis elegans and Drosophila melanogaster are a basis for understanding the genetic mechanisms of ageing. For an effective analysis and interpretation of expression profiles it is necessary to store and manage huge amounts of data in an organized way, so that these data can be accessed and processed easily. Description GiSAO.db (Genes involved in senescence, apoptosis and oxidative stress database) is a web-based database system for storing and retrieving ageing-related experimental data. Expression data of genes and miRNAs, annotation data like gene identifiers and GO terms, ortholog data and data of follow-up experiments are stored in the database. A user-friendly web application provides access to the stored data. KEGG pathways were incorporated and links to external databases augment the information in GiSAO.db. Search functions facilitate retrieval of data which can also be exported for further processing. Conclusions We have developed a centralized database that is very well suited for the management of data for ageing research. The database can be accessed at https://gisao.genome.tugraz.at and all the stored data can be viewed with a guest account.

  9. Reldata - a tool for reliability database management

    Vinod, Gopika; Saraf, R.K.; Babar, A.K.; Sanyasi Rao, V.V.S.; Tharani, Rajiv

    2000-01-01

    Component failure, repair and maintenance data are a very important element of any Probabilistic Safety Assessment study. The credibility of the results of such a study is enhanced if the data used are generated from the operating experience of similar power plants. Towards this objective, a computerised database has been designed, with fields such as date and time of failure, component name, failure mode, failure cause, means of failure detection, reactor operating power status, repair times, down time, etc. This allows evaluation of plant-specific failure rates and on-demand failure probabilities/unavailabilities for all components. Systematic data updating can provide real-time component reliability parameter statistics and trend analysis, which helps in planning maintenance strategies. A software package, RELDATA, has been developed which incorporates the database management and data analysis methods. This report describes the software features and the underlying methodology in detail. (author)
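
    As a rough illustration of how such a database supports the evaluation of plant-specific failure rates and unavailability, the sketch below derives point estimates from a handful of invented failure/repair records; the field names are hypothetical and do not reflect RELDATA's actual schema.

      # Point estimates of failure rate and unavailability from failure/repair records
      # (illustrative sketch; record fields are hypothetical, not RELDATA's actual schema).
      records = [
          {"component": "P-101", "downtime_h": 12.0},
          {"component": "P-101", "downtime_h": 6.0},
          {"component": "P-101", "downtime_h": 20.0},
      ]
      observation_time_h = 8760.0 * 5        # five calendar years of operating experience

      n_failures = len(records)
      total_downtime = sum(r["downtime_h"] for r in records)

      failure_rate = n_failures / observation_time_h          # failures per hour
      mttr = total_downtime / n_failures                      # mean time to repair
      unavailability = failure_rate * mttr / (1 + failure_rate * mttr)

      print(f"lambda = {failure_rate:.2e} /h, MTTR = {mttr:.1f} h, U = {unavailability:.2e}")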

  10. HATCHES - a thermodynamic database and management system

    Cross, J.E.; Ewart, F.T.

    1990-03-01

    The Nirex Safety Assessment Research Programme has been compiling the thermodynamic data necessary to allow simulations of the aqueous behaviour of the elements important to radioactive waste disposal to be made. These data have been obtained from the literature, when available, and validated for the conditions of interest by experiment. In order to maintain these data in an accessible form and to satisfy quality assurance on all data used for assessments, a database has been constructed which resides on a personal computer operating under MS-DOS using the Ashton-Tate dBase III program. This database contains all the input data fields required by the PHREEQE program and, in addition, a body of text which describes the source of the data and the derivation of the PHREEQE input parameters from the source data. The HATCHES system consists of this database, a suite of programs to facilitate the searching and listing of data and a further suite of programs to convert the dBase III files to PHREEQE database format. (Author)

  11. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  12. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  13. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  14. Nuclear power plant reliability database management

    Meslin, Th.; Aufort, P.

    1996-04-01

    In the framework of the development of an on-site probabilistic safety project (the notion of a living PSA), the Saint Laurent des Eaux NPP implements a specific EDF reliability database. The main goals of this project at Saint Laurent des Eaux are: to expand risk analysis and to constitute an effective local basis for thinking about operating safety by requiring the participation of all departments of the power plant (analysis of all potential operating transients, unavailability consequences, etc.), which means going further than a simple culture of applying operating rules; to involve nuclear power plant operators in experience feedback and its analysis, especially by following up the behaviour of components and of safety functions; and to allow plant safety managers to justify their decisions to the safety authorities regarding waivers, the preventive maintenance programme and operating incident evaluation. Reaching these goals requires feedback data, tools, techniques and the development of skills. The first step is to obtain specific reliability data on the site. Raw data come from the plant maintenance management system, which processes all maintenance activities and keeps records of all component failures and maintenance activities. Plant-specific reliability data are estimated with a Bayesian model which combines these validated raw data with corporate generic data. This approach makes it possible to provide reliability data for the main components modelled in the PSA, to check the consistency of the maintenance programme (RCM), and to verify hypotheses made at the design stage about component reliability. A number of studies, related to component reliability as well as to the decision-making process for specific incident risk evaluation, have been carried out. This paper also provides an overview of the process management set up on site, from the raw database to the specific reliability database, in compliance with established corporate objectives. (authors). 4 figs
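
    The abstract mentions a Bayesian model combining validated plant-specific raw data with corporate generic data. A common way to do this, assumed here purely for illustration (it is not necessarily EDF's exact model), is a gamma-Poisson conjugate update:

      # Gamma-Poisson conjugate update: combine a generic prior with plant-specific
      # failure counts (a common Bayesian scheme; assumed here for illustration only).
      generic_mean = 1.0e-5      # generic failure rate, per hour
      generic_var = (5.0e-6) ** 2

      # Moment-match the generic estimate to a gamma prior Gamma(alpha0, beta0).
      alpha0 = generic_mean**2 / generic_var
      beta0 = generic_mean / generic_var

      plant_failures = 3         # validated plant-specific failure count
      plant_hours = 2.0e5        # plant-specific observation time

      alpha_post = alpha0 + plant_failures
      beta_post = beta0 + plant_hours
      posterior_mean = alpha_post / beta_post
      print(f"plant-specific posterior failure rate: {posterior_mean:.2e} /h")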

  15. Nuclear power plant reliability database management

    Meslin, Th [Electricite de France (EDF), 41 - Saint-Laurent-des-Eaux (France); Aufort, P

    1996-04-01

    In the framework of the development of an on-site probabilistic safety project (the notion of a living PSA), the Saint Laurent des Eaux NPP implements a specific EDF reliability database. The main goals of this project at Saint Laurent des Eaux are: to expand risk analysis and to constitute an effective local basis for thinking about operating safety by requiring the participation of all departments of the power plant (analysis of all potential operating transients, unavailability consequences, etc.), which means going further than a simple culture of applying operating rules; to involve nuclear power plant operators in experience feedback and its analysis, especially by following up the behaviour of components and of safety functions; and to allow plant safety managers to justify their decisions to the safety authorities regarding waivers, the preventive maintenance programme and operating incident evaluation. Reaching these goals requires feedback data, tools, techniques and the development of skills. The first step is to obtain specific reliability data on the site. Raw data come from the plant maintenance management system, which processes all maintenance activities and keeps records of all component failures and maintenance activities. Plant-specific reliability data are estimated with a Bayesian model which combines these validated raw data with corporate generic data. This approach makes it possible to provide reliability data for the main components modelled in the PSA, to check the consistency of the maintenance programme (RCM), and to verify hypotheses made at the design stage about component reliability. A number of studies, related to component reliability as well as to the decision-making process for specific incident risk evaluation, have been carried out. This paper also provides an overview of the process management set up on site, from the raw database to the specific reliability database, in compliance with established corporate objectives. (authors). 4 figs.

  16. Integrated Space Asset Management Database and Modeling

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing those data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
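
    As an illustration of the processing step described above (two-line element data turned into orbital characteristics and cross-referenced with metadata via the NORAD number), the sketch below parses a couple of fixed-column TLE fields and joins them with a metadata record; the sample TLE and metadata are only examples, not SAM-D content.

      # Cross-reference TLE-derived orbital parameters with metadata via the NORAD number
      # (illustrative sketch; the sample TLE and metadata below are only examples).
      import math

      MU_EARTH = 398600.4418  # km^3/s^2

      def parse_tle(line1, line2):
          norad = int(line1[2:7])
          inclination_deg = float(line2[8:16])
          mean_motion_rev_day = float(line2[52:63])
          n_rad_s = mean_motion_rev_day * 2 * math.pi / 86400.0
          semi_major_axis_km = (MU_EARTH / n_rad_s**2) ** (1.0 / 3.0)
          return {"norad": norad, "incl_deg": inclination_deg, "sma_km": semi_major_axis_km}

      metadata = {25544: {"status": "active", "owner": "ISS partners", "category": "crewed station"}}

      tle1 = "1 25544U 98067A   08264.51782528 -.00002182  00000-0 -11606-4 0  2927"
      tle2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"

      orbit = parse_tle(tle1, tle2)
      record = {**orbit, **metadata.get(orbit["norad"], {})}
      print(record)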

  17. NGNP Risk Management Database: A Model for Managing Risk

    Collins, John

    2009-01-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool's design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  18. NGNP Risk Management Database: A Model for Managing Risk

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  19. Integrated Space Asset Management Database and Modeling

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interfaces for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government off the Shelf information sharing platform in use throughout DoD and DHS information sharing and situation awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data are shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  20. Features of TMR for a Successful Clinical and Research Database

    Pryor, David B.; Stead, William W.; Hammond, W. Edward; Califf, Robert M.; Rosati, Robert A.

    1982-01-01

    A database can be used for clinical practice and for research. The design of the database is important if both uses are to succeed. A clinical database must be efficient and flexible. A research database requires consistent observations recorded in a format which permits complete recall of the experience. In addition, the database should be designed to distinguish between missing data and negative responses, and to minimize transcription errors during the recording process.
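
    The point about distinguishing missing data from negative responses can be made concrete with a minimal relational sketch: NULL marks "not recorded", while an explicit 0/1 records a negative or positive finding. The table and column names below are invented, not TMR's schema.

      # Distinguish "missing" from "negative" with NULL vs. explicit 0/1
      # (minimal sqlite3 sketch; table and column names are made up, not TMR's schema).
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE findings (
                         patient_id INTEGER,
                         chest_pain INTEGER  -- 1 = present, 0 = explicitly absent, NULL = not asked
                     )""")
      con.executemany("INSERT INTO findings VALUES (?, ?)",
                      [(1, 1), (2, 0), (3, None)])

      absent  = con.execute("SELECT COUNT(*) FROM findings WHERE chest_pain = 0").fetchone()[0]
      missing = con.execute("SELECT COUNT(*) FROM findings WHERE chest_pain IS NULL").fetchone()[0]
      print(f"explicitly negative: {absent}, missing: {missing}")   # 1 and 1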

  1. Databases

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  2. Land and Waste Management Research Publications

    Resources from the Science Inventory database of EPA's Office of Research and Development, as well as EPA's Science Matters journal, include research on managing contaminated sites and ground water modeling and decontamination technologies.

  3. Access database application in medical treatment management platform

    Wu Qingming

    2014-01-01

    For timely, accurate and flexible access to medical expense data, we used Microsoft Access 2003 database management software to build a management platform for medical expenses. With this platform, overall hospital medical expenses can be controlled and monitored in real time. Using the Access database management platform for medical expenses not only changes the management model, but also promotes a sound management system for medical expenses. (authors)

  4. The Network Configuration of an Object Relational Database Management System

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  5. Computerized nuclear material database management system for power reactors

    Cheng Binghao; Zhu Rongbao; Liu Daming; Cao Bin; Liu Ling; Tan Yajun; Jiang Jincai

    1994-01-01

    The software packages for nuclear material database management for power reactors are described. The database structure, data flow and model for management of the database are analysed. Also mentioned are the main functions and characteristics of the software packages, which have been successfully installed and used at both the Daya Bay Nuclear Power Plant and the Qinshan Nuclear Power Plant for the purpose of handling the nuclear material databases automatically

  6. MonetDB: Two Decades of Research in Column-oriented Database Architectures

    Idreos, S.; Groffen, F.; Nes, N.; Manegold, S.; Mullender, S.; Kersten, M.

    2012-01-01

    MonetDB is a state-of-the-art open-source column-store database management system targeting applications in need for analytics over large collections of data. MonetDB is actively used nowadays in health care, in telecommunications as well as in scientific databases and in data management research,
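
    A toy contrast between row-oriented and column-oriented layouts illustrates why a column store suits analytics: an aggregate needs to touch only the columns involved. This is only an illustration of the general idea, not of MonetDB's internals.

      # Toy contrast between row-oriented and column-oriented layouts
      # (illustration of the column-store idea only, not MonetDB's internals).
      rows = [(i, f"patient{i}", i % 90, i * 0.5) for i in range(100_000)]  # row store

      # Column store: one contiguous array per attribute.
      ids   = [r[0] for r in rows]
      ages  = [r[2] for r in rows]
      costs = [r[3] for r in rows]

      # An analytic query such as "average cost" scans a single column in the
      # column layout, but must touch every full row in the row layout.
      avg_cost_columnar = sum(costs) / len(costs)
      avg_cost_row = sum(r[3] for r in rows) / len(rows)
      assert avg_cost_columnar == avg_cost_row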

  7. Portuguese food composition database quality management system.

    Oliveira, L M; Castanheira, I P; Dantas, M A; Porto, A A; Calhau, M A

    2010-11-01

    The harmonisation of food composition databases (FCDB) has been a recognised need among users, producers and stakeholders of food composition data (FCD). To reach harmonisation of FCDBs among the national compiler partners, the European Food Information Resource (EuroFIR) Network of Excellence set up a series of guidelines and quality requirements, together with recommendations to implement quality management systems (QMS) in FCDBs. The Portuguese National Institute of Health (INSA) is the national FCDB compiler in Portugal and is also a EuroFIR partner. INSA's QMS complies with ISO/IEC (International Organization for Standardisation/International Electrotechnical Commission) 17025 requirements. The purpose of this work is to report on the strategy used and progress made for extending INSA's QMS to the Portuguese FCDB in alignment with EuroFIR guidelines. A stepwise approach was used to extend INSA's QMS to the Portuguese FCDB. The approach included selection of reference standards and guides and the collection of relevant quality documents directly or indirectly related to the compilation process; selection of the adequate quality requirements; assessment of adequacy and level of requirement implementation in the current INSA's QMS; implementation of the selected requirements; and EuroFIR's preassessment 'pilot' auditing. The strategy used to design and implement the extension of INSA's QMS to the Portuguese FCDB is reported in this paper. The QMS elements have been established by consensus. ISO/IEC 17025 management requirements (except 4.5) and 5.2 technical requirements, as well as all EuroFIR requirements (including technical guidelines, FCD compilation flowchart and standard operating procedures), have been selected for implementation. The results indicate that the quality management requirements of ISO/IEC 17025 in place in INSA fit the needs for document control, audits, contract review, non-conformity work and corrective actions, and users' (customers

  8. Grantees Guide to Research Databases at IDRC

    Contents include: creating search alerts; the IDRC Digital Library (IDL); key contacts; and conditions of use for the commercial databases. These resources are governed by license agreements which restrict use to IDRC employees and grantees taking ...

  9. The Human Communication Research Centre dialogue database.

    Anderson, A H; Garrod, S C; Clark, A; Boyle, E; Mullin, J

    1992-10-01

    The HCRC dialogue database consists of over 700 transcribed and coded dialogues from pairs of speakers aged from seven to fourteen. The speakers are recorded while tackling co-operative problem-solving tasks and the same pairs of speakers are recorded over two years tackling 10 different versions of our two tasks. In addition there are over 200 dialogues recorded between pairs of undergraduate speakers engaged on versions of the same tasks. Access to the database, and to its accompanying custom-built search software, is available electronically over the JANET system by contacting liz@psy.glasgow.ac.uk, from whom further information about the database and a user's guide to the database can be obtained.

  10. Research Directions in Database Security IV

    1993-07-01

    second algorithm, which is based on multiversion timestamp ordering, is that high-level transactions can be forced to read arbitrarily old data values...system. The first, the single-version model, stores only the latest version of each data item, while the second, the multiversion model, stores... Multiversion Database Model: In the standard database model, where there is only one version of each data item, all transactions compete for the most recent

  11. A multidisciplinary database for geophysical time series management

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of a period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization makes it possible to perform operations, such as query and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity) and files accessible from the Internet (web pages, XML). In particular, the loader layer checks the working status of each running piece of software through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range or to follow real-time signal acquisition, according to a user data access policy.
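
    The standardization step described above (bringing heterogeneous measures onto a common time scale so they can be queried and visualized together) can be sketched with pandas; the channel names and sampling rates below are invented.

      # Align two heterogeneous time series onto a common time scale
      # (pandas sketch of the standardization idea; channel names are made up).
      import numpy as np
      import pandas as pd

      # A 10-second seismic-amplitude channel and an irregular gas-flux channel.
      t1 = pd.date_range("2013-01-01 00:00:00", periods=360, freq="10s")
      seismic = pd.Series(np.random.rand(360), index=t1, name="seismic_rms")

      t2 = pd.to_datetime(["2013-01-01 00:00:07", "2013-01-01 00:12:41", "2013-01-01 00:48:03"])
      gas_flux = pd.Series([120.0, 135.0, 128.0], index=t2, name="so2_flux")

      # Common one-minute time scale: average the fast channel, forward-fill the slow one.
      common = pd.concat([
          seismic.resample("1min").mean(),
          gas_flux.resample("1min").ffill(),
      ], axis=1)
      print(common.head())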

  12. An Introduction to the DB Relational Database Management System

    Ward, J.R.

    1982-01-01

    This paper is an introductory guide to using the Db programs to maintain and query a relational database on the UNIX operating system. In the past decade, increasing interest has been shown in the development of relational database management systems. Db is an attempt to incorporate a flexible and powerful relational database system within the user environment presented by the UNIX operating system. The family of Db programs is useful for maintaining a database of information that i...

  13. Geo-scientific database for research and development purposes

    Tabani, P.; Mangeot, A.; Crabol, V.; Delage, P.; Dewonck, S.; Auriere, C.

    2012-01-01

    Document available in extended abstract form only. The Research and Development Division must manage, in a secure and reliable manner, a large amount of data from various scientific disciplines and diverse means of acquisition (observations, measurements, experiments, etc.). This management is particularly important for the Underground Research Laboratory, the source of many continuously recorded measurements. Thus, from its conception, Andra has implemented two management tools for scientific information, the 'Acquisition System and Data Management' [SAGD] and the GEO database with its associated applications. Beyond its own needs, Andra wants to share its achievements with the scientific community, and it therefore provides the data stored in its databases, or samples of rock or water, when they are available. Acquisition and Data Management (SAGD): this system manages data from sensors installed at several sites. Some sites are on the surface (piezometric, atmospheric and environmental stations), the others are in the Underground Research Laboratory. The system also incorporates data from experiments in which Andra participates at the Mont Terri Laboratory in Switzerland. SAGD fulfils these objectives by: - making available in real time, on a single system, all experimental data from the measurement points to scientists from Andra as well as to the partners or providers who need them; - displaying the recorded data over temporal windows and at specific time steps; - allowing remote control of the experiments; - ensuring the traceability of all recorded information; - ensuring data storage in a database. SAGD was deployed in the first experimental drift at -445 m in November 2004. It was subsequently extended to the underground Mont Terri laboratory in Switzerland in 2005, to the entire surface logging network of the Meuse / Haute-Marne Center in 2008 and to the environmental network in 2011. All information is acquired, stored and managed by a software package called Geoscope. This software

  14. System factors influencing utilisation of Research4Life databases by ...

    This is a comprehensive investigation of the influence of system factors on utilisation of Research4Life databases. It is part of a doctoral dissertation. Research4Life databases are new innovative technologies being investigated in a new context – utilisation by NARIs scientists for research. The study adopted the descriptive ...

  15. Resource Survey Relational Database Management System

    National Oceanic and Atmospheric Administration, Department of Commerce — Mississippi Laboratories employ both enterprise and localized data collection systems for recording data. The databases utilized by these applications range from...

  16. Capacity development in food composition database management and nutritional research and education in Central and Eastern European, Middle Eastern and North African countries.

    Gurinović, M; Witthöft, C M; Tepšić, J; Ranić, M; Hulshof, P J M; Hollman, P C; Porubska, J; Gohar, A; Debeljak-Martačić, J; Petrović-Oggiano, G; Novaković, R; Glibetić, M; Oshaug, A

    2010-11-01

    Capacity development (CD) in food and nutrition is much more than formal training and includes human resource development, and organisational, institutional and legal framework development with the aim of enhancing nutrition-relevant knowledge and skills to support infrastructural development. The goal of the European Food Information Resource (EuroFIR) Network of Excellence has been to develop and integrate food composition data throughout Europe. EuroFIR joined forces in CD with the United Nations (UN) University and UN System Standing Committee on Nutrition, the Network for Capacity Development in Nutrition in Central and Eastern Europe, the Central and Eastern European Countries Food Data Systems network and with the Middle East and North African Capacity Building Initiative. The aim of this paper is to discuss an inventory of the status of food composition databases (FCDBs) and the training needs of compilers in non-EuroFIR countries in Central and Eastern Europe (CEE) and in the Middle East and North Africa (MENA), and to present the CD achieved through EuroFIR and other network collaborations. Two online questionnaires were created addressing the FCDB status and specific training needs in countries of the targeted regions. Data were collected during 2006-2008 and then analysed. Subsequently, CD activities were organised. Contacts were established in 19 CEE and 7 MENA countries, of which several had national food composition tables, but no electronic versions. Education, training, workshops, networking and the sharing of experiences were uniformly requested. Subsequently, CD activities in EuroFIR were organised focussing on food composition courses, exchange visits, workshops and individual training for PhD students, junior scientists and other staff categories, as well as conferences linked to food composition research and food information. To facilitate CD activities, EuroFIR has signed a Memorandum of Understanding with the Czech Republic, Hungary

  17. Privacy and Data-Based Research

    Ori Heffetz; Katrina Ligett

    2013-01-01

    What can we, as users of microdata, formally guarantee to the individuals (or firms) in our dataset, regarding their privacy? We retell a few stories, well-known in data-privacy circles, of failed anonymization attempts in publicly released datasets. We then provide a mostly informal introduction to several ideas from the literature on differential privacy, an active literature in computer science that studies formal approaches to preserving the privacy of individuals in statistical databases...

  18. Research Directions in Database Security, II

    1990-11-01

    5. Flexible Access Controls: Bill Maimone of Oracle Corporation gave a presentation of Oracle's new roles facility. The approach is apparently motivated ...(See rule 5 for substitution of DAC mechanisms.) PS4: Overclassification of data is to be avoided. PS5: Authorization to update data and create...of designing databases as opposed to the abstract nature of operating system requirements. The primary motivation behind developing the Homework

  19. Research in Hospitality Management

    Research in Hospitality Management (RHM) is a peer-reviewed journal ... to the quintessential managerial areas of Finance, Human Resources, Operations, ... competency and career development of hospitality management students

  20. Enhanced DIII-D Data Management Through a Relational Database

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Documentation on the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
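
    A minimal sketch of the kind of cross-shot SQL query the abstract refers to is shown below, using sqlite3 so it runs standalone; the table and column names are invented and are not the actual DIII-D schema.

      # Cross-shot query sketch (sqlite3; table and column names are invented,
      # not the actual DIII-D schema).
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE shot_summary (shot INTEGER PRIMARY KEY, ip_ma REAL, betan REAL)")
      con.executemany("INSERT INTO shot_summary VALUES (?, ?, ?)",
                      [(100001, 1.2, 1.8), (100002, 1.5, 2.4), (100003, 0.9, 1.1)])

      # "Data mining" across shots: all discharges above a normalized-beta threshold.
      for shot, ip, betan in con.execute(
              "SELECT shot, ip_ma, betan FROM shot_summary WHERE betan > ? ORDER BY shot", (1.5,)):
          print(shot, ip, betan)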

  1. A user's manual for managing database system of tensile property

    Ryu, Woo Seok; Park, S. J.; Kim, D. H.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the tensile database system used to manage tensile property test data. The database, constructed from the data produced by tensile property tests, can increase the application of test results. We can also easily retrieve basic data from the database when preparing a new experiment, and can produce better results by comparison with previous data. To develop the database we must analyze and design the application carefully; after that, we can offer the best quality in response to customers' various requirements. The tensile database system was developed as a web application using Java, PL/SQL and JSP (Java Server Pages)

  2. Database basic design for safe management radioactive waste

    Son, D. C.; Ahn, K. I.; Jung, D. J.; Cho, Y. B.

    2003-01-01

    As the amount of radioactive waste and of related information to be managed is increasing, some organizations are carrying out or planning the computerization of radioactive waste management. Considering that information on the safe management of radioactive waste should be used in association with the national radioactive waste management project, standardization of data formats and protocols is required. The Korea Institute of Nuclear Safety (KINS) will establish and operate a nationwide integrated database in order to effectively manage a large amount of information on national radioactive waste. This database makes it possible not only to trace and manage trends in radioactive waste generation and storage but also to produce reliable analysis results for the quantities accumulated. Consequently, we can provide the information necessary for national radioactive waste management policy and for related industry planning. This study explains the database design, which is the essential element of information management

  3. MST radar data-base management

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar data base is discussed in terms of purpose, centralization, scope, and nature of the data base management system.

  4. Management Information Systems Research.

    Research on management information systems is elusive in many respects. Part of the basic research problem in MIS stems from the absence of standard...decision making. But the transition from these results to the realization of 'satisfactory' management information systems remains difficult indeed. The...paper discusses several aspects of research on management information systems and reviews a selection of efforts that appear significant for future progress. (Author)

  5. Computer Application Of Object Oriented Database Management ...

    Object Oriented Systems (OOS) have been widely adopted in software engineering because of their superiority with respect to data extensibility. The present trend in the software engineering process (SEP) towards concurrent computing raises novel concerns for the facilities and technology available in database ...

  6. The Use of a Relational Database in Qualitative Research on Educational Computing.

    Winer, Laura R.; Carriere, Mario

    1990-01-01

    Discusses the use of a relational database as a data management and analysis tool for nonexperimental qualitative research, and describes the use of the Reflex Plus database in the Vitrine 2001 project in Quebec to study computer-based learning environments. Information systems are also discussed, and the use of a conceptual model is explained.…

  7. [The future of clinical laboratory database management system].

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  8. Nuclear data processing using a database management system

    Castilla, V.; Gonzalez, L.

    1991-01-01

    A database management system that permits the design of relational models was used to create an integrated database of experimental and evaluated nuclear data. A system that reduces the time and cost of processing was created for EC-type computers and compatibles. A set of programs for converting calculated nuclear data output formats to the EXFOR format was developed. A dictionary to perform retrospective searches in the ENDF database was also created

  9. International database on ageing management and life extension

    Ianko, L.; Lyssakov, V.; McLachlan, D.; Russell, J.; Mukhametshin, V.

    1995-01-01

    International database on ageing management and life extension for reactor pressure vessel materials (RPVM) is described with the emphasis on the following issues: requirements of the system; design concepts for RPVM database system; data collection, processing and storage; information retrieval and dissemination; RPVM information assessment and evaluation. 1 fig

  10. Managing Multiuser Database Buffers Using Data Mining Techniques

    Feng, L.; Lu, H.J.

    2004-01-01

    In this paper, we propose a data-mining-based approach to public buffer management for a multiuser database system, where database buffers are organized into two areas – public and private. While the private buffer areas contain pages to be updated by particular users, the public
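
    The two-area organization can be sketched as follows: per-user private buffers hold pages being updated, while a shared public area caches read pages. Plain LRU replacement is used here for simplicity; the paper's data-mining-based policy is not reproduced.

      # Two-area buffer sketch: per-user private buffers for pages being updated,
      # plus a shared public LRU cache for read pages (plain LRU, not the paper's
      # data-mining-based replacement policy).
      from collections import OrderedDict

      class BufferManager:
          def __init__(self, public_capacity=4):
              self.public = OrderedDict()          # page_id -> page data (LRU order)
              self.public_capacity = public_capacity
              self.private = {}                    # user_id -> {page_id: page data}

          def read(self, page_id, fetch):
              if page_id in self.public:
                  self.public.move_to_end(page_id)          # LRU hit
                  return self.public[page_id]
              page = fetch(page_id)                          # miss: load from "disk"
              self.public[page_id] = page
              if len(self.public) > self.public_capacity:
                  self.public.popitem(last=False)            # evict least recently used
              return page

          def pin_for_update(self, user_id, page_id, fetch):
              page = self.public.pop(page_id, None) or fetch(page_id)
              self.private.setdefault(user_id, {})[page_id] = page
              return page

      fetch = lambda pid: f"contents of page {pid}"
      bm = BufferManager()
      bm.read(1, fetch); bm.read(2, fetch); bm.pin_for_update("alice", 2, fetch)
      print(list(bm.public), bm.private)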

  11. CALCOM Database for managing California Commercial Groundfish sample data

    National Oceanic and Atmospheric Administration, Department of Commerce — The CALCOM database is used by the California Cooperative Groundfish Survey to store and manage Commercial market sample data. This data is ultimately used to...

  12. Development of environment radiation database management system

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun [Daeduk College, Taejon (Korea, Republic of)

    1999-03-15

    In this development, we constructed a database for efficient processing and handling of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on the WWW service policies of the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  13. Development of environment radiation database management system

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun

    1999-03-01

    In this development, we constructed a database for efficient processing and handling of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on the WWW service policies of the KINS Intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation

  14. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors

    Macedo, Vagner dos Santos

    2016-01-01

    The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN), in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to safety, reliability, availability and maintainability of these facilities. This database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they will be able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, which is in accordance with the protocols installed in the IPEN intranet. The open-source relational database management system called MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)
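
    A minimal entity-relationship sketch of a component reliability schema of this kind is shown below. The tables are invented and only loosely inspired by the description; sqlite3 is used so the example is self-contained, although the system described uses MySQL and PHP.

      # Minimal entity-relationship sketch for a component reliability database
      # (invented tables; run with sqlite3 so the example is self-contained).
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE component (
          component_id INTEGER PRIMARY KEY,
          name TEXT NOT NULL,
          system TEXT NOT NULL
      );
      CREATE TABLE failure_event (
          event_id INTEGER PRIMARY KEY,
          component_id INTEGER NOT NULL REFERENCES component(component_id),
          event_date TEXT NOT NULL,
          failure_mode TEXT,
          downtime_h REAL
      );
      """)
      con.execute("INSERT INTO component VALUES (1, 'primary pump', 'cooling')")
      con.execute("INSERT INTO failure_event VALUES (1, 1, '2015-06-01', 'seal leak', 8.0)")

      row = con.execute("""SELECT c.name, COUNT(*) AS n_failures, SUM(e.downtime_h)
                           FROM component c JOIN failure_event e USING (component_id)
                           GROUP BY c.component_id""").fetchone()
      print(row)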

  15. A framework for cross-observatory volcanological database management

    Aliotta, Marco Antonio; Amore, Mauro; Cannavò, Flavio; Cassisi, Carmelo; D'Agostino, Marcello; Dolce, Mario; Mastrolia, Andrea; Mangiagli, Salvatore; Messina, Giuseppe; Montalto, Placido; Fabio Pisciotta, Antonino; Prestifilippo, Michele; Rossi, Massimo; Scarpato, Giovanni; Torrisi, Orazio

    2017-04-01

    In recent years, it has been clearly shown that the multiparametric approach is the winning strategy for investigating the complex dynamics of volcanic systems. This involves the use of different sensor networks, each one dedicated to the acquisition of particular data useful for research and monitoring. The increasing interest devoted to the study of volcanological phenomena has led to the constitution of different research organizations or observatories, sometimes covering the same volcanoes, which acquire large amounts of data from sensor networks for multiparametric monitoring. At INGV we developed a framework, hereinafter called TSDSystem (Time Series Database System), which makes it possible to acquire data streams from several permanent geophysical and geochemical sensor networks (also represented by different data sources such as ASCII, ODBC, URL etc.), located in the main volcanic areas of Southern Italy, and relate them within a relational database management system. Furthermore, spatial data related to the different datasets are managed using a GIS module for sharing and visualization purposes. The standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them on a common time scale. In order to share data between INGV observatories, and also with the Civil Protection, whose activity covers the same volcanic districts, we designed a "Master View" system that, starting from a number of instances of the TSDSystem framework (one for each observatory), makes possible the joint interrogation of data, both temporal and spatial, across instances located at different observatories, through the use of web services technology (RESTful, SOAP). Similarly, it provides metadata for equipment using standard schemas (such as FDSN StationXML). The "Master View" is also responsible for managing the data policy through a "who owns what" system, which allows you to associate viewing/download of

  16. Presidential Libraries Museum Collection Management Database

    National Archives and Records Administration — MCMD serves as a descriptive catalog for the Presidential Libraries museum collections, and also supports a full range of museum collections management processes...

  17. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    Viegas, F; Nairz, A; Goossens, L [CERN, CH-1211 Geneve 23 (Switzerland); Malon, D; Cranshaw, J [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States); Dimitrov, G [DESY, D-22603 Hamburg (Germany); Nowak, M; Gamboa, C [Brookhaven National Laboratory, PO Box 5000 Upton, NY 11973-5000 (United States); Gallas, E [University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Wong, A [Triumf, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Vinek, E [University of Vienna, Dr.-Karl-Lueger-Ring 1, 1010 Vienna (Austria)

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. This data will be produced at a nominal rate of 200 Hz, and is uploaded into a relational database for access from websites and other tools. The estimated database volume is 6TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production makes this application a challenge to data and resource management, in many aspects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to the CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; controlling resource usage of the database, from the user query load to the strategy of cleaning and archiving of old TAG data.

  18. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    Viegas, F; Nairz, A; Goossens, L; Malon, D; Cranshaw, J; Dimitrov, G; Nowak, M; Gamboa, C; Gallas, E; Wong, A; Vinek, E

    2010-01-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. This data will be produced at a nominal rate of 200 Hz, and is uploaded into a relational database for access from websites and other tools. The estimated database volume is 6TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production makes this application a challenge to data and resource management, in many aspects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to the CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; controlling resource usage of the database, from the user query load to the strategy of cleaning and archiving of old TAG data.

  19. The Vocational Guidance Research Database: A Scientometric Approach

    Flores-Buils, Raquel; Gil-Beltran, Jose Manuel; Caballer-Miedes, Antonio; Martinez-Martinez, Miguel Angel

    2012-01-01

    The scientometric study of scientific output through publications in specialized journals cannot be undertaken exclusively with the databases available today. For this reason, the objective of this article is to introduce the "Base de Datos de Investigacion en Orientacion Vocacional" [Vocational Guidance Research Database], based on the…

  20. Report of the SRC working party on databases and database management systems

    Crennell, K.M.

    1980-10-01

    An SRC working party, set up to consider the subject of support for databases within the SRC, was asked to identify interested individuals and user communities, establish which features of database management systems they felt were desirable, arrange demonstrations of possible systems and then make recommendations for systems, funding and likely manpower requirements. This report describes the activities and lists the recommendations of the working party, and contains a list of databases maintained or proposed by those who replied to a questionnaire. (author)

  1. Insertion algorithms for network model database management systems

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for managing algorithms is to minimize the number of query comparisons. We consider the updating operation for network model database management systems and develop a new sequential algorithm for it. We also suggest a distributed version of the algorithm.

  2. Managing the BaBar object oriented database

    Hasan, A.; Trunov, A.

    2001-01-01

    The BaBar experiment stores its data in an Object Oriented federated database supplied by Objectivity/DB(tm). This database is currently 350TB in size and is expected to increase considerably as the experiment matures. Management of this database requires careful planning and specialized tools in order to make the data available to physicists in an efficient and timely manner. The authors discuss the operational issues and management tools that were developed during the previous run to deal with this vast quantity of data at SLAC

  3. Database management system for large container inspection system

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    Large Container Inspection System (LCIS) based on radiation imaging technology is a powerful tool for the Customs to check the contents inside a large container without opening it. The author discusses a database application system, as a part of the Signal and Image System (SIS), for the LCIS. A basic requirements analysis was done first. Then the computer hardware, operating system, and database management system were selected according to the available technology and market products. Based on these considerations, a database application system with central management and distributed operation features has been implemented.

  4. Development of Krsko Severe Accident Management Database (SAMD)

    Basic, I.; Kocnar, R.

    1996-01-01

    Severe Accident Management is a framework to identify and implement the Emergency Response Capabilities that can be used to prevent or mitigate severe accidents and their consequences. Krsko Severe Accident Management Database documents the severe accident management activities which are developed in the NPP Krsko, based on the Krsko IPE (Individual Plant Examination) insights and Generic WOG SAMGs (Westinghouse Owners Group Severe Accident Management Guidance). (author)

  5. Relational Information Management Data-Base System

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    A DBMS with several features particularly useful to scientists and engineers. RIM5 interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  6. TaxMan: a taxonomic database manager

    Blaxter Mark

    2006-12-01

    Background: Phylogenetic analysis of large, multiple-gene datasets, assembled from public sequence databases, is rapidly becoming a popular way to approach difficult phylogenetic problems. Supermatrices (concatenated multiple sequence alignments of multiple genes) can yield more phylogenetic signal than individual genes. However, manually assembling such datasets for a large taxonomic group is time-consuming and error-prone. Additionally, sequence curation, alignment and assessment of the results of phylogenetic analysis are made particularly difficult by the potential for a given gene in a given species to be unrepresented, or to be represented by multiple or partial sequences. We have developed a software package, TaxMan, that largely automates the processes of sequence acquisition, consensus building, alignment and taxon selection to facilitate this type of phylogenetic study. Results: TaxMan uses freely available tools to allow rapid assembly, storage and analysis of large, aligned DNA and protein sequence datasets for user-defined sets of species and genes. The user provides GenBank format files and a list of gene names and synonyms for the loci to analyse. Sequences are extracted from the GenBank files on the basis of annotation and sequence similarity. Consensus sequences are built automatically. Alignment is carried out (where possible, at the protein level) and aligned sequences are stored in a database. TaxMan can automatically determine the best subset of taxa to examine phylogeny at a given taxonomic level. By using the stored aligned sequences, large concatenated multiple sequence alignments can be generated rapidly for a subset and output in analysis-ready file formats. Trees resulting from phylogenetic analysis can be stored and compared with a reference taxonomy. Conclusion: TaxMan allows rapid automated assembly of a multigene dataset of aligned sequences for large taxonomic groups. By extracting sequences on the basis of
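
    The sketch below illustrates the annotation-based extraction step that TaxMan automates; it is not TaxMan's own code, and the file name, the locus synonyms and the use of Biopython are assumptions for illustration.

```python
# Illustrative sketch only (not TaxMan itself): pull sequences for one locus out
# of GenBank-format files on the basis of annotation, the kind of step TaxMan
# automates. Requires Biopython; file name and gene synonyms are placeholders.
from Bio import SeqIO

GENE_SYNONYMS = {"COI", "COX1", "CO1"}  # user-supplied synonyms for a single locus

def extract_locus(genbank_path, synonyms=GENE_SYNONYMS):
    """Yield (species, sequence) pairs for CDS features annotated with the locus."""
    wanted = {s.upper() for s in synonyms}
    for record in SeqIO.parse(genbank_path, "genbank"):
        species = record.annotations.get("organism", "unknown")
        for feature in record.features:
            if feature.type != "CDS":
                continue
            names = {g.upper() for g in feature.qualifiers.get("gene", [])}
            if names & wanted:
                yield species, feature.extract(record.seq)

if __name__ == "__main__":
    for species, seq in extract_locus("sequences.gb"):
        print(species, len(seq))
```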

  7. Representing clinical communication knowledge through database management system integration.

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

    Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels form a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough methodology literature review to identify strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study the clinical communication activities among Intensive Care Unit (ICU) clinical teams with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument, and we introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository.

  8. Application of cloud database in the management of clinical data of patients with skin diseases.

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of using a cloud database in the daily practice of a dermatology department. The cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores of self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. The disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  9. Loss Database Architecture for Disaster Risk Management

    RIOS DIAZ FRANCISCO; MARIN FERRER MONTSERRAT

    2018-01-01

    The reformed Union civil protection legislation (Decision on a Union Civil Protection Mechanism), which entered into force on 1 January 2014, is paving the way for more resilient communities by including key actions related to disaster prevention such as developing national risk assessments and the refinement of risk management planning. Under the Decision, Member States agreed to “develop risk assessments at national or appropriate sub-national level and make available to the Commission a s...

  10. Research in Hospitality Management


  11. Development of a Relational Database for Learning Management Systems

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, web-based distance education systems are of great importance; they are usually known as Learning Management Systems (LMS). In this article, a database design developed to serve an educational institution as a Learning Management System is described. In this sense, the developed Learning…

  12. Database management in the new GANIL control system

    Lecorche, E.; Lermine, P.

    1993-01-01

    At the start of the new control system design, a decision was made to manage the huge amount of data by means of a database management system. The first implementations, built on the INGRES relational database, are described. Real-time and data management domains are shown, and problems induced by Ada/SQL interfacing are briefly discussed. Database management covers the whole hardware and software configuration for the GANIL equipment and the alarm system, both for the alarm configuration and for the alarm logs. Another field of application encompasses beam parameter archiving as a function of the various kinds of beams accelerated at GANIL (ion species, energies, charge states). (author) 3 refs., 4 figs

  13. Using a database to manage resolution of comments on standards

    Holloran, R.W.; Kelley, R.P.

    1995-01-01

    Features of production systems that would enhance development and implementation of procedures and other standards were first suggested in 1988; that work described how a database could provide the features sought for managing the content of structured documents such as standards and procedures. This paper describes enhancements of the database that manage the more complex links associated with resolution of comments. Displaying the linked information on a computer display aids comment resolvers. A hardcopy report generated by the database permits others to independently evaluate the resolution of comments in context with the original text of the standard, the comment, and the revised text of the standard. Because the links are maintained by the database, consistency between the agreed-upon resolutions and the text of the standard can be maintained throughout subsequent reviews of the standard. Each of the links is bidirectional; i.e., the relationships between any two documents can be viewed from the perspective of either document
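
    A minimal sketch of the idea of database-maintained, bidirectional links between a standard's text, comments and resolutions is given below; the table and column names are assumptions for illustration, not the schema described by the authors.

```python
# Minimal sketch (assumed schema): comments and resolutions are linked to sections
# of a standard, so the same stored links serve both comment resolvers and the
# hardcopy report, and the text and resolutions stay consistent.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE section    (id INTEGER PRIMARY KEY, number TEXT, text TEXT);
CREATE TABLE comment    (id INTEGER PRIMARY KEY, section_id INTEGER REFERENCES section(id),
                         reviewer TEXT, body TEXT);
CREATE TABLE resolution (id INTEGER PRIMARY KEY, comment_id INTEGER REFERENCES comment(id),
                         disposition TEXT, revised_text TEXT);
""")
conn.execute("INSERT INTO section VALUES (1, '4.2', 'Valves shall be tested annually.')")
conn.execute("INSERT INTO comment VALUES (1, 1, 'reviewer A', 'Specify the test pressure.')")
conn.execute("INSERT INTO resolution VALUES "
             "(1, 1, 'accepted', 'Valves shall be tested annually at 1.5 times design pressure.')")

# Because each link is stored once, it can be traversed from either side:
# all comments on a section, or the section and revised text behind a comment.
for row in conn.execute("""
        SELECT s.number, c.body, r.disposition, r.revised_text
        FROM section s JOIN comment c ON c.section_id = s.id
                       JOIN resolution r ON r.comment_id = c.id"""):
    print(row)
```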

  14. DOE technology information management system database study report

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  15. TRENDS: The aeronautical post-test database management system

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  16. Towards the management of the databases founded on descriptions ...

    The canonical model is defined in the concept language developed in our research ... the notion of classes to produce descriptions which are also used in the reasoning process. ... Keywords: Description logics / Databases / Semantics.

  17. Kingfisher: a system for remote sensing image database management

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing amount of images to be collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for a remote sensing image database with an innovative search tool based on image similarity. This methodology is quite innovative for this application; at present many systems exist for photographic images, for example QBIC and IKONA, but they are not able to extract and describe remote sensing image content properly. The target database is set by an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without losses independently of image resolution. The latter property allows the DBMS (Database Management System) to process a low amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). Local and global content descriptors are compared, during the retrieval phase, with the query image, and the results seem to be very encouraging.
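
    The toy sketch below shows the general shape of retrieval by feature similarity; the descriptors used are simple stand-ins, not the texture parameters of the Kingfisher system.

```python
# Toy content-based retrieval sketch (descriptors are stand-ins, not Kingfisher's):
# each image is reduced to a small feature vector and queries return the nearest
# stored images by Euclidean distance in feature space.
import numpy as np

def features(image):
    """Crude resolution-independent descriptors: mean, variance and gradient energy."""
    gx, gy = np.gradient(image.astype(float))
    return np.array([image.mean(), image.var(), np.mean(gx ** 2 + gy ** 2)])

def retrieve(query_img, archive, k=3):
    """Return the k archive keys whose feature vectors are closest to the query."""
    q = features(query_img)
    dists = {name: np.linalg.norm(features(img) - q) for name, img in archive.items()}
    return sorted(dists, key=dists.get)[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    archive = {f"scene_{i}": rng.random((64, 64)) for i in range(10)}
    print(retrieve(rng.random((64, 64)), archive))
```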

  18. A Philosophy Research Database to Share Data Resources

    Jili Cheng

    2007-12-01

    Philosophy research used to rely mainly on traditionally published journals and newspapers for collecting and communicating data. However, because of financial limits or a lack of capability to collect data, required published materials, restricted materials, and developing information from research projects often could not be obtained. The rise of digital techniques and Internet opportunities has made data resource sharing possible for philosophy research. However, although there are several ICPs with large-scale comprehensive commercial databases in the field in China, no real non-profit professional database for philosophy researchers exists. Therefore, in 2002, the Philosophy Institute of the Chinese Academy of Social Sciences began a project to build "The Database of Philosophy Research." By March 2006 the number of subsets had reached 30, with more than 30,000 records; retrieval services had reached 6,000, and article readings had reached 30,000. Because of intellectual property considerations, the service of the database is currently limited to the information held in CASS. Nevertheless, this is the first academic database for philosophy research, so its orientation is towards resource sharing, leading users to data, and serving the large number of demands from other provinces and departments.

  19. The use of modern databases in managing nuclear material inventories

    Behrens, R.G.

    1994-01-01

    The need for a useful nuclear materials database to assist in the management of nuclear materials within the Department of Energy (DOE) Weapons Complex is becoming significantly more important as the mission of the DOE Complex changes and both international safeguards and storage issues become drivers in determining how these materials are managed. A well-designed nuclear material inventory database can provide the Nuclear Materials Manager with an essential, cost-effective tool for timely analysis and reporting of inventories. This paper discusses the use of databases as a management tool to meet increasing requirements for accurate and timely information on nuclear material inventories and related information. From the end-user perspective, this paper discusses the rationale, philosophy, and technical requirements for an integrated database to meet the needs of a variety of users such as those working in the areas of Safeguards, Materials Control and Accountability (MC&A), Nuclear Materials Management, Waste Management, materials processing, packaging and inspection, and interim/long term storage

  20. Fedora Content Modelling for Improved Services for Research Databases

    Elbæk, Mikael Karstensen; Heller, Alfred; Pedersen, Gert Schmeltz

    A re-implementation of the research database of the Technical University of Denmark, DTU, is based on Fedora. The backbone consists of content models for primary and secondary entities and their relationships, giving flexible and powerful extraction capabilities for interoperability and reporting. By adopting such an abstract data model, the platform enables new and improved services for researchers, librarians and administrators.

  1. Development of the ageing management database of PUSPATI TRIGA reactor

    Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum [Reactor Technology Centre, Malaysian Nuclear Agency, MOSTI, Bangi, 43000 Kajang, Selangor (Malaysia); Maskin, Mazleha [Science Program, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, Selangor (Malaysia); Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal [Faculty of Petroleum and Renewable Energy Engineering, Universiti Teknologi Malaysia (Malaysia)

    2016-01-22

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have been seen to be prominent issues. In addressing the ageing issues, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all RTP major Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  2. Atlantic Canada's energy research and development website and database

    2005-01-01

    Petroleum Research Atlantic Canada maintains a website devoted to energy research and development in Atlantic Canada. The site can be viewed on the world wide web at www.energyresearch.ca. It includes a searchable database with information about researchers in Nova Scotia, their projects and published materials on issues related to hydrocarbons, alternative energy technologies, energy efficiency, climate change, environmental impacts and policy. The website also includes links to research funding agencies, external related databases and related energy organizations around the world. Nova Scotia-based users are invited to submit their academic, private or public research to the site. All new information is reviewed and processed by a site administrator before it is uploaded into the database. Users are asked to identify their areas of interest according to the following research categories: alternative or renewable energy technologies; climate change; coal; computer applications; economics; energy efficiency; environmental impacts; geology; geomatics; geophysics; health and safety; human factors; hydrocarbons; meteorology and oceanology (metocean) activities; petroleum operations in deep and shallow waters; policy; and power generation and supply. The database can be searched in 5 ways: by topic, researcher, publication, project or funding agency. refs., tabs., figs

  3. PACSY, a relational database management system for protein structure and chemical shift analysis.

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
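
    The sketch below illustrates the kind of cross-source query such an integrated schema enables; the table and column names are assumptions for illustration, not the published PACSY schema, and sqlite3 stands in for the MySQL or PostgreSQL server a real installation would use.

```python
# Hypothetical illustration of a combined structure/chemical-shift query.
# Table and column names are assumed, not PACSY's actual schema.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE coordinates     (entry_id TEXT, atom_name TEXT, x REAL, y REAL, z REAL);
CREATE TABLE chemical_shifts (entry_id TEXT, atom_name TEXT, shift_ppm REAL);
""")
db.execute("INSERT INTO coordinates VALUES ('1ABC', 'CA', 12.1, 3.4, -7.8)")
db.execute("INSERT INTO chemical_shifts VALUES ('1ABC', 'CA', 56.3)")

# Combine structural coordinates with chemical shifts for one entry,
# the kind of cross-database question an integrated schema makes easy.
rows = db.execute("""
    SELECT c.atom_name, c.x, c.y, c.z, s.shift_ppm
    FROM coordinates c
    JOIN chemical_shifts s ON s.entry_id = c.entry_id AND s.atom_name = c.atom_name
    WHERE c.entry_id = ?
""", ("1ABC",)).fetchall()
print(rows)
```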

  4. PACSY, a relational database management system for protein structure and chemical shift analysis

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States); Yu, Wookyung [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Kim, Suhkmann [Pusan National University, Department of Chemistry and Chemistry Institute for Functional Materials (Korea, Republic of); Chang, Iksoo [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Lee, Weontae, E-mail: wlee@spin.yonsei.ac.kr [Yonsei University, Structural Biochemistry and Molecular Biophysics Laboratory, Department of Biochemistry (Korea, Republic of); Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States)

    2012-10-15

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  5. PACSY, a relational database management system for protein structure and chemical shift analysis

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636

  6. PACSY, a relational database management system for protein structure and chemical shift analysis

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L.

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  7. Development of the severe accident risk information database management system SARD

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies

  8. Development of the severe accident risk information database management system SARD

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies.

  9. Fusion research and technology records in INIS database

    Hillebrand, C.D.

    1998-01-01

    This article is a summary of a survey study, "A survey on publications in Fusion Research and Technology. Science and Technology Indicators in Fusion R and T", by the same author on Fusion R and T records in the International Nuclear Information System (INIS) bibliographic database. In that study, for the first time, all scientometric and bibliometric information contained in a bibliographic database, using INIS records, is analyzed and quantified, specific to a selected field of science and technology. A variety of new science and technology indicators which can be used for evaluating research and development activities is also presented in that study.

  10. National Levee Database: monitoring, vulnerability assessment and management in Italy

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    A properly designed and constructed levee system can often be an effective device for repelling floodwaters and providing barriers against inundation to protect urbanized and industrial areas. However, the delineation of flooding-prone areas and the related hydraulic hazard mapping taking account of uncertainty (Apel et al., 2008) are usually developed with scarce consideration of the possible occurrence of levee failures along river channels (Mazzoleni et al., 2014). Indeed, it is well known that flooding is frequently the result of levee failures, which can be triggered by several factors: (1) overtopping, (2) scouring of the foundation, (3) seepage/piping of the levee body/foundation, and (4) sliding of the foundation. Among these failure mechanisms, which are influenced by the levee's geometrical configuration, hydraulic conditions (e.g. river level and seepage), and material properties (e.g. permeability, cohesion, porosity, compaction), piping caused by seepage (ICOLD, http://www.icold-cigb.org) is considered one of the most dominant (Colleselli F., 1994; Wallingford H. R., 2003). The difficulty of estimating the hydraulic parameters needed to properly describe the seepage line within the body and foundation of the levee implies that the study of critical flood wave routing is typically carried out by assuming that the levee system is undamaged during the flood event. In this context, implementing and making operational a National Levee Database (NLD), effectively structured and continuously updated, becomes fundamental for having a searchable inventory of information about levees available as a key resource supporting decisions and actions affecting levee safety. The ItaliaN LEvee Database (INLED) has been recently developed by the Research Institute for Geo-Hydrological Protection (IRPI) for the Civil Protection Department of the Presidency of the Council of Ministers. INLED has the main focus of collecting comprehensive information about

  11. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    Freeman, Carla; And Others

    In order to understand how the database software or online databases functioned in the overall curricula, the use of database management systems (DBMS) was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  12. Database to manage personal dosimetry Hospital Universitario de La Ribera

    Melchor, M.; Martinez, D.; Asensio, M.; Candela, F.; Camara, A.

    2011-01-01

    For the management of dosimetry for occupationally exposed personnel, data on the use and return of dosimeters are required. The Department of Radiophysics and Radiation Protection has designed and implemented a database to manage staff dosimetry for the Hospital and the Area Health Centers. The specific objectives were to easily import data from the National Dosimetry Centre, to consult dosimetry records in a simple way, to allow rotating dosimeters to be handled, and to obtain reports for different periods of time showing the return data for users, services, etc.

  13. Heterogeneous Biomedical Database Integration Using a Hybrid Strategy: A p53 Cancer Research Database

    Vadim Y. Bichutskiy

    2006-01-01

    Complex problems in life science research give rise to multidisciplinary collaboration, and hence, to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and a small drug-like molecule with the ability to restore native function to cancerous p53 mutants is a long-held medical goal of cancer treatment. The Cancer Research DataBase (CRDB) was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable at http://www.igb.uci.edu/research/research.html.)
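
    A schematic illustration of the hybrid strategy is sketched below: some sources are warehoused locally while others are wrapped by mediators queried on demand. All names and sources in the sketch are invented; this is not the CRDB design itself.

```python
# Schematic illustration of hybrid integration (invented names and sources):
# frequently used assay results are warehoused locally, while slower or external
# sources are wrapped by mediators that are queried only at request time.
import sqlite3

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE assay (molecule_id TEXT, mutant TEXT, activity REAL)")
warehouse.execute("INSERT INTO assay VALUES ('mol-001', 'R175H', 0.42)")

def docking_mediator(molecule_id):
    """Stand-in for a wrapper that would query an external docking-results source."""
    return {"mol-001": {"score": -8.7}}.get(molecule_id, {})

def integrated_view(molecule_id):
    """Combine warehoused assay data with mediated docking data at query time."""
    rows = warehouse.execute(
        "SELECT mutant, activity FROM assay WHERE molecule_id = ?", (molecule_id,)
    ).fetchall()
    return {"assays": rows, "docking": docking_mediator(molecule_id)}

print(integrated_view("mol-001"))
```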

  14. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model, based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of biological information and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through more profound analyses of the characteristics, pathogenesis and other core issues of the disease. Cancer clinical databases are important for promoting the development of precision medicine, so close attention must be paid to their construction and management. The clinical database of the Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. In order to ensure the good quality of the database, its design and management should follow a strict standard operation procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data. The construction and management of clinical databases must also be strengthened and innovated.

  15. Use of Knowledge Bases in Education of Database Management

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that aid in teaching database management. You can follow the order of the course from the beginning, when some topics first appear and are raised in elementary school, through the topics covered in secondary…

  16. CPU and cache efficient management of memory-resident databases

    Pirk, H.; Funke, F.; Grund, M.; Neumann, T.; Leser, U.; Manegold, S.; Kemper, A.; Kersten, M.L.

    2013-01-01

    Memory-Resident Database Management Systems (MRDBMS) have to be optimized for two resources: CPU cycles and memory bandwidth. To optimize for bandwidth in mixed OLTP/OLAP scenarios, the hybrid or Partially Decomposed Storage Model (PDSM) has been proposed. However, in current implementations,

  17. CPU and Cache Efficient Management of Memory-Resident Databases

    H. Pirk (Holger); F. Funke; M. Grund; T. Neumann (Thomas); U. Leser; S. Manegold (Stefan); A. Kemper (Alfons); M.L. Kersten (Martin)

    2013-01-01

    Memory-Resident Database Management Systems (MRDBMS) have to be optimized for two resources: CPU cycles and memory bandwidth. To optimize for bandwidth in mixed OLTP/OLAP scenarios, the hybrid or Partially Decomposed Storage Model (PDSM) has been proposed. However, in current

  18. Selecting a Relational Database Management System for Library Automation Systems.

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  19. Benefits of a relational database for computerized management

    Shepherd, W.W.

    1991-01-01

    This paper reports on a computerized relational database that is the basis for a hazardous materials information management system which is comprehensive, effective, flexible and efficient. The system includes product information for Material Safety Data Sheets (MSDSs), labels, shipping, and the environment, and is used in Dowell Schlumberger (DS) operations worldwide for a number of programs including planning, training, emergency response and regulatory compliance

  20. Gas Hydrate Research Database and Web Dissemination Channel

    Micheal Frenkel; Kenneth Kroenlein; V Diky; R.D. Chirico; A. Kazakow; C.D. Muzny; M. Frenkel

    2009-09-30

    To facilitate advances in the application of technologies pertaining to gas hydrates, a United States database containing experimentally-derived information about those materials was developed. The Clathrate Hydrate Physical Property Database (NIST Standard Reference Database #156) was developed by the TRC Group at NIST in Boulder, Colorado, paralleling a highly successful database of thermodynamic properties of pure molecular compounds and their mixtures, and in association with an international effort on the part of CODATA to aid in international data sharing. Development and population of this database relied on the development of three components of information-processing infrastructure: (1) guided data capture (GDC) software designed to convert data and metadata into a well-organized, electronic format, (2) a relational data storage facility to accommodate all types of numerical and metadata within the scope of the project, and (3) a gas hydrate markup language (GHML) developed to standardize data communications between 'data producers' and 'data users'. Having developed the appropriate data storage and communication technologies, a web-based interface for both the new Clathrate Hydrate Physical Property Database and the Scientific Results from the Mallik 2002 Gas Hydrate Production Research Well Program was developed and deployed at http://gashydrates.nist.gov.
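
    To give a flavour of the markup-language component, the sketch below emits a small XML record of the sort a markup language like GHML standardizes; the element names are invented for illustration, since the real GHML schema defines its own.

```python
# Sketch of the data-exchange idea behind a hydrate markup language.
# Element and attribute names are invented placeholders, not real GHML tags.
import xml.etree.ElementTree as ET

def hydrate_record(compound, temperature_k, pressure_mpa):
    root = ET.Element("hydrateDataSet")
    sample = ET.SubElement(root, "sample")
    ET.SubElement(sample, "compound").text = compound
    point = ET.SubElement(root, "equilibriumPoint")
    ET.SubElement(point, "temperature", unit="K").text = str(temperature_k)
    ET.SubElement(point, "pressure", unit="MPa").text = str(pressure_mpa)
    return ET.tostring(root, encoding="unicode")

print(hydrate_record("methane", 278.15, 4.3))
```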

  1. Use of a Relational Database to Support Clinical Research: Application in a Diabetes Program

    Lomatch, Diane; Truax, Terry; Savage, Peter

    1981-01-01

    A database has been established to support conduct of clinical research and monitor delivery of medical care for 1200 diabetic patients as part of the Michigan Diabetes Research and Training Center (MDRTC). Use of an intelligent microcomputer to enter and retrieve the data and use of a relational database management system (DBMS) to store and manage data have provided a flexible, efficient method of achieving both support of small projects and monitoring overall activity of the Diabetes Center Unit (DCU). Simplicity of access to data, efficiency in providing data for unanticipated requests, ease of manipulation of relations, security and "logical data independence" were important factors in choosing a relational DBMS. The ability to interface with an interactive statistical program and a graphics program is a major advantage of this system. Our database currently provides support for the operation and analysis of several ongoing research projects.

  2. A database system for enhancing fuel records management capabilities

    Rieke, Phil; Razvi, Junaid

    1994-01-01

    The need to modernize the system of managing a large variety of fuel related data at the TRIGA Reactors Facility at General Atomics, as well as the need to improve NRC nuclear material reporting requirements, prompted the development of a database to cover all aspects of fuel records management. The TRIGA Fuel Database replaces (a) an index card system used for recording fuel movements, (b) hand calculations for uranium burnup, and (c) a somewhat aged and cumbersome system of recording fuel inspection results. It was developed using Microsoft Access, a relational database system for Windows. Instead of relying on various sources for element information, users may now review individual element statistics, record inspection results, calculate element burnup and more, all from within a single application. Taking full advantage of the ease-of-use features designed into Windows and Access, the user can enter and extract information easily through a number of customized on-screen forms, with a wide variety of reporting options available. All forms are accessed through a main 'Options' screen, with the options broken down by categories, including 'Elements', 'Special Elements/Devices', 'Control Rods' and 'Areas'. Relational integrity and data validation rules are enforced to assist in ensuring that accurate and meaningful data is entered. Among other items, the database lets the user define: element types (such as FLIP or standard) and subtypes (such as fuel follower, instrumented, etc.), various inspection codes for standardizing inspection results, areas within the facility where elements are located, and the power factors associated with element positions within a reactor. Using fuel moves, power history, power factors and element types, the database tracks uranium burnup and plutonium buildup on a quarterly basis. The Fuel Database was designed with end-users in mind and does not force an operations-oriented user to learn any programming or relational database theory in
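
    A hedged sketch of the quarterly burnup bookkeeping described above follows; the constant and the weighting formula are illustrative assumptions, not the facility's actual burnup model.

```python
# Hedged sketch of quarterly burnup tracking: consumption is accumulated from
# reactor power history weighted by the power factor of the core position the
# element occupied during each interval. Constant and formula are assumptions.
GRAMS_U235_PER_MWD = 1.24  # assumed nominal consumption per megawatt-day

def element_burnup(intervals, power_factors):
    """intervals: (core_position, megawatt_days) pairs an element saw this quarter,
    derived from recorded fuel moves and reactor power history."""
    mwd_seen = sum(mwd * power_factors.get(pos, 0.0) for pos, mwd in intervals)
    return mwd_seen * GRAMS_U235_PER_MWD

history = [("B-4", 12.5), ("C-7", 9.0)]   # from fuel moves + power history
factors = {"B-4": 1.10, "C-7": 0.87}      # per-position power factors
print(f"U-235 consumed this quarter: {element_burnup(history, factors):.2f} g")
```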

  3. DANBIO-powerful research database and electronic patient record

    Hetland, Merete Lund

    2011-01-01

    …an overview of the research outcome and presents the cohorts of RA patients. The registry, which is approved as a national quality registry, includes patients with RA, PsA and AS, who are followed longitudinally. Data are captured electronically from the source (patients and health personnel). The IT platform … as an electronic patient 'chronicle' in routine care, and at the same time provides a powerful research database.

  4. Databases

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  5. MonetDB: Two Decades of Research in Column-oriented Database Architectures

    Idreos, Stratos; Groffen, Fabian; Nes, Niels; Manegold, Stefan; Mullender, Sjoerd; Kersten, Martin

    2012-01-01

    MonetDB is a state-of-the-art open-source column-store database management system targeting applications in need of analytics over large collections of data. MonetDB is actively used nowadays in health care, in telecommunications as well as in scientific databases and in data management research, accumulating on average more than 10,000 downloads on a monthly basis. This paper gives a brief overview of the MonetDB technology as it developed over the past two decades and the main r...

  6. Database And Interface Modifications: Change Management Without Affecting The Clients

    Peryt, M; Martin Marquez, M; Zaharieva, Z

    2011-01-01

    The first Oracle®-based Controls Configuration Database (CCDB) was developed in 1986, by which the controls system of CERN’s Proton Synchrotron became data-driven. Since then, this mission-critical system has evolved tremendously going through several generational changes in terms of the increasing complexity of the control system, software technologies and data models. Today, the CCDB covers the whole CERN accelerator complex and satisfies a much wider range of functional requirements. Despite its online usage, everyday operations of the machines must not be disrupted. This paper describes our approach with respect to dealing with change while ensuring continuity. How do we manage the database schema changes? How do we take advantage of the latest web deployed application development frameworks without alienating the users? How do we minimize impact on the dependent systems connected to databases through various APIs? In this paper we will provide our answers to these questions, and to many more.
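
    One generic way to change a schema without breaking dependent clients is sketched below, using a compatibility view over the new layout; this illustrates the general technique only and is not the CCDB team's actual approach.

```python
# Generic schema-evolution sketch: the old table layout survives as a view over
# the new, normalised tables, so callers written against the old shape keep
# working during the transition. Names are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
-- new, normalised layout
CREATE TABLE device   (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE property (device_id INTEGER REFERENCES device(id), key TEXT, value TEXT);

-- compatibility view reproducing the legacy single-table shape
CREATE VIEW device_legacy AS
SELECT d.name AS device_name, p.value AS address
FROM device d JOIN property p ON p.device_id = d.id AND p.key = 'address';
""")
db.execute("INSERT INTO device VALUES (1, 'PS.QF01')")
db.execute("INSERT INTO property VALUES (1, 'address', '10.1.2.3')")

# An old client query keeps working unchanged:
print(db.execute("SELECT device_name, address FROM device_legacy").fetchall())
```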

  7. Linking international trademark databases to inform IP research and policy

    Petrie, P.

    2016-07-01

    Researchers and policy makers are concerned with many international issues regarding trademarks, such as trademark squatting, cluttering, and dilution. Trademark application data can provide an evidence base to inform government policy regarding these issues, and can also produce quantitative insights into economic trends and brand dynamics. Currently, national trademark databases can provide insight into economic and brand dynamics at the national level, but gaining such insight at an international level is more difficult due to a lack of internationally linked trademark data. We are in the process of building a harmonised international trademark database (the “Patstat of trademarks”), in which equivalent trademarks have been identified across national offices. We have developed a pilot database that incorporates 6.4 million U.S., 1.3 million Australian, and 0.5 million New Zealand trademark applications, spanning over 100 years. The database will be extended to incorporate trademark data from other participating intellectual property (IP) offices as they join the project. Confirmed partners include the United Kingdom, WIPO, and OHIM. We will continue to expand the scope of the project, and intend to include many more IP offices from around the world. In addition to building the pilot database, we have developed a linking algorithm that identifies equivalent trademarks (TMs) across the three jurisdictions. The algorithm can currently be applied to all applications that contain TM text, i.e. around 96% of all applications. In its current state, the algorithm successfully identifies ~97% of equivalent TMs that are known to be linked a priori, as they share an international registration number through the Madrid protocol. When complete, the internationally linked trademark database will be a valuable resource for researchers and policy-makers in fields such as econometrics, intellectual property rights, and brand policy. (Author)
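
    A much-simplified illustration of text-based cross-office linking is given below; the real algorithm is more elaborate, and the normalisation, blocking rule and sample records here are assumptions for illustration.

```python
# Simplified cross-office linking on trademark text: normalise the mark text,
# block on it, and emit candidate links between different offices. Pairs that
# already share a Madrid registration number could then serve as a check set.
import re
from collections import defaultdict

def normalise(text):
    return re.sub(r"[^a-z0-9]", "", text.lower())

def link(applications):
    """applications: dicts with 'office', 'id', 'text'. Returns cross-office pairs."""
    buckets = defaultdict(list)
    for app in applications:
        buckets[normalise(app["text"])].append(app)
    pairs = []
    for apps in buckets.values():
        for a in apps:
            for b in apps:
                if a["office"] < b["office"]:
                    pairs.append((a["id"], b["id"]))
    return pairs

apps = [
    {"office": "AU", "id": "AU-1", "text": "Blue Widget Co."},
    {"office": "US", "id": "US-9", "text": "BLUE WIDGET CO"},
    {"office": "NZ", "id": "NZ-3", "text": "Kiwi Systems"},
]
print(link(apps))   # -> [('AU-1', 'US-9')]
```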

  8. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated, hybrid approach that combines the best practices of the relational model, utilizing SQL queries, with the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework, called NETMARK, is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
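
    The sketch below conveys the schema-less idea in miniature: semi-structured documents are shredded into a generic node table so keyword search can span structure ("context") and text ("content") alike. The table layout is an assumption for illustration, not NETMARK's actual design.

```python
# Conceptual sketch only: XML documents are shredded into a generic node table,
# so records can be searched by keyword without defining a schema per document type.
import sqlite3
import xml.etree.ElementTree as ET

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE node (doc TEXT, path TEXT, content TEXT)")

def shred(doc_name, xml_text):
    def walk(elem, path):
        here = f"{path}/{elem.tag}"
        if elem.text and elem.text.strip():
            db.execute("INSERT INTO node VALUES (?, ?, ?)", (doc_name, here, elem.text.strip()))
        for child in elem:
            walk(child, here)
    walk(ET.fromstring(xml_text), "")

shred("report-1", "<report><title>Thermal test</title><result>PASS</result></report>")

# Keyword search across content and structure alike.
print(db.execute("SELECT doc, path, content FROM node WHERE content LIKE ? OR path LIKE ?",
                 ("%thermal%", "%thermal%")).fetchall())
```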

  9. The use of database management systems in particle physics

    Stevens, P H; Read, B J; Rittenberg, Alan

    1979-01-01

    Examines data-handling needs and problems in particle physics and looks at three very different efforts to resolve these problems, by the Particle Data Group (PDG), the CERN-HERA Group in Geneva, and groups cooperating with ZAED in Germany. The ZAED effort does not use a database management system (DBMS), the CERN-HERA Group uses an existing, limited-capability DBMS, and PDG uses the Berkeley Database Management System (BDMS), which PDG itself designed and implemented with scientific data-handling needs in mind. The range of problems each group tried to resolve was influenced by whether or not a DBMS was available and by what capabilities it had. Only PDG has been able to systematically address all the problems. The authors discuss the BDMS-centered system PDG is now building in some detail. (12 refs).

  10. Implementation of a database for the management of radioactive sources

    MOHAMAD, M.

    2012-01-01

    In Madagascar, the application of nuclear technology continues to develop. In order to protect human health and the environment against the harmful effects of ionizing radiation, each user of radioactive sources has to implement a nuclear safety and security programme and to declare their sources to the Regulatory Authority. This Authority must have access to all the information relating to all the sources and their uses. This work is based on the development of software using Python as the programming language and SQLite as the database. It makes it possible to computerize the management of radioactive sources. This application unifies the various existing databases and centralizes the activities of radioactive source management. The objective is to follow the movement of each source within the Malagasy territory in order to avoid the risks related to the use of radioactive sources and illicit trafficking. [fr]
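
    In the spirit of the tool described (Python with SQLite), a minimal sketch of a source registry with a movement log might look as follows; the schema and fields are assumptions, not the application's actual design.

```python
# Minimal sketch (assumed schema): a SQLite registry of sources plus a movement
# log, so each source's location history on the territory can be reconstructed.
import sqlite3

db = sqlite3.connect("sources.db")
db.executescript("""
CREATE TABLE IF NOT EXISTS source (
    id TEXT PRIMARY KEY, nuclide TEXT, activity_bq REAL, licensee TEXT);
CREATE TABLE IF NOT EXISTS movement (
    source_id TEXT REFERENCES source(id), moved_on TEXT, location TEXT);
""")

def register(source_id, nuclide, activity_bq, licensee):
    db.execute("INSERT OR REPLACE INTO source VALUES (?, ?, ?, ?)",
               (source_id, nuclide, activity_bq, licensee))

def move(source_id, date_iso, location):
    db.execute("INSERT INTO movement VALUES (?, ?, ?)", (source_id, date_iso, location))

register("MG-0001", "Co-60", 3.7e10, "Hospital A")
move("MG-0001", "2012-05-14", "Antananarivo")
db.commit()
print(db.execute("SELECT * FROM movement WHERE source_id = ?", ("MG-0001",)).fetchall())
```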

  11. Use of SQL Databases to Support Human Resource Management

    Zeman, Jan

    2011-01-01

    This bachelor's thesis focuses on the design of an SQL database to support human resource management and its subsequent creation in MS SQL Server.

  12. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    2002-01-01

    …to the OODBMS approach. The ORDBMS approach produced such research prototypes as Postgres [155] and Starburst [67], and commercial products such as…

  13. Software configuration management plan for the Hanford site technical database

    GRAVES, N.J.

    1999-01-01

    The Hanford Site Technical Database (HSTD) is used as the repository/source for the technical requirements baseline and programmatic data input via the Hanford Site and major Hanford Project Systems Engineering (SE) activities. The Hanford Site SE effort has created an integrated technical baseline for the Hanford Site that supports SE processes at the Site and project levels, which is captured in the HSTD. The HSTD has been implemented in the Ascent Logic Corporation (ALC) Commercial Off-The-Shelf (COTS) package referred to as the Requirements Driven Design (RDD) software. This Software Configuration Management Plan (SCMP) provides a process and means to control and manage software upgrades to the HSTD system

  14. A Support Database System for Integrated System Health Management (ISHM)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between
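
    As an illustration of a system hierarchy model of the kind mentioned above, the sketch below stores components in an adjacency list and walks a sensor's containment chain; the table and names are assumptions for illustration, not the HADS design.

```python
# Illustrative adjacency-list hierarchy: each component points at its parent so
# physical containment can be traversed for health-assessment roll-ups.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE component (id INTEGER PRIMARY KEY, name TEXT, parent_id INTEGER)")
db.executemany("INSERT INTO component VALUES (?, ?, ?)", [
    (1, "test stand", None),
    (2, "LOX feed system", 1),
    (3, "pressure transducer PT-101", 2),
])

# Recursive query: full path from a sensor up to the root of the hierarchy.
rows = db.execute("""
WITH RECURSIVE chain(id, name, parent_id) AS (
    SELECT id, name, parent_id FROM component WHERE name = 'pressure transducer PT-101'
    UNION ALL
    SELECT c.id, c.name, c.parent_id FROM component c JOIN chain ON c.id = chain.parent_id
)
SELECT name FROM chain
""").fetchall()
print(" -> ".join(reversed([r[0] for r in rows])))
```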

  15. African Journal of Management Research

    Topics and themes appropriate for African Journal of Management Research will ... of management and organisational disciplines including: Finance, Operations, ... Marketing Services, Public Administration, Health Services Management, and ...

  16. Computerized database management system for breast cancer patients.

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. My Structured Query Language (MySQL) is selected as the application for the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in this system to assist the data analysis. The results are plotted automatically and a user-friendly graphical user interface is developed that can control the MySQL database. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian. The peak age for breast cancer incidence is from 50 to 59 years old. Results suggest that the chance of developing breast cancer is increased in older women, and reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
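
    A minimal sketch of the kind of risk-factor tally such a system performs, assuming a simple patients table; sqlite3 stands in for MySQL so the example is self-contained, and the table and column names are invented for illustration rather than taken from the published system:

      import sqlite3

      # In-memory stand-in for the MySQL patient database described above.
      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE patients (
              patient_id    INTEGER PRIMARY KEY,
              age           INTEGER,
              ethnicity     TEXT,
              breastfeeding INTEGER   -- 1 = yes, 0 = no
          )
      """)
      conn.executemany(
          "INSERT INTO patients (age, ethnicity, breastfeeding) VALUES (?, ?, ?)",
          [(55, "Malay", 0), (62, "Chinese", 1), (48, "Indian", 1), (53, "Malay", 0)],
      )

      # Incidence counts by ethnicity and ten-year age band, in the spirit of the
      # automatic calculation tool embedded in the reported system.
      rows = conn.execute("""
          SELECT ethnicity, (age / 10) * 10 AS age_band, COUNT(*) AS cases
          FROM patients
          GROUP BY ethnicity, age_band
          ORDER BY cases DESC
      """).fetchall()

      for ethnicity, age_band, cases in rows:
          print(f"{ethnicity:8s} {age_band}-{age_band + 9}: {cases} case(s)")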

  17. Development of a combined database for meta-epidemiological research

    Savović, Jelena; Harris, Ross J; Wood, Lesley

    2010-01-01

    Collections of meta-analyses assembled in meta-epidemiological studies are used to study associations of trial characteristics with intervention effect estimates. However, methods and findings are not consistent across studies. To combine data from 10 meta-epidemiological studies into a single... Unique identifiers were assigned to each reference and used to identify duplicate trials. Sets of meta-analyses with overlapping trials were identified and duplicates removed. Overlapping trials were used to examine agreement between assessments of trial characteristics. The combined database will be used to examine the combined evidence on sources of bias in randomized controlled trials. The strategy used to remove overlap between meta-analyses may be of use for future empirical research. Copyright © 2010 John Wiley & Sons, Ltd.

  18. Research Groups & Research Subjects - RED | LSDB Archive [Life Science Database Archive metadata

    Full Text Available Research Groups & Research Subjects: data detail page of the RED database (LSDB Archive). Number of data entries: 174. Data items include Research ID (subject number), Institute, Organization, Section (Department), User name, and Experimental title.

  19. The development of technical database of advanced spent fuel management process

    Ro, Seung Gy; Byeon, Kee Hoh; Song, Dae Yong; Park, Seong Won; Shin, Young Jun

    1999-03-01

    The purpose of this study is to develop a technical database system to provide useful information to researchers who study the back-end nuclear fuel cycle. A technical database of the advanced spent fuel management process was developed as a prototype system in 1997. In 1998, this database system was extended to a multi-user system, and a special database composed of thermochemical formation data and reaction data was appended. In this report, the detailed specification of the system design is described and the operating methods are illustrated as a user's manual. The report is also a useful reference for expanding the current system or interfacing it with other systems. (Author). 10 refs., 18 tabs., 46 figs.

  1. Managing research data

    2012-01-01

    Data management has become an essential requirement for information professionals. This title defines what is required to achieve a culture of effective data management, offering advice on the skills required, legal and contractual obligations, strategies and management plans, and the data management infrastructure of specialists and services.

  2. Landslide databases for applied landslide impact research: the example of the landslide database for the Federal Republic of Germany

    Damm, Bodo; Klose, Martin

    2014-05-01

    This contribution presents an initiative to develop a national landslide database for the Federal Republic of Germany. It highlights the structure and contents of the landslide database and outlines its major data sources and the strategy of information retrieval. Furthermore, the contribution exemplifies the database's potential in applied landslide impact research, including statistics of landslide damage, repair, and mitigation. Thanks to systematic regional data compilation, the landslide database offers a differentiated data pool of more than 5,000 data sets and over 13,000 single data files. It dates back to 1137 AD and covers landslide sites throughout Germany. In seven main data blocks, the landslide database stores information on landslide types, dimensions, and processes, together with additional data on soil and bedrock properties, geomorphometry, and climatic or other major triggering events. A peculiarity of this landslide database is its storage of data sets on land use effects, damage impacts, hazard mitigation, and landslide costs. Compilation of landslide data is based on a two-tier strategy of data collection. The first step of information retrieval includes systematic web content mining and exploration of online archives of emergency agencies, fire and police departments, and news organizations. Using web and RSS feeds, and soon also a focused web crawler, this enables effective nationwide data collection for recent landslides. On the basis of this information, in-depth data mining is performed to deepen and diversify the data pool in key landslide areas. This makes it possible to gather detailed landslide information from, amongst others, agency records, geotechnical reports, climate statistics, maps, and satellite imagery. Landslide data are extracted from these information sources using a mix of methods, including statistical techniques, imagery analysis, and qualitative text interpretation. The landslide database is currently being migrated to a spatial database system
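
    A rough sketch of how one record spanning the seven main data blocks could be represented, written here as a Python dataclass; the field names are assumptions made for illustration and do not reproduce the actual database schema:

      from dataclasses import dataclass, field
      from typing import Optional

      @dataclass
      class LandslideRecord:
          # Block 1: event core data
          event_date: str
          landslide_type: str                 # e.g. "rockfall", "debris flow"
          dimensions_m: Optional[float] = None
          # Block 2: soil and bedrock properties
          bedrock: Optional[str] = None
          soil_properties: Optional[str] = None
          # Block 3: geomorphometry
          slope_deg: Optional[float] = None
          # Block 4: climatic or other triggering events
          trigger: Optional[str] = None       # e.g. "heavy rainfall"
          # Block 5: land use effects
          land_use: Optional[str] = None
          # Block 6: damage impacts and hazard mitigation
          damage_description: Optional[str] = None
          mitigation: Optional[str] = None
          # Block 7: landslide costs
          cost_eur: Optional[float] = None
          sources: list = field(default_factory=list)   # agency records, reports, news items

      record = LandslideRecord(
          event_date="2002-08-12",
          landslide_type="debris flow",
          trigger="heavy rainfall",
          damage_description="road embankment destroyed",
          cost_eur=250_000.0,
          sources=["news article", "geotechnical report"],
      )
      print(record.landslide_type, record.cost_eur)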

  3. Research on Construction of Road Network Database Based on Video Retrieval Technology

    Wang Fengling

    2017-01-01

    Full Text Available Based on the characteristics of video databases, the basic structure of a video database, and several typical video data models, a segmentation-based multi-level data model is used to describe the landscape information video database, the network database model, and the road network management database system, and the detailed design and implementation of the landscape information management system are prepared.

  4. Management Guidelines for Database Developers' Teams in Software Development Projects

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a special team (DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during the DBDs' work. Therefore, in this paper we have developed a set of management guidelines, which includes 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be very useful for other companies that use a DBD team and could contribute to increasing the efficiency of these teams in their work on software development projects.

  5. THE EVOLUTION OF RISK MANAGEMENT RESEARCH: CHANGES IN KNOWLEDGE MAPS

    Iwona Gorzeń-Mitka

    2017-01-01

    One of the leading trends in modern academic research is risk management. Over the years, the approach to risk management has changed and affected many different areas. This study aims to investigate changes in risk management and trends of risk management research in the past 20 years. Risk management related publications from 1990 to 2016 were retrieved from the Web of Science and Scopus databases. VOSviewer software was used to analyse the research trend. Literature growth related to risk manageme...

  6. Protocol for developing a Database of Zoonotic disease Research in India (DoZooRI).

    Chatterjee, Pranab; Bhaumik, Soumyadeep; Chauhan, Abhimanyu Singh; Kakkar, Manish

    2017-12-10

    Zoonotic and emerging infectious diseases (EIDs) represent a public health threat that has been acknowledged only recently although they have been on the rise for the past several decades. On average, every year since the Second World War, one pathogen has emerged or re-emerged on a global scale. Low/middle-income countries such as India bear a significant burden of zoonotic and EIDs. We propose that the creation of a database of published, peer-reviewed research will open up avenues for evidence-based policymaking for targeted prevention and control of zoonoses. A large-scale systematic mapping of the published peer-reviewed research conducted in India will be undertaken. All published research will be included in the database, without any prejudice for quality screening, to broaden the scope of included studies. Structured search strategies will be developed for priority zoonotic diseases (leptospirosis, rabies, anthrax, brucellosis, cysticercosis, salmonellosis, bovine tuberculosis, Japanese encephalitis and rickettsial infections), and multiple databases will be searched for studies conducted in India. The database will be managed and hosted on a cloud-based platform called Rayyan. Individual studies will be tagged based on key preidentified parameters (disease, study design, study type, location, randomisation status and interventions, host involvement and others, as applicable). The database will incorporate already published studies, obviating the need for additional ethical clearances. The database will be made available online, and in collaboration with multisectoral teams, domains of enquiry will be identified and subsequent research questions will be raised. The database will be queried for these and the resulting evidence will be analysed and published in peer-reviewed journals. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise

  7. The FoodCast Research Image Database (FRIDa

    Francesco Foroni

    2013-03-01

    Full Text Available In recent years we have witnessed an increasing interest in food processing and eating behaviors. This is probably due to several reasons: the biological relevance of food choices, the complexity of the food-rich environment in which we presently live (making food-intake regulation difficult), and the increasing health care cost due to illness associated with food (food hazards, food contamination, and aberrant food-intake). Despite the importance of the issues and the relevance of this research, comprehensive and validated databases of stimuli are rather limited, outdated, or not available for noncommercial purposes to independent researchers who aim at developing their own research program. The FoodCast Research Image Database (FRIDa) we present here comprises 877 images from eight different categories: natural-food (e.g., strawberry), transformed-food (e.g., French fries), rotten-food (e.g., moldy banana), natural-nonfood items (e.g., pinecone), artificial food-related objects (e.g., teacup), artificial objects (e.g., guitar), animals (e.g., camel), and scenes (e.g., airport). FRIDa has been validated on a sample of healthy participants (N=73) on standard variables (e.g., valence, familiarity, etc.) as well as on other variables specifically related to food items (e.g., perceived calorie content); it also includes data on the visual features of the stimuli (e.g., brightness, high-frequency power, etc.). FRIDa is a well-controlled, flexible, validated, and freely available (http://foodcast.sissa.it/neuroscience/) tool for researchers in a wide range of academic fields and industry.

  8. Optimized Database of Higher Education Management Using Data Warehouse

    Spits Warnars

    2010-04-01

    Full Text Available The emergence of new higher education institutions has created competition in the higher education market, and a data warehouse can be used as an effective technology tool for increasing competitiveness in that market. A data warehouse produces reliable reports for the institution's high-level management in a short time, enabling faster and better decision making, not only on increasing the number of admitted students but also on the possibility of finding extraordinary, unconventional funds for the institution. The efficiency comparison was based on the length and number of processed records, total processed bytes, number of processed tables, time to run a query, and records produced on the OLTP database and the data warehouse. Efficiency percentages were measured with the percentage-increase formula, and the average efficiency percentage of 461,801.04% shows that using a data warehouse is more powerful and efficient than using an OLTP database. The data warehouse was modeled as a hypercube built from a limited set of high-demand reports usually used by high-level management. Fields representing the loading constructive merge are inserted into every fact and dimension table, where the ETL (Extraction, Transformation and Loading) process is run based on the old and new files.
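
    The exact efficiency formula is not given in the abstract; a hedged sketch of the standard percentage-increase calculation it appears to describe, with invented sample timings, is:

      # Minimal sketch of the percentage-increase comparison described above,
      # assuming the standard formula (old - new) / new * 100. The sample
      # timings are invented for illustration only.
      def efficiency_gain_percent(oltp_value: float, warehouse_value: float) -> float:
          """Percentage improvement of the data warehouse over the OLTP database."""
          return (oltp_value - warehouse_value) / warehouse_value * 100.0

      # e.g. query run time in seconds for the same management report
      oltp_runtime = 120.0
      warehouse_runtime = 1.5

      print(f"Efficiency gain: {efficiency_gain_percent(oltp_runtime, warehouse_runtime):.2f}%")
      # -> Efficiency gain: 7900.00%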

  9. Influenza research database: an integrated bioinformatics resource for influenza virus research

    The Influenza Research Database (IRD) is a U.S. National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Bioinformatics Resource Center dedicated to providing bioinformatics support for influenza virus research. IRD facilitates the research and development of vaccines, diagnostics, an...

  10. Privacy protection and public goods: building a genetic database for health research in Newfoundland and Labrador.

    Kosseim, Patricia; Pullman, Daryl; Perrot-Daley, Astrid; Hodgkinson, Kathy; Street, Catherine; Rahman, Proton

    2013-01-01

    To provide a legal and ethical analysis of some of the implementation challenges faced by the Population Therapeutics Research Group (PTRG) at Memorial University (Canada), in using genealogical information offered by individuals for its genetics research database. This paper describes the unique historical and genetic characteristics of the Newfoundland and Labrador founder population, which gave rise to the opportunity for PTRG to build the Newfoundland Genealogy Database containing digitized records of all pre-confederation (1949) census records of the Newfoundland founder population. In addition to building the database, PTRG has developed the Heritability Analytics Infrastructure, a data management structure that stores genotype, phenotype, and pedigree information in a single database, and custom linkage software (KINNECT) to perform pedigree linkages on the genealogy database. A newly adopted legal regimen in Newfoundland and Labrador is discussed. It incorporates health privacy legislation with a unique research ethics statute governing the composition and activities of research ethics boards and, for the first time in Canada, elevating the status of national research ethics guidelines into law. The discussion looks at this integration of legal and ethical principles which provides a flexible and seamless framework for balancing the privacy rights and welfare interests of individuals, families, and larger societies in the creation and use of research data infrastructures as public goods. The complementary legal and ethical frameworks that now coexist in Newfoundland and Labrador provide the legislative authority, ethical legitimacy, and practical flexibility needed to find a workable balance between privacy interests and public goods. Such an approach may also be instructive for other jurisdictions as they seek to construct and use biobanks and related research platforms for genetic research.

  11. Privacy protection and public goods: building a genetic database for health research in Newfoundland and Labrador

    Pullman, Daryl; Perrot-Daley, Astrid; Hodgkinson, Kathy; Street, Catherine; Rahman, Proton

    2013-01-01

    Objective To provide a legal and ethical analysis of some of the implementation challenges faced by the Population Therapeutics Research Group (PTRG) at Memorial University (Canada), in using genealogical information offered by individuals for its genetics research database. Materials and methods This paper describes the unique historical and genetic characteristics of the Newfoundland and Labrador founder population, which gave rise to the opportunity for PTRG to build the Newfoundland Genealogy Database containing digitized records of all pre-confederation (1949) census records of the Newfoundland founder population. In addition to building the database, PTRG has developed the Heritability Analytics Infrastructure, a data management structure that stores genotype, phenotype, and pedigree information in a single database, and custom linkage software (KINNECT) to perform pedigree linkages on the genealogy database. Discussion A newly adopted legal regimen in Newfoundland and Labrador is discussed. It incorporates health privacy legislation with a unique research ethics statute governing the composition and activities of research ethics boards and, for the first time in Canada, elevating the status of national research ethics guidelines into law. The discussion looks at this integration of legal and ethical principles which provides a flexible and seamless framework for balancing the privacy rights and welfare interests of individuals, families, and larger societies in the creation and use of research data infrastructures as public goods. Conclusion The complementary legal and ethical frameworks that now coexist in Newfoundland and Labrador provide the legislative authority, ethical legitimacy, and practical flexibility needed to find a workable balance between privacy interests and public goods. Such an approach may also be instructive for other jurisdictions as they seek to construct and use biobanks and related research platforms for genetic research. PMID

  12. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)
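
    A rough relational sketch of the four data modules named in the abstract; sqlite3 stands in here for the MySQL/PHP stack used by PSADB, and all table and column names are hypothetical:

      import sqlite3

      # Illustrative tables for the four modules: technical specifications,
      # operating history, maintenance history, and failure events.
      ddl = """
      CREATE TABLE component (
          component_id   INTEGER PRIMARY KEY,
          facility       TEXT NOT NULL,
          name           TEXT NOT NULL
      );
      CREATE TABLE technical_specification (
          component_id   INTEGER REFERENCES component(component_id),
          parameter      TEXT,
          value          TEXT
      );
      CREATE TABLE operating_history (
          component_id     INTEGER REFERENCES component(component_id),
          period_start     TEXT,
          period_end       TEXT,
          hours_in_service REAL
      );
      CREATE TABLE maintenance_history (
          component_id   INTEGER REFERENCES component(component_id),
          performed_on   TEXT,
          description    TEXT
      );
      CREATE TABLE failure_event (
          component_id   INTEGER REFERENCES component(component_id),
          occurred_on    TEXT,
          failure_mode   TEXT
      );
      """

      conn = sqlite3.connect(":memory:")
      conn.executescript(ddl)
      print([row[0] for row in conn.execute(
          "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")])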

  14. A database system for the management of severe accident risk information, SARD

    Ahn, K. I.; Kim, D. H.

    2003-01-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at the Korea Atomic Energy Research Institute for automatic management and search of severe accident risk information. The main functions of the present database system are implemented by three closely related, but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from the representative severe accident codes (e.g., base case and sensitivity analysis results, and summaries of key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used to support the Level 2 PSA of similar plants and the development of plant-specific severe accident management strategies.

  16. METODE RESET PASSWORD LEVEL ROOT PADA RELATIONAL DATABASE MANAGEMENT SYSTEM (RDBMS MySQL

    Taqwa Hariguna

    2011-08-01

    Full Text Available A database is an important means of storing data; with a database, an organization gains benefits in several respects, such as access speed and reduced paper use. However, when a database is implemented, it is not uncommon for the database administrator to forget the password in use, which complicates database administration. This research aims to explore how to reset the root-level password in the MySQL relational database management system.

  17. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
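
    A toy sketch of an Image Content Group index kept as XML, in the spirit of the set-based cataloging described above; element and attribute names are invented and do not reflect the actual NOVA schema:

      import xml.etree.ElementTree as ET

      # Build a small ICG index: one group that catalogs a set of volumes.
      icg = ET.Element("imageContentGroup", id="icg-0042", title="Thoracic CT teaching set")
      for volume_id, modality in [("vol-001", "CT"), ("vol-002", "MR")]:
          ET.SubElement(icg, "volume", id=volume_id, modality=modality)

      # Serialize the group as plain XML so it can be mirrored in HTML and stay
      # visible to internet search engines.
      ET.indent(icg)                      # pretty-print (Python 3.9+)
      print(ET.tostring(icg, encoding="unicode"))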

  18. Status of research reactor spent fuel world-wide: Database summary

    Ritchie, I.G.

    1996-01-01

    Results compiled in the research reactor spent fuel database are used to assess the status of research reactor spent fuel world-wide. Fuel assemblies, their types, enrichment, origin of enrichment and geographical distribution among the industrialized and developed countries of the world are discussed. Fuel management practices in wet and dry storage facilities and the concerns of reactor operators about long-term storage of their spent fuel are presented, and some of the activities carried out by the International Atomic Energy Agency to address the issues associated with research reactor spent fuel are outlined. (author). 4 refs, 17 figs, 4 tabs

  19. YPED: an integrated bioinformatics suite and database for mass spectrometry-based proteomics research.

    Colangelo, Christopher M; Shifman, Mark; Cheung, Kei-Hoi; Stone, Kathryn L; Carriero, Nicholas J; Gulcicek, Erol E; Lam, TuKiet T; Wu, Terence; Bjornson, Robert D; Bruce, Can; Nairn, Angus C; Rinehart, Jesse; Miller, Perry L; Williams, Kenneth R

    2015-02-01

    We report a significantly enhanced bioinformatics suite and database for proteomics research called the Yale Protein Expression Database (YPED) that is used by investigators at more than 300 institutions worldwide. YPED meets the data management, archival, and analysis needs of high-throughput mass spectrometry-based proteomics research, ranging from a single laboratory, to a group of laboratories within and beyond an institution, to the entire proteomics community. The current version is a significant improvement over the first version in that it contains new modules for liquid chromatography-tandem mass spectrometry (LC-MS/MS) database search results, label and label-free quantitative proteomic analysis, and several scoring outputs for phosphopeptide site localization. In addition, we have added both peptide and protein comparative analysis tools to enable pairwise analysis of distinct peptides/proteins in each sample and of overlapping peptides/proteins between all samples in multiple datasets. We have also implemented a targeted proteomics module for automated multiple reaction monitoring (MRM)/selective reaction monitoring (SRM) assay development. We have linked YPED's database search results and both label-based and label-free fold-change analysis to the Skyline Panorama repository for online spectra visualization. In addition, we have built enhanced functionality to curate peptide identifications into an MS/MS peptide spectral library for all of our protein database search identification results. Copyright © 2015 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  20. Crisis Management Research Summaries

    Brock, Stephen E., Ed.

    2009-01-01

    In this column, Crisis Management in the Schools Interest Group members summarize recent crisis management publications. The first article summarized was a meta-analysis of the risk factors associated with Posttraumatic Stress Disorder (PTSD) among adults. The second study looked at the presence of life stressors among students who were expelled…

  1. Research in Mobile Database Query Optimization and Processing

    Agustinus Borgy Waluyo

    2005-01-01

    Full Text Available The emergence of mobile computing provides the ability to access information at any time and place. However, as mobile computing environments have inherent factors like power, storage, asymmetric communication cost, and bandwidth limitations, efficient query processing and minimum query response time are definitely of great interest. This survey groups a variety of query optimization and processing mechanisms in mobile databases into two main categories, namely: (i) query processing strategy, and (ii) caching management strategy. Query processing includes both pull and push operations (broadcast mechanisms). We further classify push operation into on-demand broadcast and periodic broadcast. Push operation (on-demand broadcast) relates to designing techniques that enable the server to accommodate multiple requests so that the requests can be processed efficiently. Push operation (periodic broadcast) corresponds to data dissemination strategies. In this scheme, several techniques to improve the query performance by broadcasting data to a population of mobile users are described. A caching management strategy defines a number of methods for maintaining cached data items in clients' local storage. This strategy considers critical caching issues such as caching granularity, caching coherence strategy and caching replacement policy. Finally, this survey concludes with several open issues relating to mobile query optimization and processing strategy.
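
    A minimal sketch of one caching replacement policy of the kind surveyed (least recently used) for a mobile client's local storage; the capacity and item names are arbitrary, and real schemes also weigh coherence and granularity:

      from collections import OrderedDict

      class LRUCache:
          """Least-recently-used cache for data items held in a client's local storage."""

          def __init__(self, capacity: int):
              self.capacity = capacity
              self._items = OrderedDict()          # key -> cached value, oldest first

          def get(self, key: str):
              if key not in self._items:
                  return None                      # cache miss: item must be pulled or awaited on broadcast
              self._items.move_to_end(key)         # mark as most recently used
              return self._items[key]

          def put(self, key: str, value) -> None:
              if key in self._items:
                  self._items.move_to_end(key)
              self._items[key] = value
              if len(self._items) > self.capacity:
                  self._items.popitem(last=False)  # evict the least recently used item

      cache = LRUCache(capacity=2)
      cache.put("stock:ACME", "12.3")
      cache.put("weather:KUL", "31C")
      cache.get("stock:ACME")                      # touch, so it survives the next eviction
      cache.put("news:front", "...")               # evicts "weather:KUL"
      print(cache.get("weather:KUL"))              # -> None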

  2. An object-oriented framework for managing cooperating legacy databases

    Balsters, H; de Brock, EO

    2003-01-01

    We describe a general semantic framework for precise specification of so-called database federations. A database federation provides for tight coupling of a collection of heterogeneous legacy databases into a global integrated system. Our approach to database federation is based on the UML/OCL data

  3. Demonstration of SLUMIS: a clinical database and management information system for a multi organ transplant program.

    Kurtz, M.; Bennett, T.; Garvin, P.; Manuel, F.; Williams, M.; Langreder, S.

    1991-01-01

    Because of the rapid evolution of the heart, heart/lung, liver, kidney and kidney/pancreas transplant programs at our institution, and because of a lack of an existing comprehensive database, we were required to develop a computerized management information system capable of supporting both clinical and research requirements of a multifaceted transplant program. SLUMIS (ST. LOUIS UNIVERSITY MULTI-ORGAN INFORMATION SYSTEM) was developed for the following reasons: 1) to comply with the reportin...

  4. Moving to Google Cloud: Renovation of Global Borehole Temperature Database for Climate Research

    Xiong, Y.; Huang, S.

    2013-12-01

    Borehole temperature comprises an independent archive of information on climate change which is complementary to the instrumental and other proxy climate records. With support from the international geothermal community, a global database of borehole temperatures has been constructed for the specific purpose of the study on climate change. Although this database has become an important data source in climate research, there are certain limitations partially because the framework of the existing borehole temperature database was hand-coded some twenty years ago. A database renovation work is now underway to take the advantages of the contemporary online database technologies. The major intended improvements include 1) dynamically linking a borehole site to Google Earth to allow for inspection of site specific geographical information; 2) dynamically linking an original key reference of a given borehole site to Google Scholar to allow for a complete list of related publications; and 3) enabling site selection and data download based on country, coordinate range, and contributor. There appears to be a good match between the enhancement requirements for this database and the functionalities of the newly released Google Fusion Tables application. Google Fusion Tables is a cloud-based service for data management, integration, and visualization. This experimental application can consolidate related online resources such as Google Earth, Google Scholar, and Google Drive for sharing and enriching an online database. It is user friendly, allowing users to apply filters and to further explore the internet for additional information regarding the selected data. The users also have ways to map, to chart, and to calculate on the selected data, and to download just the subset needed. The figure below is a snapshot of the database currently under Google Fusion Tables renovation. We invite contribution and feedback from the geothermal and climate research community to make the

  5. Managing Database Services: An Approach Based in Information Technology Services Availabilty and Continuity Management

    Leonardo Bastos Pontes

    2017-01-01

    Full Text Available This paper is situated in the information technology services management environment, draws on ideas from information technology governance, and proposes a hybrid model to manage the database services of a supplementary health operator, based on the principles of information technology services management. This approach uses fundamental elements of service management guides such as CMMI for Services, COBIT, ISO 20000, ITIL and MPS.BR for Services; it studies Availability and Continuity Management in an integrated way, as most of these guides also do. This work is important because it keeps a good data flow in the database and improves the agility of the systems used by the clinics accredited by the health plan.

  6. Research Note on the Energy Infrastructure Attack Database (EIAD

    Jennifer Giroux

    2013-12-01

    Full Text Available The January 2013 attack on the In Amenas natural gas facility drew international attention. However, this attack is part of a broader pattern of energy infrastructure targeting by non-state actors that spans the globe. Data drawn from the Energy Infrastructure Attack Database (EIAD) show that in the last decade there were, on average, nearly 400 annual attacks carried out by armed non-state actors on energy infrastructure worldwide, a figure that was well under 200 prior to 1999. These data reveal a global picture whereby violent non-state actors target energy infrastructures to air grievances, communicate with governments, impact state economic interests, or capture revenue in the form of hijackings, kidnapping ransoms, or theft. And, for politically motivated groups, such as those engaged in insurgencies, attacking industry assets garners media coverage, serving as a facilitator for international attention. This research note introduces EIAD and positions its utility within various research areas where the targeting of energy infrastructure, or more broadly energy infrastructure vulnerability, has been addressed, either directly or indirectly. We also provide a snapshot of an initial analysis of the data between 1980 and 2011, noting specific temporal and spatial trends, and then conclude with a brief discussion of the contribution of EIAD, highlighting future research trajectories.

  7. EPRI research on accident management

    Oehlberg, R.N.; Chao, J.

    1991-01-01

    The paper discusses Nuclear Regulatory Commission (NRC) efforts regarding severe reactor accident management and Nuclear Management and Resources Council (NUMARC) activities. The Electric Power Research Institute (EPRI) accident management program consists of the two products just mentioned plus one related to severe accident plant status information and the MAAP 4.0 computer code. These are briefly discussed.

  8. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
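
    An illustrative sketch of tying data write permissions to user roles, as the abstract describes; the role names and permission table are assumptions, and the real application delegates authentication and authorization to NAMS:

      # Toy role-to-permission mapping for write access to planning data.
      ROLE_PERMISSIONS = {
          "planner":  {"read", "write"},
          "engineer": {"read", "write"},
          "reviewer": {"read"},
      }

      def can_write(role: str) -> bool:
          """Return True if the given role is allowed to modify planning data."""
          return "write" in ROLE_PERMISSIONS.get(role, set())

      for role in ("planner", "reviewer", "visitor"):
          print(f"{role}: write allowed = {can_write(role)}")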

  9. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. The organization's departments can be located at different places and connected by an intranet environment. In such an environment, the maintenance of database records becomes a complex task that needs to be resolved. In this paper an intranet application is designed an...

  10. The Net Enabled Waste Management Database as an international source of radioactive waste management information

    Csullog, G.W.; Friedrich, V.; Miaw, S.T.W.; Tonkay, D.; Petoe, A.

    2002-01-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an integral part of the IAEA's policies and strategy related to the collection and dissemination of information, both internal to the IAEA in support of its activities and external to the IAEA (publicly available). The paper highlights the NEWMDB's role in relation to the routine reporting of status and trends in radioactive waste management, in assessing the development and implementation of national systems for radioactive waste management, in support of a newly developed indicator of sustainable development for radioactive waste management, in support of reporting requirements for the Joint Convention on the Safety of Spent Fuel Management and on the Safety of Radioactive Waste Management, in support of IAEA activities related to the harmonization of waste management information at the national and international levels and in relation to the management of spent/disused sealed radioactive sources. (author)

  11. Application of database management software to probabilistic risk assessment calculations

    Wyss, G.D.

    1993-01-01

    Probabilistic risk assessment (PRA) calculations require the management and processing of large amounts of information. This data normally falls into two general categories. For example, a commercial nuclear power plant PRA study makes use of plant blueprints and system schematics, formal plant safety analysis reports, incident reports, letters, memos, handwritten notes from plant visits, and even the analyst's "engineering judgment". This information must be documented and cross-referenced in order to properly execute and substantiate the models used in a PRA study. The first category is composed of raw data that is accumulated from equipment testing and operational experiences. These data describe the equipment, its service or testing conditions, its failure mode, and its performance history. The second category is composed of statistical distributions. These distributions can represent probabilities, frequencies, or values of important parameters that are not time-related. Probability and frequency distributions are often obtained by fitting raw data to an appropriate statistical distribution. Database management software is used to store both types of data so that it can be readily queried, manipulated, and archived. This paper provides an overview of the information models used for storing PRA data and illustrates the implementation of these models using examples from current PRA software packages.
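
    A small sketch of the second data category: fitting raw equipment failure data to a statistical distribution. Here an exponential failure model is fitted by maximum likelihood (rate = number of failures divided by total exposure time); the observed times are invented for illustration:

      import math

      # Hypothetical times to failure (hours) accumulated from equipment testing.
      times_to_failure_h = [1200.0, 860.0, 2300.0, 1500.0, 640.0]

      failures = len(times_to_failure_h)
      total_exposure_h = sum(times_to_failure_h)
      failure_rate_per_h = failures / total_exposure_h      # maximum-likelihood estimate

      def prob_survive(t_hours: float) -> float:
          """Probability that a component survives t_hours under the fitted model."""
          return math.exp(-failure_rate_per_h * t_hours)

      print(f"Estimated failure rate: {failure_rate_per_h:.2e} per hour")
      print(f"P(survive 1000 h) = {prob_survive(1000.0):.3f}")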

  12. Advances in probabilistic databases for uncertain information management

    Yan, Li

    2013-01-01

    This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management. It aims to provide a single account of current studies in probabilistic data management. The objective of the book is to provide state-of-the-art information to researchers, practitioners, and graduate students in information technology and intelligent information processing, while at the same time serving information technology professionals faced with non-traditional applications that make the application of conventional approaches difficult or impossible.

  13. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm consisting of four steps. In conclusion, we present an analysis of password encryption in Oracle Database.

  14. Practice databases and their uses in clinical research.

    Tierney, W M; McDonald, C J

    1991-04-01

    A few large clinical information databases have been established within larger medical information systems. Although they are smaller than claims databases, these clinical databases offer several advantages: accurate and timely data, rich clinical detail, and continuous parameters (for example, vital signs and laboratory results). However, the nature of the data vary considerably, which affects the kinds of secondary analyses that can be performed. These databases have been used to investigate clinical epidemiology, risk assessment, post-marketing surveillance of drugs, practice variation, resource use, quality assurance, and decision analysis. In addition, practice databases can be used to identify subjects for prospective studies. Further methodologic developments are necessary to deal with the prevalent problems of missing data and various forms of bias if such databases are to grow and contribute valuable clinical information.

  15. Database on epidemiological survey in high background radiation research

    Zhou Sunyuan; Guo Furong; Liu Yusheng

    1992-01-01

    In order to store and check the data of the health survey in the high background radiation area (HBRA) and control area in Guangdong Province, and to use these data in the future, three databases were set up using the RBASE 5000 database software. (1) HD: the database based on the household registers especially established for the health survey from 1979 to 1986, covering more than 160,000 subjects and 2,200,000 data items. (2) DC: the database based on the registration cards of deaths from cancers and all other diseases during the period 1975-1986, including more than 10,000 cases and 260,000 data items. (3) MCC: the database for the case-control study on mutation-related factors for four kinds of cancers (liver, stomach, lung cancers and leukemia), embracing 626 subjects and close to 90,000 data items. The data in the databases were checked against the original records and compared with the manual analytical results

  16. Academic impact of a public electronic health database: bibliometric analysis of studies using the general practice research database.

    Yu-Chun Chen

    Full Text Available BACKGROUND: Studies that use electronic health databases as research material are becoming popular, but the influence of a single electronic health database has not been well investigated. The United Kingdom's General Practice Research Database (GPRD) is one of the few electronic health databases publicly available to academic researchers. This study analyzed studies that used the GPRD to demonstrate the scientific production and academic impact of a single public health database. METHODOLOGY AND FINDINGS: A total of 749 studies published between 1995 and 2009 with 'General Practice Research Database' as their topic, defined as GPRD studies, were extracted from Web of Science. By the end of 2009, the GPRD had attracted 1251 authors from 22 countries and been used extensively in 749 studies published in 193 journals across 58 study fields. Each GPRD study was cited 2.7 times by successive studies. Moreover, the total number of GPRD studies increased rapidly, and it is expected to reach 1500 by 2015, twice the number accumulated by the end of 2009. Since 17 of the most prolific authors (1.4% of all authors) contributed nearly half (47.9%) of GPRD studies, success in conducting GPRD studies may accumulate. The GPRD was used mainly in, but not limited to, the three study fields of "Pharmacology and Pharmacy", "General and Internal Medicine", and "Public, Environmental and Occupational Health". The UK and United States were the two most active regions of GPRD studies. One-third of GPRD studies were internationally co-authored. CONCLUSIONS: A public electronic health database such as the GPRD will promote scientific production in many ways. Data owners of electronic health databases at a national level should consider how to reduce access barriers and to make data more available for research.

  17. Ageing management database development for PWR NPP steam generator

    Liu Hongyun; Xu Liangjun; Xiong Changhuai; Wang Xianyuan

    2005-01-01

    Steam generator (SG) is one of the key safety-important equipment of an NPP, and it is covered by the NPP aging management program. The Steam Generator Aging Management Database (SGAMDB) is developed to provide the necessary information for SG aging management. RINPO is developing the SGAMDB for domestic NPPs. This system contains information and data about SG design, manufacture, operation and maintenance. The information includes NPP fundamental data, SG design data, SG aging mechanisms, SG operation data, SG ISI data, SG maintenance data and an SG evaluation interface. The system runs on the intranet of Qinshan-1 NPP in B/S mode. It can provide information queries and fundamental analysis for the NPP SG aging team and SG aging researchers. In addition, it provides the necessary information and data for SG aging analysis and evaluation, such as all pressure test processes and tube flaws, and collects the analysis results. (authors)

  18. Construct Measurement in Management Research

    Nielsen, Bo Bernhard

    2014-01-01

    Far too often do management scholars resort to crude and often inappropriate measures of fundamental constructs in their research; an approach which calls into question the interpretation and validity of their findings. Scholars often legitimize poor choices in measurement with a lack of availability... This research note raises important questions about the use of proxies in management research and argues for greater care in operationalizing constructs, with particular attention to matching levels of theory and measurement.

  19. Intra-disciplinary differences in database coverage and the consequences for bibliometric research

    Faber Frandsen, Tove; Nicolaisen, Jeppe

    2008-01-01

    Bibliographic databases (including databases based on open access) are routinely used for bibliometric research. The value of a specific database depends to a large extent on the coverage of the discipline(s) under study. A number of studies have determined the coverage of databases in specific d...... and psychology). The point extends to include both the uneven coverage of specialties and research traditions. The implications for bibliometric research are discussed, and precautions which need to be taken are outlined. ...

  20. Database and applications security integrating information security and data management

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  1. Building the Science of Research Management: What Can Research Management Learn from Education Research?

    Huang, Jun Song; Hung, Wei Loong

    2018-01-01

    Research management is an emerging field of study and its development is significant to the advancement of the research enterprise. Developing the science of research management requires investigating social mechanisms involved in research management. Yet, studies on the social mechanisms of research management are lacking in the literature. To address…

  2. Down syndrome: issues to consider in a national registry, research database and biobank.

    McCabe, Linda L; McCabe, Edward R B

    2011-01-01

    As the quality of life for individuals with Down syndrome continues to improve due to anticipatory healthcare, early intervention, mainstreaming in schools, and increased expectations, the lack of basic information regarding individuals with Down syndrome is being recognized, and the need to facilitate research through a national registry, research database and biobank is being discussed. We believe that there should not be ownership of the samples and information, but instead prefer stewardship of the samples and information to benefit the participants who provided them. We endorse a model with data and sample managers and a research review board to interface between the investigators and participants. Information and samples would be coded, and only a few data managers would know the relationship between the codes and identifying information. Research results once published should be included in an online newsletter. If appropriate, individual results should be shared with participants. A Down syndrome registry, research database and biobank should be accountable to participants, families, medical care providers, government, and funding sources. Copyright © 2011 Elsevier Inc. All rights reserved.
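
    A toy sketch of the coding model described above, in which research records carry only a code while the code-to-identity mapping is held separately for data managers; all names and fields are illustrative:

      import secrets

      identity_map: dict[str, str] = {}      # accessible to data managers only
      research_records: list[dict] = []      # coded data visible to investigators

      def enroll(participant_name: str, phenotype: str) -> str:
          """Assign a random study code and file a coded research record."""
          code = "DS-" + secrets.token_hex(4)
          identity_map[code] = participant_name
          research_records.append({"code": code, "phenotype": phenotype})
          return code

      code = enroll("Jane Doe", "congenital heart defect")
      print(research_records)        # coded data, no direct identifiers
      print(code in identity_map)    # True, but the mapping stays with the data managers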

  3. Knowledge Exchange and Management Research

    Bager, Torben

    2018-01-01

    Purpose: The growing involvement of management researchers in knowledge exchange activities and collaborative research does not seem to be reflected in a growing academic output. The purpose of this paper is to explore barriers for academic output from these activities as well as the potential... derived from knowledge exchange activities and Mode 2 research into academic papers, such as low priority of case study research in leading management journals, a growing practice orientation in the research funding systems, methodological challenges due to limited researcher control, and disincentives... for ‘interesting’ discoveries has a potential to lift off papers with a high level of scientific rigor as well as a high level of relevance for practice. Originality: An outcome focus on the relationship between knowledge exchange activities and management research is to our knowledge new in the debate about...

  4. The Nordic prescription databases as a resource for pharmacoepidemiological research

    Wettermark, B; Zoëga, H; Furu, K

    2013-01-01

    All five Nordic countries have nationwide prescription databases covering all dispensed drugs, with potential for linkage to outcomes. The aim of this review is to present an overview of therapeutic areas studied and methods applied in pharmacoepidemiologic studies using data from these databases....

  5. Reactor core materials research and integrated material database establishment

    Ryu, Woo Seog; Jang, J. S.; Kim, D. W.

    2002-03-01

    Mainly two research areas were covered in this project. One is to establish an integrated database of nuclear materials, and the other is to study the behavior of reactor core materials, which are usually under the most severe conditions in operating plants. During stage I of the project (for three years since 1999), the in- and out-of-reactor properties of stainless steel, the major structural material for the core structures of PWRs (Pressurized Water Reactors), were evaluated, and a specification for nuclear-grade material was established. Damaged core components from domestic power plants, e.g. the orifice of the CVCS and the support pin of the CRGT, were investigated and the causes were revealed. To acquire materials more resistant to the nuclear environment, development of alternative alloys was also conducted. For the establishment of the integrated DB, a task force team was set up, including the director of the nuclear materials technology team, and project leaders and relevant members from each project. The DB is now open to the public through the Internet.

  6. Military Personnel: DOD Has Processes for Operating and Managing Its Sexual Assault Incident Database

    2017-01-01

    MILITARY PERSONNEL: DOD Has Processes for Operating and Managing Its Sexual Assault Incident Database. Report to ... related to DSAID’s system speed and ease of use; interfaces with MCIO databases; utility as a case management tool; and users’ ability to query data and ... What GAO Found: As of October 2013, the Department of Defense’s (DOD) Defense Sexual Assault Incident ...

  7. Academic Impact of a Public Electronic Health Database: Bibliometric Analysis of Studies Using the General Practice Research Database

    Chen, Yu-Chun; Wu, Jau-Ching; Haschler, Ingo; Majeed, Azeem; Chen, Tzeng-Ji; Wetter, Thomas

    2011-01-01

    Background Studies that use electronic health databases as research material are becoming popular, but the influence of a single electronic health database has not yet been well investigated. The United Kingdom's General Practice Research Database (GPRD) is one of the few electronic health databases publicly available to academic researchers. This study analyzed studies that used the GPRD to demonstrate the scientific production and academic impact of a single public health database. Methodology and Findings A total of 749 studies published between 1995 and 2009 with ‘General Practice Research Database’ as their topic, defined as GPRD studies, were extracted from Web of Science. By the end of 2009, the GPRD had attracted 1251 authors from 22 countries and been used extensively in 749 studies published in 193 journals across 58 study fields. Each GPRD study was cited 2.7 times by successive studies. Moreover, the total number of GPRD studies increased rapidly, and it is expected to reach 1500 by 2015, twice the number accumulated till the end of 2009. Since 17 of the most prolific authors (1.4% of all authors) contributed nearly half (47.9%) of GPRD studies, success in conducting GPRD studies may accumulate. The GPRD was used mainly in, but not limited to, the three study fields of “Pharmacology and Pharmacy”, “General and Internal Medicine”, and “Public, Environmental and Occupational Health”. The UK and United States were the two most active regions of GPRD studies. One-third of GPRD studies were internationally co-authored. Conclusions A public electronic health database such as the GPRD will promote scientific production in many ways. Data owners of electronic health databases at a national level should consider how to reduce access barriers and to make data more available for research. PMID:21731733

  8. Human health risk assessment database, "the NHSRC toxicity value database": supporting the risk assessment process at US EPA's National Homeland Security Research Center.

    Moudgal, Chandrika J; Garrahan, Kevin; Brady-Roberts, Eletha; Gavrelis, Naida; Arbogast, Michelle; Dun, Sarah

    2008-11-15

    The toxicity value database of the United States Environmental Protection Agency's (EPA) National Homeland Security Research Center has been in development since 2004. The toxicity value database includes a compilation of agent property, toxicity, dose-response, and health effects data for 96 agents: 84 chemical and radiological agents and 12 biotoxins. The database is populated with multiple toxicity benchmark values and agent property information from secondary sources, with web links to the secondary sources, where available. A selected set of primary literature citations and associated dose-response data are also included. The toxicity value database offers a powerful means to quickly and efficiently gather pertinent toxicity and dose-response data for a number of agents that are of concern to the nation's security. This database, in conjunction with other tools, will play an important role in understanding human health risks, and will provide a means for risk assessors and managers to make quick and informed decisions on the potential health risks and determine appropriate responses (e.g., cleanup) to agent release. A final, stand-alone MS Access working version of the toxicity value database was completed in November 2007.

  10. Development of database management system for monitoring of radiation workers for actinides

    Kalyane, G.N.; Mishra, L.; Nadar, M.Y.; Singh, I.S.; Rao, D.D.

    2012-01-01

    Annually around 500 radiation workers are monitored for estimation of lung activities and internal dose due to Pu/Am and U from various divisions of Bhabha Atomic Research Centre (Trombay) and from the PREFRE and A3F facilities (Tarapur) in the lung counting laboratory located at the Bhabha Atomic Research Centre hospital, under the Routine and Special monitoring programmes. A 20 cm diameter phoswich and an array of HPGe detectors were used for this purpose. In case of positive contamination, workers are followed up and monitored using both detection systems in different geometries. Management of this huge amount of data becomes difficult, and therefore an easily retrievable database system containing all the relevant data of the monitored radiation workers was developed. Materials and methods: The database management system comprises three main modules integrated together: 1) an Apache server installed on a Windows (XP) platform (Apache version 2.2.17); 2) the MySQL database management system (MySQL version 5.5.8); 3) the PHP (PHP: Hypertext Preprocessor) programming language (PHP version 5.3.5). All three modules work together seamlessly as a single software program. Front-end user interaction is through a user-friendly and interactive local web page for which an Internet connection is not required. This front page has hyperlinks to many other pages, which offer different utilities to the user. The user has to log in using a username and password. Results and Conclusions: The database management system is used for entering, updating and managing the lung monitoring data of radiation workers. The program has the following utilities: bio-data entry for new subjects, editing of bio-data of old subjects (only one subject at a time), entry of counting data from that day's lung monitoring, retrieval of old records based on a number of parameters and filters such as date of counting, employee number, division, counts fulfilling a given criterion, etc., and calculation of MEQ CWT (Muscle Equivalent Chest Wall Thickness), energy
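
    As a rough illustration of the retrieval filters listed above (date of counting, employee number, division, counts meeting a criterion), the following sketch uses Python's sqlite3 in place of the production Apache/MySQL/PHP stack; the table and column names are hypothetical, not those of the BARC system.

```python
# Minimal sketch (not the BARC system): sqlite3 stands in for the MySQL store so
# the example is self-contained. Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE lung_counts (
        employee_no  TEXT,
        division     TEXT,
        count_date   TEXT,   -- ISO date of counting
        pu_am_counts REAL,   -- counts attributed to Pu/Am
        meq_cwt_mm   REAL    -- muscle-equivalent chest wall thickness
    )
""")
conn.execute("INSERT INTO lung_counts VALUES ('E1001', 'PREFRE', '2011-08-17', 152.0, 24.3)")

def retrieve(conn, division=None, date_from=None, min_counts=None):
    """Retrieve old records using the kinds of filters listed in the abstract."""
    query = "SELECT * FROM lung_counts WHERE 1=1"
    params = []
    if division:
        query += " AND division = ?"
        params.append(division)
    if date_from:
        query += " AND count_date >= ?"
        params.append(date_from)
    if min_counts is not None:
        query += " AND pu_am_counts >= ?"
        params.append(min_counts)
    return conn.execute(query, params).fetchall()

print(retrieve(conn, division="PREFRE", min_counts=100.0))
```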

  11. Fine-grained policy control in U.S. Army Research Laboratory (ARL) multimodal signatures database

    Bennett, Kelly; Grueneberg, Keith; Wood, David; Calo, Seraphin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) consists of a number of colocated relational databases representing a collection of data from various sensors. Role-based access to this data is granted to external organizations such as DoD contractors and other government agencies through a client Web portal. In the current MMSDB system, access control is only at the database and firewall level. In order to offer finer grained security, changes to existing user profile schemas and authentication mechanisms are usually needed. In this paper, we describe a software middleware architecture and implementation that allows fine-grained access control to the MMSDB at a dataset, table, and row level. Result sets from MMSDB queries issued in the client portal are filtered with the use of a policy enforcement proxy, with minimal changes to the existing client software and database. Before resulting data is returned to the client, policies are evaluated to determine if the user or role is authorized to access the data. Policies can be authored to filter data at the row, table or column level of a result set. The system uses various technologies developed in the International Technology Alliance in Network and Information Science (ITA) for policy-controlled information sharing and dissemination. Use of the Policy Management Library provides a mechanism for the management and evaluation of policies to support finer grained access to the data in the MMSDB system. The GaianDB is a policy-enabled, federated database that acts as a proxy between the client application and the MMSDB system.
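
    The row- and column-level filtering described above can be pictured with a toy policy filter like the sketch below; the policy structure, role names and fields are hypothetical and do not reflect the ITA Policy Management Library or GaianDB APIs.

```python
# Illustrative sketch only: a toy policy filter applied to a query result set,
# in the spirit of the row/column filtering described above. Roles, fields and
# the policy format are hypothetical.
POLICIES = {
    "contractor": {
        "allowed_columns": {"signature_id", "sensor_type", "timestamp"},
        "row_predicate": lambda row: row.get("classification") == "UNCLASSIFIED",
    },
    "government": {
        "allowed_columns": None,            # None means all columns
        "row_predicate": lambda row: True,  # no row restriction
    },
}

def enforce(role, rows):
    """Filter a result set (list of dicts) before it is returned to the client."""
    policy = POLICIES[role]
    filtered = []
    for row in rows:
        if not policy["row_predicate"](row):
            continue
        if policy["allowed_columns"] is None:
            filtered.append(dict(row))
        else:
            filtered.append({k: v for k, v in row.items() if k in policy["allowed_columns"]})
    return filtered

result_set = [
    {"signature_id": 1, "sensor_type": "acoustic", "timestamp": "2014-01-02", "classification": "UNCLASSIFIED"},
    {"signature_id": 2, "sensor_type": "seismic",  "timestamp": "2014-01-03", "classification": "FOUO"},
]
print(enforce("contractor", result_set))  # one row, three columns
```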

  12. Research in Hospitality Management: Contact

    Principal Contact. Dr Sjoerd A Gehrels Editor-in-Chief Stenden Hotel Management School, Academy of International Hospitality Research, Leeuwarden, The Netherlands Email: sjoerd.gehrels@stenden.com ...

  13. UK experience of managing a radioactive materials transport event database

    Barton, N.J.; Barrett, J.A.

    1999-01-01

    A description is given of the transport event database RAMTED and the related annual accident and incident reports. This database covers accidents and incidents involving the transport of radioactive material in the UK from 1958 to the present day. The paper discusses the history and content of the database, the origin of event data contained in it, the criteria for inclusion and future developments. (author)

  14. Indigenous Research on Chinese Management

    Li, Peter Ping; Leung, Kwok; Chen, Chao C.

    2012-01-01

    We attempt to provide a definition and a typology of indigenous research on Chinese management as well as outline the general methodological approaches for this type of research. We also present an integrative summary of the four articles included in this special issue and show how they illustrate...... our definition and typology of indigenous research on Chinese management, as well as the various methodological approaches we advocate. Further, we introduce a commentary on the four articles from the perspective of engaged scholarship, and also three additional articles included in this issue....... Finally, we conclude with our suggestions for future indigenous research....

  15. Waste management research abstracts. Information on radioactive waste management research in progress or planned. Vol. 30

    2005-11-01

    This issue contains 90 abstracts that describe research in progress in the field of radioactive waste management. The abstracts present ongoing work in various countries and international organizations. Although the abstracts are indexed by country, some programmes are actually the result of co-operation among several countries. Indeed, a primary reason for providing this compilation of programmes, institutions and scientists engaged in research into radioactive waste management is to increase international co-operation and facilitate communications. Data provided by researchers for publication in WMRA 30 were entered into a research in progress database named IRAIS (International Research Abstracts Information System). The IRAIS database is available via the Internet at the following URL: http://www.iaea.org/programmes/irais/ This database will continue to be updated as new abstracts are submitted by researchers world-wide. The abstracts are listed by country (full name) in alphabetical order. All abstracts are in English. The volume includes six indexes: principal investigator, title, performing organization, descriptors (key words), topic codes and country

  16. Ultra-Structure database design methodology for managing systems biology data and analyses

    Hemminger Bradley M

    2009-08-01

    Full Text Available Abstract Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find
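
    The rule-as-data idea can be illustrated with a minimal sketch in which behaviour is driven by rows in a generic "ruleform" table, so it can be changed by editing data rather than code; the schema and rules below are hypothetical and are not the authors' proteogenomic ruleforms.

```python
# Toy illustration of storing both data and processing rules as rows: end users
# change what the system does by editing the table, not the code. Schema and
# rules are hypothetical.
import fnmatch
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ruleform (rule_type TEXT, subject TEXT, verb TEXT, object TEXT)")
conn.executemany("INSERT INTO ruleform VALUES (?,?,?,?)", [
    ("file_handling", "*.mzML",  "parse_with", "mass_spec_loader"),
    ("file_handling", "*.gff3",  "parse_with", "genome_annotation_loader"),
    ("deduction",     "peptide", "maps_to",    "protein"),
    ("deduction",     "protein", "maps_to",    "gene"),
])

def loader_for(filename):
    """Pick a loader purely from rules stored in the database."""
    for pattern, loader in conn.execute(
            "SELECT subject, object FROM ruleform WHERE rule_type='file_handling'"):
        if fnmatch.fnmatch(filename, pattern):
            return loader
    return None

def deduce(start):
    """Follow 'maps_to' rules transitively (e.g. peptide -> protein -> gene)."""
    chain, current = [start], start
    while True:
        row = conn.execute(
            "SELECT object FROM ruleform WHERE rule_type='deduction' AND subject=?",
            (current,)).fetchone()
        if row is None:
            return chain
        current = row[0]
        chain.append(current)

print(loader_for("run42.mzML"))  # mass_spec_loader
print(deduce("peptide"))         # ['peptide', 'protein', 'gene']
```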

  17. Brassica ASTRA: an integrated database for Brassica genomic research.

    Love, Christopher G; Robinson, Andrew J; Lim, Geraldine A C; Hopkins, Clare J; Batley, Jacqueline; Barker, Gary; Spangenberg, German C; Edwards, David

    2005-01-01

    Brassica ASTRA is a public database for genomic information on Brassica species. The database incorporates expressed sequences with Swiss-Prot and GenBank comparative sequence annotation as well as secondary Gene Ontology (GO) annotation derived from the comparison with Arabidopsis TAIR GO annotations. Simple sequence repeat molecular markers are identified within resident sequences and mapped onto the closely related Arabidopsis genome sequence. Bacterial artificial chromosome (BAC) end sequences derived from the Multinational Brassica Genome Project are also mapped onto the Arabidopsis genome sequence enabling users to identify candidate Brassica BACs corresponding to syntenic regions of Arabidopsis. This information is maintained in a MySQL database with a web interface providing the primary means of interrogation. The database is accessible at http://hornbill.cspp.latrobe.edu.au.

  18. Towards efficient use of research resources: a nationwide database of ongoing primary care research projects in the Netherlands.

    Kortekaas, Marlous F; van de Pol, Alma C; van der Horst, Henriëtte E; Burgers, Jako S; Slort, Willemjan; de Wit, Niek J

    2014-04-01

    Purpose: Although in the last decades primary care research has evolved with great success, there is a growing need to prioritize the topics given the limited resources available. Therefore, we constructed a nationwide database of ongoing primary care research projects in the Netherlands, and we assessed whether the distribution of research topics matched primary care practice. We conducted a survey among the main primary care research centres in the Netherlands and gathered details of all ongoing primary care research projects. We classified the projects according to research topic, relation to professional guidelines and knowledge deficits, collaborative partners and funding source. Subsequently, we compared the frequency distribution of clinical topics of research projects to the prevalence of problems in primary care practice. We identified 296 ongoing primary care research projects from 11 research centres. Most projects were designed as randomized controlled trials (35%) or observational cohorts (34%), and most were government funded (60%). Thematically, most research projects addressed chronic diseases, mainly cardiovascular risk management (8%), depressive disorders (8%) and diabetes mellitus (7%). One-fifth of the projects was related to defined knowledge deficits in primary care guidelines. From a clinical primary care perspective, research projects on dermatological problems were significantly underrepresented (P = 0.01). This survey of ongoing projects demonstrates that primary care research has a firm basis in the Netherlands, with a strong focus on chronic disease. The fit with primary care practice can improve, and future research should address knowledge deficits in professional guidelines more.

  19. Growing dimensions. Spent fuel management at research reactors

    Ritchie, I.G.

    1998-01-01

    More than 550 nuclear research reactors are operating or shut down around the world. At many of these reactors, spent fuel from their operations is stored, pending decisions on its final disposition. In recent years, problems associated with this spent fuel storage have loomed larger in the international nuclear community. In efforts to determine the overall scope of problems and to develop a database on the subject, the IAEA has surveyed research reactor operators in its Member States. Information for the Research Reactor Spent Fuel Database (RRSFDB) so far has been obtained from a limited but representative number of research reactors. It supplements data already on hand in the Agency's more established Research Reactor Database (RRDB). Drawing upon these database resources, this article presents an overall picture of spent fuel management and storage at the world's research reactors, in the context of associated national and international programmes in the field

  20. A DICOM based radiotherapy plan database for research collaboration and reporting

    Westberg, J; Krogh, S; Brink, C; Vogelius, I R

    2014-01-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows the user to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
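
    For orientation, the sketch below computes the kind of cumulative DVH statistics such a report would contain (mean dose, V20Gy, D95%) from a synthetic dose grid and structure mask using NumPy; it is not the authors' .NET implementation and omits the DICOM parsing step (which a library such as pydicom could provide).

```python
# Generic DVH-statistic sketch on synthetic data; not the system described above.
import numpy as np

rng = np.random.default_rng(0)
dose = rng.uniform(0.0, 70.0, size=(50, 50, 50))   # dose grid in Gy (synthetic)
mask = np.zeros_like(dose, dtype=bool)
mask[20:30, 20:30, 20:30] = True                    # voxels of one structure

structure_dose = dose[mask]

def v_dose(doses, threshold_gy):
    """Fraction of the structure volume receiving at least threshold_gy."""
    return float((doses >= threshold_gy).mean())

def d_percent(doses, percent):
    """Dose that at least `percent`% of the structure volume receives."""
    return float(np.percentile(doses, 100.0 - percent))

print(f"mean dose: {structure_dose.mean():.1f} Gy")
print(f"V20Gy:     {100 * v_dose(structure_dose, 20.0):.1f} %")
print(f"D95%:      {d_percent(structure_dose, 95.0):.1f} Gy")
```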

  2. MonetDB: Two Decades of Research in Column-oriented Database Architectures

    S. Idreos (Stratos); F.E. Groffen (Fabian); N.J. Nes (Niels); S. Manegold (Stefan); K.S. Mullender (Sjoerd); M.L. Kersten (Martin)

    2012-01-01

    MonetDB is a state-of-the-art open-source column-store database management system targeting applications in need of analytics over large collections of data. MonetDB is actively used nowadays in health care, in telecommunications as well as in scientific databases and in data management

  3. Managing XML Data to optimize Performance into Object-Relational Databases

    Iuliana BOTHA

    2011-06-01

    Full Text Available This paper proposes some possibilities for managing XML data in order to optimize performance in object-relational databases. The possibility of storing XML data in such databases is detailed, using an Oracle database for exemplification, and some techniques for optimizing queries over XMLType tables, such as indexing and partitioning, are tested.
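
    As a generic, runnable stand-in for the optimization idea (the Oracle XMLType, indexing and partitioning setup itself is not reproduced here), the sketch below shreds frequently queried XML elements into indexed relational columns stored alongside the raw document; the element and column names are hypothetical.

```python
# Generic sketch: keep the full XML document, but extract ("shred") the elements
# that are queried most often into indexed columns so common lookups avoid
# reparsing the XML. Not Oracle-specific; names are hypothetical.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE xml_docs (
        doc_id      INTEGER PRIMARY KEY,
        customer_id TEXT,      -- extracted for fast filtering
        order_date  TEXT,      -- extracted for fast filtering
        xml_body    TEXT       -- full document kept for everything else
    )
""")
conn.execute("CREATE INDEX idx_customer ON xml_docs(customer_id)")
conn.execute("CREATE INDEX idx_date ON xml_docs(order_date)")

def insert_document(xml_text):
    root = ET.fromstring(xml_text)
    conn.execute(
        "INSERT INTO xml_docs (customer_id, order_date, xml_body) VALUES (?,?,?)",
        (root.findtext("customer/id"), root.findtext("date"), xml_text))

insert_document("<order><customer><id>C-17</id></customer><date>2011-06-02</date>"
                "<item sku='A1' qty='3'/></order>")

# The indexed columns answer the common query without touching the XML:
rows = conn.execute(
    "SELECT doc_id, xml_body FROM xml_docs WHERE customer_id = ?", ("C-17",)).fetchall()
print(rows)
```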

  4. The Government Finance Database: A Common Resource for Quantitative Research in Public Financial Analysis.

    Pierson, Kawika; Hand, Michael L; Thompson, Fred

    2015-01-01

    Quantitative public financial management research focused on local governments is limited by the absence of a common database for empirical analysis. While the U.S. Census Bureau distributes government finance data that some scholars have utilized, the arduous process of collecting, interpreting, and organizing the data has led its adoption to be prohibitive and inconsistent. In this article we offer a single, coherent resource that contains all of the government financial data from 1967-2012, uses easy to understand natural-language variable names, and will be extended when new data is available.

  5. Research in Institutional Economics in Management Science

    Foss, Kirsten; Foss, Nicolai Juul

    This report maps research in institutional economics in management science in the European Union for the 1995 to 2002 period. The report applies Internet searches based on a university listing, searches of journal databases, key informants and an internet-based survey. 195 researchers are identified....In (sub-)disciplinary terms, organization, strategy, corporate governance, and international business are the major areas of application of institutional economics ideas. In terms of countries, the EU strongholds are Holland, Denmark, the UK, and Germany. There is apparently no or very little relevant...... research in Ireland, Portugal, Luxembourg and Greece. Based on the findings of the report, it seems warranted to characterize the EU research effort in the field as being rather dispersed and uncoordinated. Thus, there are no specialized journals, associations or PhD courses. This state of affairs...

  6. Design of database management system for 60Co container inspection system

    Liu Jinhui; Wu Zhifang

    2007-01-01

    The functions of the database management system have been designed according to the features of the cobalt-60 container inspection system, and the related software has been constructed. Database querying and searching are included in the software. The database operation program is built on Microsoft SQL Server and Visual C++ under Windows 2000. The software provides database querying, image and graph display, statistics, report generation and printing, interface design, etc. The software is powerful and flexible for operation and information querying, and it has been successfully used in the real database management system of the cobalt-60 container inspection system. (authors)

  7. Data management for environmental research

    Strand, R.H.

    1976-01-01

    The objective of managing environmental research data is to develop a resource sufficient for the study and potential solution of environmental problems. Consequently, environmental data management must include a broad spectrum of activities ranging from statistical analysis and modeling, through data set archiving to computer hardware procurement. This paper briefly summarizes the data management requirements for environmental research and the techniques and automated procedures which are currently used by the Environmental Sciences Division at Oak Ridge National Laboratory. Included in these requirements are readily retrievable data, data indexed by categories for retrieval and application, data documentation (including collection methods), design and error bounds, easily used analysis and display programs, and file manipulation routines. The statistical analysis system (SAS) and other systems provide the automated procedures and techniques for analysis and management of environmental research data

  8. Contradictions in qualitative management research

    Hansen, Per Richard; Dorland, Jens

    2016-01-01

    and remove them from the analytical work. The purpose of this paper is to re-visit and re-introduce a dissensus-based management research strategy in order to analytically be able to work with what appear to be contradictions and misinformation in qualitative research accounts, and give them a more profound...

  9. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    2015-09-15

    ... arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming ... a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  10. Handbook of Collaborative Management Research

    Shani, A B Rami B; Pasmore, William A A; Stymne, Dr Bengt; Adler, Niclas

    2007-01-01

    This handbook provides the latest thinking, methodologies and cases in the rapidly growing area of collaborative management research. What makes collaborative management research different is its emphasis on creating a close partnership between scholars and practitioners in the search for knowledge concerning organizations and complex systems. In the ideal situation, scholars and their managerial partners would work together to define the research focus, develop the methods to be used for data collection, participate equally in the analysis of data, and work together in the application and dis

  11. Metabolonote: A wiki-based database for managing hierarchical metadata of metabolome analyses

    Takeshi Ara

    2015-04-01

    Full Text Available Metabolomics—technology for comprehensive detection of small molecules in an organism—lags behind the other omics in terms of publication and dissemination of experimental data. Among the reasons for this are difficulty precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and low reusability of the published data, resulting in the submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called TogoMD, with an ID system that is required for unique access to each level of the tree-structured metadata such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data and permission to attach related information to the metadata provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data, but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.
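
    The tree-structured metadata with an ID at each level (study purpose, sample, analytical method, data analysis) might be pictured as in the sketch below; the ID scheme and field names are hypothetical and are not the actual TogoMD specification.

```python
# Hypothetical sketch of per-level IDs over tree-structured metadata; not the
# real TogoMD format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataAnalysis:
    id: str                      # e.g. "SE1_S01_M01_D01"
    description: str

@dataclass
class AnalyticalMethod:
    id: str                      # e.g. "SE1_S01_M01"
    instrument: str
    analyses: List[DataAnalysis] = field(default_factory=list)

@dataclass
class Sample:
    id: str                      # e.g. "SE1_S01"
    organism: str
    methods: List[AnalyticalMethod] = field(default_factory=list)

@dataclass
class Study:
    id: str                      # e.g. "SE1"
    purpose: str
    samples: List[Sample] = field(default_factory=list)

study = Study(
    id="SE1",
    purpose="Metabolite profiling of tomato fruit ripening",
    samples=[Sample(
        id="SE1_S01", organism="Solanum lycopersicum",
        methods=[AnalyticalMethod(
            id="SE1_S01_M01", instrument="LC-QTOF-MS",
            analyses=[DataAnalysis(id="SE1_S01_M01_D01",
                                   description="Peak picking and alignment")])])])

def find(node, target_id):
    """Resolve any level of the tree by its unique ID."""
    if node.id == target_id:
        return node
    for child in (getattr(node, "samples", []) + getattr(node, "methods", [])
                  + getattr(node, "analyses", [])):
        hit = find(child, target_id)
        if hit:
            return hit
    return None

print(find(study, "SE1_S01_M01").instrument)
```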

  12. Knowledge production status of Iranian researchers in the gastric cancer area: based on the medline database.

    Ghojazadeh, Morteza; Naghavi-Behzad, Mohammad; Nasrolah-Zadeh, Raheleh; Bayat-Khajeh, Parvaneh; Piri, Reza; Mirnia, Keyvan; Azami-Aghdash, Saber

    2014-01-01

    Scientometrics is a useful method for management of financial and human resources and has been applied many times in medical sciences during recent years. The aim of this study was to investigate the status of science production by Iranian scientists in the gastric cancer field based on the Medline database. In this descriptive cross-sectional study, Iranian science production concerning gastric cancer during 2000-2011 was investigated based on Medline. After two stages of searching, 121 articles were found; we then reviewed publication date, authors' names, journal title, impact factor (IF), and the cooperation coefficient between researchers. SPSS 19 was used for statistical analysis. There was a significant increase in published articles about gastric cancer by Iranian researchers in the Medline database during 2006-2011. The mean cooperation coefficient between researchers was 6.14±3.29 persons per article. Articles in this field were published in 19 countries and 56 journals. Those based in Thailand, England, and America had the most published Iranian articles. Tehran University of Medical Sciences and Mohammadreza Zali had the most outstanding role in publishing scientific articles. According to the results of this study, improving cooperation of researchers in conducting research and scientometric studies about other fields may have an important role in increasing both the quality and quantity of published studies.

  13. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  14. Collaborative Research and Behavioral Management

    Schapiro, Steve; Brosnan, Sarah F.; Hopkins, William D

    2017-01-01

    The behavioral management of captive nonhuman primates (NHPs) can be significantly enhanced through synergistic relationships with noninvasive research projects. Many behavioral and cognitive research procedures are challenging and enriching (physically, cognitively, and/or socially......) for the animals (Hopper et al. 2016; Hopkins and Latzman 2017) without involving any invasive (surgical, biopsy, etc.) procedures. Noninvasive behavioral research programs present the primates with opportunities to choose to voluntarily participate (or not), providing them with greater control over...

  15. Development of a database system for the management of non-treated radioactive waste

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso, E-mail: ajp@cdtn.br, E-mail: cbf@cdtn.br, E-mail: vc@cdtn.br, E-mail: pos@cdtn.br, E-mail: seless@cdtn.br, E-mail: hauczmj@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until the treatment is performed. The information about the waste is registered, and the data about the waste must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, to achieve this goal, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of solid and liquid radioactive wastes stored in CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate the retrieval of the waste history and location and the contents of the waste packages. The database will also be useful for grouping the waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to change of operators will be avoided. (author)
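
    A minimal sketch of the grouping idea mentioned above (the real system is an MS Access® database) is shown below using sqlite3; the schema and values are hypothetical.

```python
# Hypothetical relational table of waste packages plus a grouping query that
# collects packages with similar characteristics to plan a common treatment.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE waste_package (
        package_id    TEXT PRIMARY KEY,
        physical_form TEXT,     -- 'solid' or 'liquid'
        radionuclide  TEXT,
        activity_bq   REAL,
        location      TEXT
    )
""")
conn.executemany("INSERT INTO waste_package VALUES (?,?,?,?,?)", [
    ("P-001", "solid",  "Cs-137", 4.2e6, "DRNT-A1"),
    ("P-002", "solid",  "Cs-137", 1.1e6, "DRNT-A2"),
    ("P-003", "liquid", "H-3",    9.0e4, "DRNT-B1"),
])

# Group packages with similar characteristics to identify a common treatment.
for form, nuclide, n, total in conn.execute("""
        SELECT physical_form, radionuclide, COUNT(*), SUM(activity_bq)
        FROM waste_package
        GROUP BY physical_form, radionuclide
        ORDER BY physical_form"""):
    print(f"{form:6s} {nuclide:7s} packages={n} total_activity={total:.2e} Bq")
```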

  17. Health technology management: a database analysis as support of technology managers in hospitals.

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills, are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services exclusively referring to the maintenance database from the CE department at the Careggi Hospital in Florence, Italy.

  18. The Erasmus insurance case and a related questionnaire for distributed database management systems

    S.C. van der Made-Potuijt

    1990-01-01

    This is the third report concerning transaction management in the database environment. In the first report the role of the transaction manager in protecting the integrity of a database has been studied [van der Made-Potuijt 1989]. In the second report a model has been given for a

  19. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  20. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  1. Database application research in real-time data access of accelerator control system

    Chen Guanghua; Chen Jianfeng; Wan Tianmin

    2012-01-01

    The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system that involves many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing the various dedicated data structures with a mature, standardized database system is the future development direction of accelerator control systems. Based on database interface technology, real-time data access testing and system optimization research, this article discusses the application feasibility of database systems in accelerators and lays the foundation for the wide-scale application of database systems in the SSRF accelerator control system. (authors)

  2. Research on reliability management systems for Nuclear Power Plant

    Maki, Nobuo

    2000-01-01

    Investigation on a reliability management system for Nuclear Power Plants (NPPs) has been performed on national and international archived documents as well as on current status of studies at Idaho National Engineering and Environmental Laboratory (INEEL), US NPPs (McGuire, Seabrook), a French NPP (St. Laurent-des-Eaux), Japan Atomic Energy Research Institute (JAERI), Central Research Institute of Electric Power Industries (CRIEPI), and power plant manufacturers in Japan. As a result of the investigation, the following points were identified: (i) A reliability management system is composed of a maintenance management system to inclusively manage maintenance data, and an anomalies information and reliability data management system to extract data from maintenance results stored in the maintenance management system and construct a reliability database. (ii) The maintenance management system, which is widely-used among NPPs in the US and Europe, is an indispensable system for the increase of maintenance reliability. (iii) Maintenance management methods utilizing reliability data like Reliability Centered Maintenance are applied for NPP maintenance in the US and Europe, and contributing to cost saving. Maintenance templates are effective in the application process. In addition, the following points were proposed on the design of the system: (i) A detailed database on specifications of facilities and components is necessary for the effective use of the system. (ii) A demand database is indispensable for the application of the methods. (iii) Full-time database managers are important to maintain the quality of the reliability data. (author)

  3. Grantees Guide for Research Database at IDRC (English)

    Commercial databases: conditions of use. These resources are governed by license agreements which restrict use to IDRC employees and grantees taking part in projects funded by IDRC. It is the responsibility of each user to use these products only for individual, noncommercial use without systematically downloading ...

  4. Research and Implementation of Distributed Database HBase Monitoring System

    Guo Lisi

    2017-01-01

    Full Text Available With the arrival of the big data age, the distributed database HBase has become an important tool for storing massive data. The normal operation of the HBase database is an important guarantee of the security of data storage, so designing a reasonable HBase monitoring system is of great practical significance. In this article, we introduce a solution, containing performance monitoring and fault alarm modules, that meets an operator's need to monitor HBase databases in actual production projects. We designed a monitoring system which consists of a flexible and extensible monitoring agent, a monitoring server based on the SSM architecture, and a concise monitoring display layer. Moreover, in order to deal with the problem of pages rendering too slowly in actual operation, we present a solution: reducing the number of SQL queries. It has been proved that reducing SQL queries can effectively improve system performance and user experience. The system works well in monitoring the status of the HBase database, flexibly extending the monitoring indices, and issuing a warning when a fault occurs, so that it is able to improve the working efficiency of the administrator and ensure the smooth operation of the project.
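
    The "reduce the SQL queries" point can be illustrated as below: one query per region server (an N+1 pattern) versus a single grouped query for the whole dashboard page; the table, columns and sqlite3 stand-in are hypothetical, not the system's actual store.

```python
# Illustration only: per-server queries in a loop versus one aggregated query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE region_metrics (server TEXT, metric TEXT, value REAL)")
conn.executemany("INSERT INTO region_metrics VALUES (?,?,?)", [
    ("rs-01", "read_requests", 1200), ("rs-01", "write_requests", 300),
    ("rs-02", "read_requests",  950), ("rs-02", "write_requests", 410),
])

# Slow page rendering: one query per server (N+1 pattern).
servers = [row[0] for row in conn.execute("SELECT DISTINCT server FROM region_metrics")]
slow = {s: conn.execute(
            "SELECT SUM(value) FROM region_metrics WHERE server=?", (s,)).fetchone()[0]
        for s in servers}

# Faster: one aggregated query for the whole dashboard page.
fast = dict(conn.execute(
    "SELECT server, SUM(value) FROM region_metrics GROUP BY server"))

assert slow == fast
print(fast)
```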

  5. On the use of databases about research performance

    Rodela, Romina

    2016-01-01

    The accuracy of interdisciplinarity measurements depends on how well the data is used for this purpose and whether it can meaningfully inform about work that crosses disciplinary domains. At present, there are no ad hoc databases compiling information only and exclusively about interdisciplinary

  6. Dynamic tables: an architecture for managing evolving, heterogeneous biomedical data in relational database management systems.

    Corwin, John; Silberschatz, Avi; Miller, Perry L; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models.
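
    The contrast between the conventional design and the entity-attribute-value (EAV) layout can be sketched as follows; this is only an illustration of the modeling trade-off, not the authors' sparse column-store engine.

```python
# Simplified contrast: a conventional wide table (one column per attribute)
# versus an EAV layout that tolerates sparse data and schema evolution.
import sqlite3

conn = sqlite3.connect(":memory:")

# Conventional design: adding a new clinical attribute means ALTER TABLE.
conn.execute("CREATE TABLE patient_wide (patient_id TEXT, heart_rate REAL, glucose REAL)")

# EAV design: new attributes are just new rows.
conn.execute("CREATE TABLE patient_eav (patient_id TEXT, attribute TEXT, value TEXT)")
conn.executemany("INSERT INTO patient_eav VALUES (?,?,?)", [
    ("p1", "heart_rate", "72"),
    ("p1", "glucose", "5.4"),
    ("p2", "heart_rate", "88"),
    ("p2", "genotype_rs123", "AG"),   # sparse attribute, no schema change needed
])

# Pivoting EAV rows back into a wide record for one patient:
row = dict(conn.execute(
    "SELECT attribute, value FROM patient_eav WHERE patient_id = ?", ("p1",)))
print(row)   # {'heart_rate': '72', 'glucose': '5.4'}
```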

  7. National information network and database system of hazardous waste management in China

    Ma Hongchang [National Environmental Protection Agency, Beijing (China)]

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  8. Outcomes research in amyotrophic lateral sclerosis: lessons learned from the amyotrophic lateral sclerosis clinical assessment, research, and education database.

    Miller, Robert G; Anderson, Fred; Brooks, Benjamin Rix; Mitsumoto, Hiroshi; Bradley, Walter G; Ringel, Steven P

    2009-01-01

    To examine the care of patients with ALS following the publication of the standardized recommendations for the management of patients with amyotrophic lateral sclerosis (ALS) published in 1999 by the American Academy of Neurology. Specific aspects of ALS patient management have been evaluated serially using a national Amyotrophic Lateral Sclerosis Clinical Assessment, Research, and Education (ALS CARE) database to encourage compliance with these recommendations and to assure continuing quality improvement. The most recent analysis of 5,600 patients shows interesting epidemiological observations and treatment trends. Proper management of many ALS symptoms has increased substantially since the first publication of the guidelines, and awareness of pseudobulbar affect has increased. Other recommendations are underutilized: Only 9% undergo percutaneous endoscopic gastrostomy, although this procedure was recommended in 22% of patients; and noninvasive positive pressure ventilation was used by only 21% of patients despite being associated with improved 5-year survival rates. This observational database has been a useful tool in monitoring compliance with the standard of care for patients with ALS and may have resulted in greater adherence to guidelines.

  9. Annual report of the Management Research Center

    1987-01-01

    Research on the management of new forms of automation; industrial management; the definition of a new product range; economic management; personnel management; and management of cultural enterprises is presented [fr

  10. StreetTiVo: Using a P2P XML Database System to Manage Multimedia Data in Your Living Room

    Zhang, Ying; de Vries, A.P.; Boncz, P.; Hiemstra, Djoerd; Ordelman, Roeland J.F.; Li, Qing; Feng, Ling; Pei, Jian; Wang, Sean X.

    StreetTiVo is a project that aims at bringing research results into the living room; in particular, a mix of current results in the areas of Peer-to-Peer XML Database Management System (P2P XDBMS), advanced multimedia analysis techniques, and advanced information retrieval techniques. The project

  11. KALIMER database development

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database for the integrated management of liquid metal reactor design technology development, using Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid-term and long-term nuclear R&D program. The IOC is a linkage control system between sub-projects used to share and integrate the research results for KALIMER. The 3D CAD database provides a schematic overview of the KALIMER design structure. The reserved documents database was developed to manage the documents and reports produced over the course of the project.

  13. Hydrometeorological Database (HMDB) for Practical Research in Ecology

    Novakovskiy, A; Elsakov, V

    2014-01-01

    The regional HydroMeteorological DataBase (HMDB) was designed for easy access to climate data via the Internet. It contains data on various climatic parameters (temperature, precipitation, pressure, humidity, and wind strength and direction) from 190 meteorological stations in Russia and bordering countries for a period of instrumental observations of over 100 years. Open sources were used to ingest data into HMDB. An analytical block was also developed to perform the most common statistical ...

  14. FmMDb: a versatile database of foxtail millet markers for millets and bioenergy grasses research.

    Venkata Suresh B

    Full Text Available The prominent attributes of foxtail millet (Setaria italica L. including its small genome size, short life cycle, inbreeding nature, and phylogenetic proximity to various biofuel crops have made this crop an excellent model system to investigate various aspects of architectural, evolutionary and physiological significances in Panicoid bioenergy grasses. After release of its whole genome sequence, large-scale genomic resources in terms of molecular markers were generated for the improvement of both foxtail millet and its related species. Hence it is now essential to congregate, curate and make available these genomic resources for the benefit of researchers and breeders working towards crop improvement. In view of this, we have constructed the Foxtail millet Marker Database (FmMDb; http://www.nipgr.res.in/foxtail.html, a comprehensive online database for information retrieval, visualization and management of large-scale marker datasets with unrestricted public access. FmMDb is the first database which provides complete marker information to the plant science community attempting to produce elite cultivars of millet and bioenergy grass species, thus addressing global food insecurity.

  15. Information flow in the DAMA project beyond database managers: information flow managers

    Russell, Lucian; Wolfson, Ouri; Yu, Clement

    1996-12-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
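
    The bill-of-materials explosion at several levels of detail can be sketched as a simple recursive demand propagation; the product structure and quantities below are hypothetical.

```python
# Sketch of a bill-of-materials explosion: point-of-sale demand for a finished
# sewn product is propagated down to raw-material requirements. Data are
# hypothetical.
BOM = {
    # parent: [(child, units of child per unit of parent), ...]
    "dress_shirt": [("cut_panels", 1), ("buttons", 8)],
    "cut_panels":  [("woven_fabric_m2", 1.6)],
    "woven_fabric_m2": [("yarn_kg", 0.25)],
    "yarn_kg":     [("fiber_kg", 1.05)],
}

def explode(item, quantity, demand=None):
    """Accumulate demand for every component below `item` in the BOM."""
    if demand is None:
        demand = {}
    for child, per_unit in BOM.get(item, []):
        demand[child] = demand.get(child, 0) + quantity * per_unit
        explode(child, quantity * per_unit, demand)
    return demand

# Latest point-of-sale estimate: 10,000 shirts across the retail outlets.
print(explode("dress_shirt", 10_000))
# e.g. woven fabric = 16,000 m2, yarn = 4,000 kg, fiber = 4,200 kg
```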

  16. A user's manual for the database management system of impact property

    Ryu, Woo Seok; Park, S. J.; Kong, W. S.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the impact database system, which manages impact property test data. The database constructed from the data produced by impact property tests can increase the application of the test results. Basic data can easily be retrieved from the database when preparing a new experiment, and better results can be produced by comparison with previous data. To develop the database, the application must be carefully analyzed and designed; after that, the best quality can be offered for customers' various requirements. The impact database system was developed as an Internet application using the JSP (JavaServer Pages) tool

  17. Research in Hospitality Management: Submissions

    Before submitting a manuscript, authors should peruse and consult a recent issue of the Journal for format and style. ... The submission of a manuscript by the authors implies that they automatically agree to assign exclusive copyright to the publishers of the Research in Hospitality Management, NISC (Pty) Ltd. There are no ...

  18. ASFA database: A tool for marine science researchers

    Tapaswi, M.P.

    Full text source: Trg_Course_Coastal_Zone_Manage_1993_65.pdf (abstract not available)

  19. Research on spatio-temporal database techniques for spatial information service

    Zhao, Rong; Wang, Liang; Li, Yuxiang; Fan, Rongshuang; Liu, Ping; Li, Qingyuan

    2007-06-01

    Geographic data should be described by spatial, temporal and attribute components, but spatio-temporal queries are difficult to answer within current GIS. This paper describes research into the development and application of a spatio-temporal data management system based on the GeoWindows GIS software platform developed by the Chinese Academy of Surveying and Mapping (CASM). Facing the current practical requirements of spatial information applications, and building on the existing GIS platform, a spatio-temporal data model that integrates vector and grid data was established first. Secondly, we solved the key problem of building temporal data topology and developed a spatio-temporal database management system using object-oriented methods. The system provides temporal data collection, storage, management, display and query functions. Finally, as a case study, we explored the application of the spatio-temporal data management system using administrative region data from multiple historical periods of China as the basic data. With these efforts, the capacity of the GIS to manage and manipulate temporal and attribute data has been enhanced, and a technical reference has been provided for the further development of temporal geographic information systems (TGIS).
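
    As a rough illustration of the kind of spatio-temporal query this work addresses, the sketch below stores features with a validity interval and answers a "what covered this point at time T" question. It uses SQLite with a plain bounding-box test rather than the GeoWindows platform or the authors' integrated vector/grid model; all table and column names are assumptions.

import sqlite3

# Toy spatio-temporal table: each feature version carries a bounding box and a
# validity interval [valid_from, valid_to). Names and values are illustrative.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE region_version (
    region_id TEXT, name TEXT,
    xmin REAL, ymin REAL, xmax REAL, ymax REAL,
    valid_from TEXT, valid_to TEXT)""")
con.executemany(
    "INSERT INTO region_version VALUES (?,?,?,?,?,?,?,?)",
    [("R1", "District A (old boundary)", 0, 0, 10, 10, "1950-01-01", "1980-01-01"),
     ("R1", "District A (new boundary)", 0, 0, 12, 11, "1980-01-01", "9999-12-31")])

def regions_at(point, date):
    """Return the region versions covering `point` that were valid on `date`."""
    x, y = point
    rows = con.execute(
        """SELECT name FROM region_version
           WHERE xmin <= ? AND ? <= xmax AND ymin <= ? AND ? <= ymax
             AND valid_from <= ? AND ? < valid_to""",
        (x, x, y, y, date, date))
    return [r[0] for r in rows]

print(regions_at((11, 5), "1975-06-01"))  # [] : outside the old boundary
print(regions_at((11, 5), "1995-06-01"))  # ['District A (new boundary)']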

  20. Management of research and development project

    Go, Seok Hwa; Hong Jeong Yu; Hyun, Byeong Hwan

    2010-12-01

    This book presents a summary of research and development project management: preparation of R&D through investigation and analysis of papers, patents and technology trends; project structure; management models; general project management; and the management of project scope, time, cost, quality, manpower, communication, risk and procurement; as well as the management of R&D outcomes, patent application and registration, and technology transfer.

  1. Knowledge Management through a Fully Extensible, Schema Independent, XML Database

    Direen, H

    2001-01-01

    ... (databases in particular) is that the context must be predefined. In a field that is developing as fast as bioinformatics, it is as impossible to predefine all of the context as it is to predefine all of the data that is being...

  2. Knowledge Based Engineering for Spatial Database Management and Use

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  3. USDA food and nutrient databases provide the infrastructure for food and nutrition research, policy, and practice.

    Ahuja, Jaspreet K C; Moshfegh, Alanna J; Holden, Joanne M; Harris, Ellen

    2013-02-01

    The USDA food and nutrient databases provide the basic infrastructure for food and nutrition research, nutrition monitoring, policy, and dietary practice. They have had a long history that goes back to 1892 and are unique, as they are the only databases available in the public domain that perform these functions. There are 4 major food and nutrient databases released by the Beltsville Human Nutrition Research Center (BHNRC), part of the USDA's Agricultural Research Service. These include the USDA National Nutrient Database for Standard Reference, the Dietary Supplement Ingredient Database, the Food and Nutrient Database for Dietary Studies, and the USDA Food Patterns Equivalents Database. The users of the databases are diverse and include federal agencies, the food industry, health professionals, restaurants, software application developers, academia and research organizations, international organizations, and foreign governments, among others. Many of these users have partnered with BHNRC to leverage funds and/or scientific expertise to work toward common goals. The use of the databases has increased tremendously in the past few years, especially the breadth of uses. These new uses of the data are bound to increase with the increased availability of technology and public health emphasis on diet-related measures such as sodium and energy reduction. Hence, continued improvement of the databases is important, so that they can better address these challenges and provide reliable and accurate data.

  4. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  5. Delivering research output to the user using ICT services: Marine contamination database web interface

    Abdul Muin Abdul Rahman; Abdul Khalik Wood; Zaleha Hashim; Burhanuddin Ahmad; Saaidi Ismail; Mohamad Safuan Sulaiman; Md Suhaimi Elias

    2010-01-01

    This project concerns the development of a web-based interface for accessing the Marine Contamination database records. The system contains information on the occurrence of contaminants and natural elements in the marine ecosystem, based on sediment, seawater and marine biota samples taken at various locations within the shores of Malaysia. It represents a systematic approach to recording, storing and managing the large amount of marine environmental data collected as output of the Marine Contamination and Transport Phenomena Research Project since 1990. The resulting collection of data forms the background information (or baseline data) which can later be used to monitor the level of marine environmental pollution around the country. Data collected from the various sampling and related laboratory activities were previously kept in conventional forms such as Excel worksheets and other documents, in digital and/or paper form. With the help of modern database storage and retrieval techniques, the storage and retrieval of data has been made easier and more manageable, and easy access can be provided to other parties interested in the data. (author)
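
    A minimal sketch of the kind of record structure such a web-accessible baseline database might expose is shown below. The table layout, element names and concentration values are hypothetical and do not reflect the actual Marine Contamination database.

import sqlite3

# Hypothetical layout: one row per element concentration measured in a sample.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sample_result (
    sample_id TEXT, location TEXT, matrix TEXT,   -- sediment / seawater / biota
    collected TEXT, element TEXT, concentration REAL, unit TEXT)""")
con.executemany("INSERT INTO sample_result VALUES (?,?,?,?,?,?,?)", [
    ("S-001", "Strait of Malacca", "sediment", "1992-03-14", "Pb", 21.5, "mg/kg"),
    ("S-002", "Strait of Malacca", "seawater", "1992-03-14", "Pb", 0.8, "ug/L"),
    ("S-003", "South China Sea",   "sediment", "1993-07-02", "Pb", 18.2, "mg/kg"),
])

# Baseline summary: mean concentration per location and matrix for one element.
for row in con.execute(
        """SELECT location, matrix, AVG(concentration), unit
           FROM sample_result WHERE element = 'Pb'
           GROUP BY location, matrix, unit"""):
    print(row)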

  6. The Camden & Islington Research Database: Using electronic mental health records for research.

    Werbeloff, Nomi; Osborn, David P J; Patel, Rashmi; Taylor, Matthew; Stewart, Robert; Broadbent, Matthew; Hayes, Joseph F

    2018-01-01

    Electronic health records (EHRs) are widely used in mental health services. Case registers using EHRs from secondary mental healthcare have the potential to deliver large-scale projects evaluating mental health outcomes in real-world clinical populations. We describe the Camden and Islington NHS Foundation Trust (C&I) Research Database which uses the Clinical Record Interactive Search (CRIS) tool to extract and de-identify routinely collected clinical information from a large UK provider of secondary mental healthcare, and demonstrate its capabilities to answer a clinical research question regarding time to diagnosis and treatment of bipolar disorder. The C&I Research Database contains records from 108,168 mental health patients, of which 23,538 were receiving active care. The characteristics of the patient population are compared to those of the catchment area, of London, and of England as a whole. The median time to diagnosis of bipolar disorder was 76 days (interquartile range: 17-391) and median time to treatment was 37 days (interquartile range: 5-194). Compulsory admission under the UK Mental Health Act was associated with shorter intervals to diagnosis and treatment. Prior diagnoses of other psychiatric disorders were associated with longer intervals to diagnosis, though prior diagnoses of schizophrenia and related disorders were associated with decreased time to treatment. The CRIS tool, developed by the South London and Maudsley NHS Foundation Trust (SLaM) Biomedical Research Centre (BRC), functioned very well at C&I. It is reassuring that data from different organizations deliver similar results, and that applications developed in one Trust can then be successfully deployed in another. The information can be retrieved in a quicker and more efficient fashion than more traditional methods of health research. The findings support the secondary use of EHRs for large-scale mental health research in naturalistic samples and settings investigated across large
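
    The interval statistics reported above (median days to diagnosis or treatment with an interquartile range) can be computed from de-identified event dates in a few lines. The sketch below assumes hypothetical (first contact, diagnosis) date pairs; it is not the CRIS extraction pipeline itself.

from datetime import date
from statistics import median, quantiles

# Hypothetical de-identified records: (first contact date, bipolar diagnosis date).
records = [
    (date(2012, 1, 10), date(2012, 2, 1)),
    (date(2013, 5, 3),  date(2014, 1, 20)),
    (date(2015, 9, 14), date(2015, 9, 30)),
    (date(2016, 2, 2),  date(2016, 6, 1)),
]

days_to_diagnosis = [(dx - first).days for first, dx in records]
q1, q2, q3 = quantiles(days_to_diagnosis, n=4)   # quartile cut points
print(f"median: {median(days_to_diagnosis)} days (IQR {q1:.0f}-{q3:.0f})")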

  7. Data management and database framework for the MICE experiment

    Martyniak, J.; Nebrensky, J. J.; Rajaram, D.; MICE Collaboration

    2017-10-01

    The international Muon Ionization Cooling Experiment (MICE) currently operating at the Rutherford Appleton Laboratory in the UK, is designed to demonstrate the principle of muon ionization cooling for application to a future Neutrino Factory or Muon Collider. We present the status of the framework for the movement and curation of both raw and reconstructed data. A raw data-mover has been designed to safely upload data files onto permanent tape storage as soon as they have been written out. The process has been automated, and checks have been built in to ensure the integrity of data at every stage of the transfer. The data processing framework has been recently redesigned in order to provide fast turnaround of reconstructed data for analysis. The automated reconstruction is performed on a dedicated machine in the MICE control room and any reprocessing is done at Tier-2 Grid sites. In conjunction with this redesign, a new reconstructed-data-mover has been designed and implemented. We also review the implementation of a robust database system that has been designed for MICE. The processing of data, whether raw or Monte Carlo, requires accurate knowledge of the experimental conditions. MICE has several complex elements ranging from beamline magnets to particle identification detectors to superconducting magnets. A Configuration Database, which contains information about the experimental conditions (magnet currents, absorber material, detector calibrations, etc.) at any given time has been developed to ensure accurate and reproducible simulation and reconstruction. A fully replicated, hot-standby database system has been implemented with a firewall-protected read-write master running in the control room, and a read-only slave running at a different location. The actual database is hidden from end users by a Web Service layer, which provides platform and programming language-independent access to the data.
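
    The idea of hiding the configuration database behind a web service layer can be sketched as a thin read-only client. The endpoint URL, query parameters and JSON layout below are purely illustrative assumptions, not the actual MICE interface.

import json
import urllib.request

# Illustrative read-only client for a configuration web service. The URL,
# query parameters and response format are assumptions, not the MICE API.
BASE_URL = "https://example.org/configdb"   # hypothetical endpoint

def get_calibration(detector, run_number):
    """Fetch the calibration record valid for a given detector and run."""
    url = f"{BASE_URL}/calibration?detector={detector}&run={run_number}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    try:
        cal = get_calibration("tracker", 8681)
        print("calibration id:", cal.get("id"))
    except OSError as exc:   # the hypothetical endpoint is not reachable
        print("config service unavailable:", exc)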

  8. The research of network database security technology based on web service

    Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin

    2013-03-01

    Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and the security levels of network databases, studies network database security technology, analyses a sub-key encryption algorithm in detail, and applies this algorithm successfully to a campus one-card system. The realization process of the encryption algorithm is discussed; the method is widely applicable, particularly to management information system security and e-commerce.
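
    The sub-key idea, deriving a separate key per field or per record from a master key so that exposure of one column does not expose the rest, can be illustrated generically. The sketch below derives sub-keys with HMAC and XORs an HMAC-counter keystream over the value; it illustrates the principle only, is not the algorithm analysed in the paper, and is not production-grade cryptography.

import hashlib
import hmac

MASTER_KEY = b"demo master key - do not use in production"

def derive_subkey(master, table, column, record_id):
    """Derive a per-field sub-key from the master key (illustrative KDF)."""
    label = f"{table}:{column}:{record_id}".encode()
    return hmac.new(master, label, hashlib.sha256).digest()

def xor_keystream(subkey, data):
    """Encrypt/decrypt by XOR with an HMAC-counter keystream (demo only)."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        block = hmac.new(subkey, counter.to_bytes(4, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

subkey = derive_subkey(MASTER_KEY, "card_holder", "id_number", 1042)
cipher = xor_keystream(subkey, b"199901234567")
print(cipher.hex())
print(xor_keystream(subkey, cipher))   # same operation decrypts: b'199901234567'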

  9. A survey of the use of database management systems in accelerator projects

    Poole, John; Strubin, Pierre M

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accele...

  10. Discussions about acceptance of the free software for management and creation of referencial database for papers

    Flavio Ribeiro Córdula

    2016-03-01

    Full Text Available Objective. This research aimed to determine, by means of the Technology Acceptance Model (TAM), the degree of acceptance of the developed software, which allows the construction and management of referential databases of scientific articles and is aimed at assisting in the dissemination and retrieval of scientific production stored in digital media. Method. The research is characterized as quantitative, since the TAM, which guided this study, is essentially quantitative. A questionnaire developed according to TAM guidelines was used as the data collection tool. Results. It was possible to verify that this software, despite the need for the fixes and improvements inherent to this type of tool, obtained a relevant degree of acceptance in the sample studied. Considerations. It should also be noted that, although this research was directed at scholars in the field of information science, the idea that justified the creation of the software could contribute to the development of science in any field of knowledge by optimizing the results that a search conducted in a specialized database can provide.

  11. The Prototype Automated Research Management System (ARMS)

    Prekop, Paul

    2004-01-01

    Automated Research Management System (ARMS) is a knowledge management application designed to address many of the knowledge management problems identified by SmartWays and FASSP's Knowledge Management Review...

  12. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  13. Report on the first Twente Data Management Workshop on XML Databases and Information Retrieval

    Hiemstra, Djoerd; Mihajlovic, V.

    2004-01-01

    The Database Group of the University of Twente initiated a new series of workshops called Twente Data Management workshops (TDM), starting with one on XML Databases and Information Retrieval which took place on 21 June 2004 at the University of Twente. We have set ourselves two goals for the

  14. Management of research reactor ageing

    1995-03-01

    As of December 1993, about one quarter of the operating research reactors were over 30 years old. The long life of research reactors has raised some concern amongst research reactor operators, regulators and, to some extent, the general public. The International Atomic Energy Agency commenced activities on the topic of research reactor ageing by appointing an internal working group in 1988 and convening a Consultants Meeting in 1989. The subject was also discussed at an international symposium and a regional seminar held in 1989 and 1992 respectively. A draft document incorporating information and experience exchanged at the above meetings was reviewed by a Technical Committee Meeting held in Vienna in 1992. The present TECDOC is the outcome of this meeting and contains recommendations, guidelines and information on the management of research reactor ageing, which should be used in conjunction with related publications of the IAEA Research Reactor Safety Programme, which are referenced throughout the text. This TECDOC will be of interest to operators and regulators involved with the safe operation of any type of research reactor to (a) understand the behaviour and influence of ageing mechanisms on the reactor structures, systems and components; (b) detect and assess the effect of ageing; (c) establish preventive and corrective measures to mitigate these effects; and (d) make decisions aimed at the safe and continued operation of a research reactor. 32 refs, tabs

  16. Romanian contribution to research infrastructure database for EPOS

    Ionescu, Constantin; Craiu, Andreea; Tataru, Dragos; Balan, Stefan; Muntean, Alexandra; Nastase, Eduard; Oaie, Gheorghe; Asimopolos, Laurentiu; Panaiotu, Cristian

    2014-05-01

    European Plate Observation System - EPOS is a long-term plan to facilitate the integrated use of data, models and facilities from mainly existing, but also new, distributed research infrastructures for solid Earth science. In the EPOS Preparatory Phase, national research infrastructures were integrated at the pan-European level in order to create the EPOS distributed research infrastructure; at present, Romania participates in this structure by means of the earth science research infrastructures of national interest declared on the National Roadmap. The mission of EPOS is to build an efficient and comprehensive multidisciplinary research platform for solid Earth sciences in Europe and to allow the scientific community to study the same phenomena from different points of view, in different time periods and at different spatial scales (laboratory and field experiments). At the national scale, research and monitoring infrastructures have gathered a vast amount of geological and geophysical data, which have been used by research networks to underpin our understanding of the Earth. EPOS promotes the creation of comprehensive national and regional consortia, as well as the organization of collective actions. To serve the EPOS goals, a group of Romanian national research institutes, together with their infrastructures, gathered in an EPOS National Consortium, as follows: 1. National Institute for Earth Physics - seismic, strong motion, GPS and geomagnetic networks and experimental laboratory; 2. National Institute of Marine Geology and Geoecology - marine research infrastructure and the Euxinus integrated regional Black Sea observation and early-warning system; 3. Geological Institute of Romania - Surlari National Geomagnetic Observatory and national lithoteque (the latter as part of the National Museum of Geology); 4. University of Bucharest - Paleomagnetic Laboratory. After national dissemination of the EPOS initiative, other research institutes and companies from the potential

  18. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand where HTA has been used officially and Japan where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Databases' characteristics e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables were described. Databases were assessed for its potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. Request route for each database was also provided. Forty databases- 20 from Thailand and 20 from Japan-were included. These comprised of national censuses, surveys, registries, administrative data, and claimed databases. All databases were potentially used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available on public sources. Our findings have shown that existing databases provided valuable information for HTA research with limitation on accessibility. Mutual dialogue on healthcare database development and usage for HTA among Asia-Pacific region is needed.

  19. THE KNOWLEDGE MANAGEMENT FOR BEST PRACTICES SHARING IN A DATABASE AT THE TRIBUNAL REGIONAL FEDERAL DA PRIMEIRA REGIÃO

    Márcia Mazo Santos de Miranda

    2010-08-01

    Full Text Available A quick, effective and powerful alternative for knowledge management is the systematic sharing of best practices. This study identified in the literature recommendations for structuring a best practices database and summarized the benefits of its deployment for the Tribunal Regional Federal da Primeira Região (TRF - 1ª Região). A quantitative study was then carried out, in which questionnaires were distributed to federal judges of the TRF - 1ª Região; the questionnaire was divided into 4 parts: magistrate profile, flow of knowledge/information, internal environment, and organizational facilitators. As a result, we identified the need for a best practices database in the institution for the identification, transfer and sharing of organizational knowledge. The conclusion presents recommendations for the development of the database and highlights its importance for knowledge management in an organization.

  20. Design and Performance of a Xenobiotic Metabolism Database Manager for Building Metabolic Pathway Databases

    A major challenge for scientists and regulators is accounting for the metabolic activation of chemicals that may lead to increased toxicity. Reliable forecasting of chemical metabolism is a critical factor in estimating a chemical’s toxic potential. Research is underway to develo...

  1. NaKnowBaseTM: The EPA Nanomaterials Research Database

    The ability to predict the environmental and health implications of engineered nanomaterials is an important research priority due to the exponential rate at which nanotechnology is being incorporated into consumer, industrial and biomedical applications. To address this need and...

  2. Design and implementation of component reliability database management system for NPP

    Kim, S. H.; Jung, J. K.; Choi, S. Y.; Lee, Y. H.; Han, S. H.

    1999-01-01

    KAERI is constructing a component reliability database for Korean nuclear power plants. This paper describes the development of the data management tool that operates on the component reliability database. The tool runs in an intranet environment and is used to analyze failure modes and failure severities in order to compute component failure rates. Additional modules are now being developed to manage operation histories and test histories, together with algorithms for calculating component failure history and reliability.
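
    The central calculation such a database supports, estimating a component failure rate from recorded failures and accumulated operating time, can be sketched very simply; the component names, failure counts and exposure times below are invented.

# Point estimate of a constant failure rate: lambda = failures / operating hours.
# Component names, failure counts and exposure times are invented.
records = {
    "motor-operated valve": {"failures": 3, "hours": 1.2e5},
    "emergency diesel pump": {"failures": 1, "hours": 4.5e4},
}

for comp, r in records.items():
    rate = r["failures"] / r["hours"]
    print(f"{comp}: lambda = {rate:.2e} per hour "
          f"({r['failures']} failures in {r['hours']:.0f} h)")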

  3. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    Wolery, T W; Sutton, M

    2011-09-19

    , meaning that they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).
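
    One routine calculation mentioned above, quantifying deviation from equilibrium as a saturation index, follows the standard relation SI = log10(IAP/Ksp). The sketch below applies it to a calcite-like solid with illustrative ion activities and an assumed log Ksp; the numbers are not drawn from the YMP or NAGRA/PSI databases.

import math

def saturation_index(ion_activities, stoichiometry, log_ksp):
    """SI = log10(IAP) - log10(Ksp); SI < 0 undersaturated, SI > 0 supersaturated."""
    log_iap = sum(nu * math.log10(ion_activities[ion])
                  for ion, nu in stoichiometry.items())
    return log_iap - log_ksp

# Illustrative values for a calcite-like solid: CaCO3 <-> Ca2+ + CO3 2-
activities = {"Ca++": 1.2e-3, "CO3--": 4.0e-6}
si = saturation_index(activities, {"Ca++": 1, "CO3--": 1}, log_ksp=-8.48)
print(f"SI = {si:.2f}")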

  5. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  6. Land, Oil Spill, and Waste Management Research Publications in the Science Inventory

    Resources from the Science Inventory database of EPA's Office of Research and Development, as well as EPA's Science Matters journal, include research on managing contaminated sites and ground water modeling and decontamination technologies.

  7. DANBIO-powerful research database and electronic patient record

    Hetland, Merete Lund

    2011-01-01

    This paper gives an overview of the research outcome and presents the cohorts of RA patients. The registry, which is approved as a national quality registry, includes patients with RA, PsA and AS, who are followed longitudinally. Data are captured electronically from the source (patients and health personnel). The IT platform is based on open-source software. Via a unique personal identification code, linkage with various national registers is possible for research purposes. Since the year 2000, more than 10,000 patients have been included. The main focus of research has been on treatment efficacy and drug survival. Compared with RA patients, who were on conventional treatment with DMARDs, the patients who started biological treatment were younger, had longer disease duration, higher disease activity, tried more DMARDs and received more prednisolone. Also, more patients on biological therapy were seropositive and had erosive...

  8. Computer application for database management and networking of service radio physics

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-01-01

    Databases used in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the centre's computer network. A computer that acts as the server provides the database to the treatment units so that daily quality control measurements and incidents can be recorded. To eliminate common problems such as shortcuts that stop working after data migration, the possible use of duplicate data, and data loss caused by errors in network connections, we proceeded to manage connections and database access centrally, easing maintenance and use for all service personnel.

  9. Role of Database Management Systems in Selected Engineering Institutions of Andhra Pradesh: An Analytical Survey

    Kutty Kumar

    2016-06-01

    Full Text Available This paper aims to analyze the function of database management systems from the perspective of librarians working in engineering institutions in Andhra Pradesh. Ninety-eight librarians from one hundred thirty engineering institutions participated in the study. The paper reveals that training by computer suppliers and software packages are the significant mode of acquiring DBMS skills by librarians; three-fourths of the librarians are postgraduate degree holders. Most colleges use database applications for automation purposes and content value. Electrical problems and untrained staff seem to be major constraints faced by respondents for managing library databases.

  10. Development of a Framework for Multimodal Research: Creation of a Bibliographic Database

    Coovert, Michael D; Gray, Ashley A; Elliott, Linda R; Redden, Elizabeth S

    2007-01-01

    .... The results of the overall effort, the multimodal framework and article tracking sheet, bibliographic database, and searchable multimodal database make substantial and valuable contributions to the accumulation and interpretation of multimodal research. References collected in this effort are listed in the appendix.

  11. A database to manage flood risk in Catalonia

    Echeverria, S.; Toldrà, R.; Verdaguer, I.

    2009-09-01

    We call priority action spots those local sites where heavy rain, increased river flow, sea storms and other flooding phenomena can cause human casualties or severe damage to property. Some examples are campsites, car parks, roads, chemical factories… In order to keep the risk at these spots to a minimum, both a prevention programme and an emergency response programme are required. The flood emergency plan of Catalonia (INUNCAT) prepared in 2005 already included a listing of priority action spots compiled by the Catalan Water Agency (ACA), which was elaborated taking into account past experience, hydraulic studies and information available from several knowledgeable sources. However, since land use evolves with time, this listing of priority action spots has become outdated and incomplete. A new database is being built. Not only does this new database update and expand the previous listing, it also adds to each entry information regarding prevention measures and emergency response: which spots are the most hazardous, under which weather conditions problems arise, which ones should have their access closed as soon as these conditions are forecast or actually occur, which ones should be evacuated, who is in charge of the preventive actions or emergency response, and so on. Carrying out this programme has to be done with the help and collaboration of all the organizations involved, foremost the local authorities in the areas at risk. In order to achieve this goal, a suitable geographical information system is necessary which can be easily used by all actors involved in this project. The best option has turned out to be the Spatial Data Infrastructure of Catalonia (IDEC), a platform for sharing spatial data on the Internet involving the Generalitat de Catalunya, Localret (a consortium of local authorities that promotes information technology) and other institutions.
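
    A minimal sketch of how such a listing of priority action spots might be queried when a forecast arrives is shown below; the spot names, trigger thresholds and actions are invented for illustration and are not taken from the INUNCAT database.

# Hypothetical priority action spots with the 24 h rainfall threshold (mm)
# above which a preventive action should be triggered.
SPOTS = [
    {"name": "Riverside campsite", "action": "evacuate", "threshold_mm": 60,
     "responsible": "local council"},
    {"name": "Underpass car park", "action": "close access", "threshold_mm": 40,
     "responsible": "municipal police"},
    {"name": "Chemical plant yard", "action": "alert operator", "threshold_mm": 80,
     "responsible": "plant safety officer"},
]

def actions_for_forecast(forecast_mm):
    """Return the preventive actions triggered by a 24 h rainfall forecast."""
    return [(s["name"], s["action"], s["responsible"])
            for s in SPOTS if forecast_mm >= s["threshold_mm"]]

for name, action, who in actions_for_forecast(forecast_mm=65):
    print(f"{name}: {action} (responsible: {who})")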

  12. USING THE INTERNATIONAL SCIENTOMETRIC DATABASES OF OPEN ACCESS IN SCIENTIFIC RESEARCH

    O. Galchevska

    2015-05-01

    Full Text Available In the article, the use of international scientometric databases in research activities, as web-oriented resources and services that serve as means of publishing and disseminating research results, is considered. Selection criteria for open-access scientometric platforms for conducting scientific research (coverage of Ukrainian scientific periodicals and publications, data accuracy, general characteristics of the international scientometric database, technical and functional characteristics, and their indexes) are emphasized. The most popular open-access scientometric databases, Google Scholar, the Russian Scientific Citation Index (RSCI), Scholarometer, Index Copernicus (IC) and Microsoft Academic Search, are reviewed. The advantages of using the international scientometric database Google Scholar in conducting scientific research are determined, and prospects for further research lying in the separation of the cloud-based information and analytical services of the system are outlined.

  13. A relational database for personnel radiation exposure management

    David, W.; Miller, P.D.

    1993-01-01

    In-house utility personnel developed a relational database for a personnel radiation exposure management computer system over a 2 1/2 year period. The Personnel Radiation Exposure Management (PREM) System was designed to meet current Nuclear Regulatory Commission (NRC) requirements related to radiological access control, Radiation Work Permit (RWP) management, automated personnel dosimetry reporting, ALARA planning and repetitive job history dose archiving. The system has been operational for the past 18 months, a period which includes a full refueling outage at Clinton Power Station. The Radiation Protection Department designed PREM to establish a software platform for implementing future revisions to 10CFR20 in 1993. Worker acceptance of the system has been excellent. Regulatory officials have given the system high marks as a radiological tool because of its ability to track an entire job from start to finish.
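
    The kind of bookkeeping described above, accumulating dose per worker and per RWP and checking it against an administrative limit, can be sketched as follows; the limit, identifiers and dose figures are hypothetical and are not taken from PREM.

from collections import defaultdict

ADMIN_LIMIT_MREM = 2000   # hypothetical annual administrative limit

# Hypothetical dosimetry entries: (worker_id, rwp_number, dose_mrem)
entries = [
    ("W-101", "RWP-93-014", 35.0),
    ("W-101", "RWP-93-022", 12.5),
    ("W-207", "RWP-93-014", 48.0),
]

dose_by_worker = defaultdict(float)
dose_by_rwp = defaultdict(float)
for worker, rwp, dose in entries:
    dose_by_worker[worker] += dose
    dose_by_rwp[rwp] += dose

for worker, total in dose_by_worker.items():
    margin = ADMIN_LIMIT_MREM - total
    print(f"{worker}: {total:.1f} mrem accumulated, {margin:.1f} mrem to limit")
print("dose charged to each RWP:", dict(dose_by_rwp))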

  14. Development of a marketing strategy for the Coal Research Establishment's emissions monitoring database

    Beer, A.D.; Hughes, I.S.C. [British Coal Corporation, Stoke Orchard (United Kingdom). Coal Research Establishment]

    1995-06-01

    A summary is presented of the results of work conducted by the UK's Coal Research Establishment (CRE) between April 1994 and December 1994 following the completion of a project on the utilisation and publication of an emissions monitoring database. The database contains emissions data for most UK combustion plant, gathered over the past 10 years. The aim of this further work was to identify the strengths and weaknesses of CRE's database, to investigate potential additional sources of data, and to develop a strategy for marketing the information contained within the database to interested parties. 3 figs.

  15. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also...... has to be optimized. Therefore, we will in this paper use so called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily...... is inconsistent. It is also important that disconnected locations can operate in a meaningful way in socalled disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has...
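
    One common building block for such relaxed ACID designs is to make updates idempotent, so that a temporarily disconnected location can safely re-send them once connectivity returns without the update being applied twice. The sketch below illustrates that idea with an in-memory ledger; it is a generic illustration, not the specific countermeasures defined by the authors.

# Idempotent replay: each update carries a unique id, so re-sending it after a
# disconnection cannot change the account balance twice (generic illustration).
applied_ids = set()
balance = {"account_1": 100.0}

def apply_update(update_id, account, amount):
    """Apply an update once; replays with the same id are ignored."""
    if update_id in applied_ids:
        return False
    balance[account] += amount
    applied_ids.add(update_id)
    return True

apply_update("u-42", "account_1", -30.0)   # first delivery
apply_update("u-42", "account_1", -30.0)   # replayed after reconnect: ignored
print(balance)                             # {'account_1': 70.0}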

  16. Databases in the documentation management for big industrial projects

    Cauchet, A.; Chevillard, F.; Parisot, Y.; Tirefort, C.

    1990-05-01

    The documentation management of a big industrial project involves the continuous updating of information, both in the study and realization phases and in the operation phase. The organization of the technical documentation for big industrial projects requires complex information systems. The first part of this paper presents methods suitable for the analysis of documentation management procedures; the second part presents the tools whose combination provides a documentation system for the user. The case of the documentation centres for the La Hague reprocessing plant is described.

  17. Operations management research methodologies using quantitative modeling

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  18. Data management and database structure at the ARS Culture Collection

    The organization and management of collection data for the 96,000 strains held in the ARS Culture Collection has been an ongoing process. Originally, the records for the four separate collections were maintained by individual curators in notebooks and/or card files and subsequently on the National C...

  19. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The root of the question is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers in regard to data organization are here shown through its application on El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist to conduct volcanic risk assessment and management.

  20. A Relational Database of WHO Mortality Data Prepared to Facilitate Global Mortality Research

    Albert de Roos

    2015-09-01

    Full Text Available Detailed world mortality data, such as those collected by the World Health Organization, give a wealth of information about causes of death worldwide over a time span of 60 years. However, the raw mortality data in text format as provided by the WHO are not directly suitable for systematic research and data mining. In this Data Paper, a relational database is presented that is created from the raw WHO mortality data set and includes mortality rates, an ICD code table and country reference data. This enriched database, as a corpus of global mortality data, can be readily imported into relational databases but can also function as the data source for other types of databases. The use of this database can therefore greatly facilitate global epidemiological research that may provide new clues to genetic or environmental factors in the origins of diseases.
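
    Turning raw text files into a relational structure amounts to loading the mortality rows and the ICD code list into tables that can be joined. The sketch below shows the shape of such an import with invented column names and sample values; it does not reproduce the actual WHO file layout.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE icd_code (code TEXT PRIMARY KEY, title TEXT)")
con.execute("""CREATE TABLE mortality (
    country TEXT, year INTEGER, sex TEXT, code TEXT, deaths INTEGER)""")

# Invented sample rows standing in for parsed lines of the raw mortality files.
con.executemany("INSERT INTO icd_code VALUES (?,?)",
                [("I21", "Acute myocardial infarction")])
con.executemany("INSERT INTO mortality VALUES (?,?,?,?,?)",
                [("NLD", 2010, "M", "I21", 2900),
                 ("NLD", 2010, "F", "I21", 2100)])

# A typical research query: deaths by cause title, country and year.
for row in con.execute(
        """SELECT m.country, m.year, c.title, SUM(m.deaths)
           FROM mortality m JOIN icd_code c ON c.code = m.code
           GROUP BY m.country, m.year, c.title"""):
    print(row)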

  1. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as blobs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these blobs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available
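
    Both storage strategies compared above start from the same step: declustering a large raster into fixed-size tiles that can be spread across nodes, whether as BLOB rows or as HDFS files. The sketch below shows only that partitioning step, with an invented tile size and key scheme; it uses neither a DBMS nor Hadoop.

# Partition a (toy) raster into fixed-size tiles and give each tile a key that
# could name a BLOB row or an HDFS file. Tile size and key format are invented.
TILE = 256

def tile_keys(dataset, width, height, tile=TILE):
    """Yield (key, x0, y0, x1, y1) for every tile covering the raster."""
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            key = f"{dataset}/tile_{tx // tile:04d}_{ty // tile:04d}"
            yield key, tx, ty, min(tx + tile, width), min(ty + tile, height)

keys = list(tile_keys("lidar_dem_v1", width=1500, height=900))
print(len(keys), "tiles")   # 6 x 4 = 24 tiles
print(keys[0])              # ('lidar_dem_v1/tile_0000_0000', 0, 0, 256, 256)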

  2. Noise data management using commercially available data-base software

    Damiano, B.; Thie, J.A.

    1988-01-01

    A data base has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The data base was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, use of the data base allows the automatically collected data to be utilized more fully and effectively. The FFTF noise data base uses the Oracle Relational Data Base Management System implemented on a desktop personal computer

  3. A meta-database comparison from various European Research and Monitoring Networks dedicated to forest sites

    Danielewska, A.; Clarke, N.; Olejnik, Janusz; Hansen, K.; de Vries, W.; Lundin, L.; Tuovinen, J-P.; Fischer, R.; Urbaniak, M.; Paoletti, E.

    2013-01-01

    Vol. 6, Jan 2013, pp. 1-9. ISSN 1971-7458. Keywords: Research and Monitoring Network; Meta-database; Forest; Monitoring. Subject: Ecology, Behaviour. Impact factor: 1.150 (2013)

  4. SeedStor: A Germplasm Information Management System and Public Database.

    Horler, R S P; Turner, A S; Fretter, P; Ambrose, M

    2018-01-01

    SeedStor (https://www.seedstor.ac.uk) acts as the publicly available database for the seed collections held by the Germplasm Resources Unit (GRU) based at the John Innes Centre, Norwich, UK. The GRU is a national capability supported by the Biotechnology and Biological Sciences Research Council (BBSRC). The GRU curates germplasm collections of a range of temperate cereal, legume and Brassica crops and their associated wild relatives, as well as precise genetic stocks, near-isogenic lines and mapping populations. With >35,000 accessions, the GRU forms part of the UK's plant conservation contribution to the Multilateral System (MLS) of the International Treaty for Plant Genetic Resources for Food and Agriculture (ITPGRFA) for wheat, barley, oat and pea. SeedStor is a fully searchable system that allows our various collections to be browsed species by species through to complicated multipart phenotype criteria-driven queries. The results from these searches can be downloaded for later analysis or used to order germplasm via our shopping cart. The user community for SeedStor is the plant science research community, plant breeders, specialist growers, hobby farmers and amateur gardeners, and educationalists. Furthermore, SeedStor is much more than a database; it has been developed to act internally as a Germplasm Information Management System that allows team members to track and process germplasm requests, determine regeneration priorities, handle cost recovery and Material Transfer Agreement paperwork, manage the Seed Store holdings and easily report on a wide range of the aforementioned tasks. © The Author(s) 2017. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  6. Security and health research databases: the stakeholders and questions to be addressed.

    Stewart, Sara

    2006-01-01

    Health research database security issues abound. Issues include subject confidentiality, data ownership, data integrity and data accessibility. There are also various stakeholders in database security. Each of these stakeholders has a different set of concerns and responsibilities when dealing with security issues. There is an obvious need for training in security issues, so that these issues may be addressed and health research will move on without added obstacles based on misunderstanding security methods and technologies.

  7. Standardizing terminology and definitions of medication adherence and persistence in research employing electronic databases.

    Raebel, Marsha A; Schmittdiel, Julie; Karter, Andrew J; Konieczny, Jennifer L; Steiner, John F

    2013-08-01

    To propose a unifying set of definitions for prescription adherence research utilizing electronic health record prescribing databases, prescription dispensing databases, and pharmacy claims databases and to provide a conceptual framework to operationalize these definitions consistently across studies. We reviewed recent literature to identify definitions in electronic database studies of prescription-filling patterns for chronic oral medications. We then develop a conceptual model and propose standardized terminology and definitions to describe prescription-filling behavior from electronic databases. The conceptual model we propose defines 2 separate constructs: medication adherence and persistence. We define primary and secondary adherence as distinct subtypes of adherence. Metrics for estimating secondary adherence are discussed and critiqued, including a newer metric (New Prescription Medication Gap measure) that enables estimation of both primary and secondary adherence. Terminology currently used in prescription adherence research employing electronic databases lacks consistency. We propose a clear, consistent, broadly applicable conceptual model and terminology for such studies. The model and definitions facilitate research utilizing electronic medication prescribing, dispensing, and/or claims databases and encompasses the entire continuum of prescription-filling behavior. Employing conceptually clear and consistent terminology to define medication adherence and persistence will facilitate future comparative effectiveness research and meta-analytic studies that utilize electronic prescription and dispensing records.
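
    As a concrete illustration of a secondary adherence metric of the kind discussed here, the sketch below computes the proportion of days covered (PDC) from dispensing records over an observation window. The dispensing dates and days' supply are invented, and this is the standard PDC calculation, not the New Prescription Medication Gap measure proposed by the authors.

from datetime import date, timedelta

def pdc(dispensings, window_start, window_end):
    """Proportion of days covered: fraction of days in the window on which the
    patient had medication on hand, based on fill date and days' supply."""
    covered = set()
    for fill_date, days_supply in dispensings:
        for d in range(days_supply):
            day = fill_date + timedelta(days=d)
            if window_start <= day <= window_end:
                covered.add(day)
    window_days = (window_end - window_start).days + 1
    return len(covered) / window_days

# Invented example: three 30-day fills over a 120-day observation window.
fills = [(date(2012, 1, 1), 30), (date(2012, 2, 5), 30), (date(2012, 3, 20), 30)]
print(f"PDC = {pdc(fills, date(2012, 1, 1), date(2012, 4, 29)):.2f}")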

  8. Use of an INGRES database to implement the beam parameter management at GANIL

    Gillette, P.; Lecorche, E.; Lermine, P.; Maugeais, C.; Leboucher, Ch.; Moscatello, M.H.; Pain, P.

    1995-12-31

    Since the beginning of operation under the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been used more and more widely. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned. It has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including how the physical aspects of the Ganil tuning have been integrated into such an environment. (author). 2 refs.

  9. Use of an INGRES database to implement the beam parameter management at GANIL

    Gillette, P; Lecorche, E; Lermine, P; Maugeais, C; Leboucher, Ch; Moscatello, M H; Pain, P

    1996-12-31

    Since the beginning of operation under the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been used more and more widely. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned. It has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including how the physical aspects of the Ganil tuning have been integrated into such an environment. (author). 2 refs.

  10. Obstetrical ultrasound data-base management system by using personal computer

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program that performs obstetric calculations, written in the Clipper language and using data from ultrasonography, was developed for the personal computer. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance and proved very useful in patient management, offering convenient data filing, easy retrieval of previous reports, prompt yet accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women

  11. Use of an INGRES database to implement the beam parameter management at GANIL

    Gillette, P.; Lecorche, E.; Lermine, P.; Maugeais, C.; Leboucher, Ch.; Moscatello, M.H.; Pain, P.

    1995-01-01

    Since the beginning of operation under the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been used more and more widely. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned. It has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including how the physical aspects of the Ganil tuning have been integrated into such an environment. (author)

  12. Nuclear plant operations, maintenance, and configuration management using three-dimensional computer graphics and databases

    Tutos, N.C.; Reinschmidt, K.F.

    1987-01-01

    Stone and Webster Engineering Corporation has developed the Plant Digital Model concept as a new approach to Configuration Management of nuclear power plants. The Plant Digital Model development is a step-by-step process, based on existing manual procedures and computer applications, and is fully controllable by the plant managers and engineers. The Plant Digital Model is based on IBM computer graphics and relational database management systems, and therefore can be easily integrated with existing plant databases and corporate management-information systems

  13. Research in Hospitality Management: Editorial Policies

    Research in Hospitality Management is a peer-reviewed journal publishing papers that ... financial management, marketing, strategic management, economics, ... Articles covering social theory and the history and politics of the hospitality ...

  14. Ornamental rocks prospection in Uruguay: a new database for territorial management

    Carmignani, L.; Gattiglio, S.; Masquelin, H.; Gomez Rifas, C.; Medina, E.; Da Silva, J.; Pirelli, H.

    1998-01-01

    This paper presents the main results of the latest ornamental rock inventory and the environmental implications of its exploitation. The project was carried out jointly by the Uruguayan government (Ministry of Industry, Energy and Mining) and the European Economic Community (EEC). A two-fold objective was pursued. The first, of administrative order, is that the results of such a survey allow the management of ornamental rocks to be reviewed in a more efficient and realistic manner. The second, of geo-economic order, permits the re-evaluation of traditional ornamental rock resources (marbles and granites) from a marketing standpoint, as well as the valuation of a new generation of ornamental materials. (author)

  15. Planning the future of JPL's management and administrative support systems around an integrated database

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential for providing the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, evolving around the development of an integrated management and administrative database, are discussed.

  16. Workshop presentation: research guidelines for Construction Management

    Marco Alvise Bragadin

    2013-10-01

    Full Text Available Nowadays the European economic system challenges the construction sector to take part in the industrial recovery of western countries. In co-operation with the Construction Production research group of Tampere University, guidelines for research about construction management tools and methods were identified. Research guidelines: 1 construction management: tools and methods to manage construction projects; 2 environmental impact of construction projects; 3 construction management and safety; 4 project procurement; 5 construction management for major public works & complex projects

  17. The Database Management Module of the Splice System.

    1983-06-01

    standardization is the only wise choice. E. FUNCTIONS OF THE DATABASE MANAGEMENT MODULE. As a result of ongoing research into the implementation of SPLICE, ... offset by one or more orders of magnitude improvement in the execution time of user transactions. Furthermore, this storage requirement

  18. African Journal of Management Research: Editorial Policies

    Topics and themes appropriate for African Journal of Management Research will ... African Journal of Management Research maintains a 2-3 month turnaround time from submission to decision. ... Emeritus Professor, Goldsmiths College, UK.

  19. African Journal of Management Research: Journal Sponsorship

  20. Links between Conflict Management Research and Practice

    Roloff, Michael E.

    2009-01-01

    This paper explicates the implications of my research on conflict management for self improvement and for practitioners who work to improve the conflict management of others. I also note how my experiences with practitioners have informed my research.

  1. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a Mapserver interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive semantically-enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically-enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.
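
    As an illustration of what the SPARQL port described above makes possible, the following Python sketch issues a standard SPARQL-protocol query over HTTP using the requests library. The endpoint URL and the Dublin Core property used in the query are assumptions made for the example, not details of the actual BCO-DMO deployment.

    ```python
    import requests

    # Hypothetical SPARQL endpoint exposed by the CMS; URL and vocabulary are illustrative only.
    ENDPOINT = "https://example.org/sparql"

    QUERY = """
    PREFIX dc: <http://purl.org/dc/elements/1.1/>
    SELECT ?dataset ?title WHERE {
      ?dataset dc:title ?title .
      FILTER regex(?title, "chlorophyll", "i")
    }
    LIMIT 10
    """

    resp = requests.get(ENDPOINT, params={"query": QUERY},
                        headers={"Accept": "application/sparql-results+json"})
    resp.raise_for_status()
    for row in resp.json()["results"]["bindings"]:
        print(row["dataset"]["value"], "-", row["title"]["value"])
    ```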

  2. Research Supervision: The Research Management Matrix

    Maxwell, T. W.; Smyth, Robyn

    2010-01-01

    We briefly make a case for re-conceptualising research project supervision/advising as the consideration of three inter-related areas: the learning and teaching process; developing the student; and producing the research project/outcome as a social practice. We use this as our theoretical base for an heuristic tool, "the research management…

  3. Comparison of scientific and administrative database management systems

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to be different for scientific and administrative data bases are identified and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are especially stringent for either the scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  4. Modelling a critical infrastructure-driven spatial database for proactive disaster management: A developing country context

    David O. Baloye

    2016-04-01

    Full Text Available The understanding and institutionalisation of the seamless link between urban critical infrastructure and disaster management has greatly helped the developed world to establish effective disaster management processes. However, this link is conspicuously missing in developing countries, where disaster management has been more reactive than proactive. The consequence of this is typified in poor response time and uncoordinated ways in which disasters and emergency situations are handled. As is the case with many Nigerian cities, the challenges of urban development in the city of Abeokuta have limited the effectiveness of disaster and emergency first responders and managers. Using geospatial techniques, the study attempted to design and deploy a spatial database running a web-based information system to track the characteristics and distribution of critical infrastructure for effective use during disaster and emergencies, with the purpose of proactively improving disaster and emergency management processes in Abeokuta. Keywords: Disaster Management; Emergency; Critical Infrastructure; Geospatial Database; Developing Countries; Nigeria
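
    The kind of proximity question such a critical-infrastructure database is meant to answer can be sketched in a few lines of Python. The facility names and coordinates below are purely illustrative, and a production system would normally push this query into the spatial database itself rather than into application code.

    ```python
    from math import asin, cos, radians, sin, sqrt

    # Toy critical-infrastructure records: (name, latitude, longitude); values are illustrative.
    facilities = [
        ("Fire station A", 7.1557, 3.3451),
        ("General hospital", 7.1619, 3.3480),
        ("Police headquarters", 7.1450, 3.3572),
    ]

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    def nearest_facility(lat, lon):
        """Return the facility closest to a reported incident location."""
        return min(facilities, key=lambda f: haversine_km(lat, lon, f[1], f[2]))

    print(nearest_facility(7.1500, 3.3500))
    ```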

  5. Creating a data exchange strategy for radiotherapy research: Towards federated databases and anonymised public datasets

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Büttner, Daniel; Debus, Jürgen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe

    2014-01-01

    Disconnected cancer research data management and lack of information exchange about planned and ongoing research are complicating the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research in radiation therapy and oncology. The exchange of study data is one of the fundamental principles behind data aggregation and data mining. The possibilities of reproducing the original study results, performing further analyses on existing research data to generate new hypotheses or developing computational models to support medical decisions (e.g. risk/benefit analysis of treatment options) represent just a fraction of the potential benefits of medical data-pooling. Distributed machine learning and knowledge exchange from federated databases can be considered as one beyond other attractive approaches for knowledge generation within “Big Data”. Data interoperability between research institutions should be the major concern behind a wider collaboration. Information captured in electronic patient records (EPRs) and study case report forms (eCRFs), linked together with medical imaging and treatment planning data, are deemed to be fundamental elements for large multi-centre studies in the field of radiation therapy and oncology. To fully utilise the captured medical information, the study data have to be more than just an electronic version of a traditional (un-modifiable) paper CRF. Challenges that have to be addressed are data interoperability, utilisation of standards, data quality and privacy concerns, data ownership, rights to publish, data pooling architecture and storage. This paper discusses a framework for conceptual packages of ideas focused on a strategic development for international research data exchange in the field of radiation therapy and oncology

  6. Creating a data exchange strategy for radiotherapy research: towards federated databases and anonymised public datasets.

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Büttner, Daniel; Debus, Jürgen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe; Krause, Mechthild; Lambin, Philippe; Langendijk, Johannes A; Lewensohn, Rolf; Lühr, Armin; Maingon, Philippe; Masucci, Michele; Niyazi, Maximilian; Poortmans, Philip; Simon, Monique; Schmidberger, Heinz; Spezi, Emiliano; Stuschke, Martin; Valentini, Vincenzo; Verheij, Marcel; Whitfield, Gillian; Zackrisson, Björn; Zips, Daniel; Baumann, Michael

    2014-12-01

    Disconnected cancer research data management and lack of information exchange about planned and ongoing research are complicating the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research in radiation therapy and oncology. The exchange of study data is one of the fundamental principles behind data aggregation and data mining. The possibilities of reproducing the original study results, performing further analyses on existing research data to generate new hypotheses or developing computational models to support medical decisions (e.g. risk/benefit analysis of treatment options) represent just a fraction of the potential benefits of medical data-pooling. Distributed machine learning and knowledge exchange from federated databases can be considered as one beyond other attractive approaches for knowledge generation within "Big Data". Data interoperability between research institutions should be the major concern behind a wider collaboration. Information captured in electronic patient records (EPRs) and study case report forms (eCRFs), linked together with medical imaging and treatment planning data, are deemed to be fundamental elements for large multi-centre studies in the field of radiation therapy and oncology. To fully utilise the captured medical information, the study data have to be more than just an electronic version of a traditional (un-modifiable) paper CRF. Challenges that have to be addressed are data interoperability, utilisation of standards, data quality and privacy concerns, data ownership, rights to publish, data pooling architecture and storage. This paper discusses a framework for conceptual packages of ideas focused on a strategic development for international research data exchange in the field of radiation therapy and oncology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. A research on the enhancement of research management efficiency for the division of research, Korea cancer center hospital

    Lee, S. W.; Ma, K. H.; Kim, J. R.; Lee, D. C.; Lee, J. H.

    1999-06-01

    The research activities of Korea Cancer Center Hospital have increased over the past few years in proportion to the increase in the research budget, but the assisting manpower of the office of research management has not been increased, and the indications are that internal and external circumstances will not allow recruitment for a fairly long time. It has therefore become inevitable to enhance the work efficiency of the office by analyzing the administrative research assistance system, identifying problems and inefficiency factors, and suggesting possible answers to them. The office of research management and international cooperation conducted this research to suggest possible ways to facilitate administrative support for the research activities of Korea Cancer Center Hospital. By analyzing the changes in the research budget, the organization of the division of research and administrative support, manpower, and the administrative research support systems of other institutes, we suggested possible ways to enhance the work efficiency of administrative research support and developed a related database program. The research report will serve as data for the organization of the research support division when the Radiation Medicine Research Center is established. The database program has already been used for research budget management

  8. Performance management for academic researchers

    Jacobsen, Christian Bøtcher; Andersen, Lotte Bøgh

    2014-01-01

    The ability to design and implement performance management systems that motivate employees to high performance has become pivotal for many public leaders. Many public organizations use command systems which are based on the threat of sanctions, but our knowledge on the effects of such systems is very limited, because studies have focused on rewards instead. This article investigates how publication command systems (and especially the perception of them) affect individual researchers’ productivity. The typical publication command system consists of rules concerning the minimum number of journal articles required from each researcher and procedures for monitoring and sanctioning. Principal Agent Theory expects command systems to induce agents to work harder and perform better, whereas motivation crowding theory claims that the agents’ perception of the command system is the important factor...

  9. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    Ortega-Binderberger, Michael

    2002-01-01

    ... as a critical area of research. This thesis explores how to enhance database systems with content based search over arbitrary abstract data types in a similarity based framework with query refinement...

  10. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  11. Knowledge base technology for CT-DIMS: Report 1. [CT-DIMS (Cutting Tool - Database and Information Management System)

    Kelley, E.E.

    1993-05-01

    This report discusses progress on the Cutting Tool-Database and Information Management System (CT-DIMS) project being conducted by the University of Illinois Urbana-Champaign (UIUC) under contract to the Department of Energy. This project was initiated in October 1991 by UIUC. The Knowledge-Based Engineering Systems Research Laboratory (KBESRL) at UIUC is developing knowledge base technology and prototype software for the presentation and manipulation of the cutting tool databases at Allied-Signal Inc., Kansas City Division (KCD). The graphical tool selection capability being developed for CT-DIMS in the Intelligent Design Environment for Engineering Automation (IDEEA) will provide a concurrent environment for simultaneous access to tool databases, tool standard libraries, and cutting tool knowledge.

  12. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
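
    The text-in/text-out contract described above, with SQL sent as ASCII and results returned as ASCII, can be sketched independently of MUMPS or SYBASE. The Python example below uses the standard-library sqlite3 module as a stand-in for the external RDBMS; it illustrates the pattern only and is not the M/VX interface itself.

    ```python
    import sqlite3

    # Stand-in for the external RDBMS: the original interface sent SQL as ASCII text
    # from MUMPS to SYBASE and read ASCII results back; sqlite3 plays that role here.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO patient (name) VALUES ('DOE,JOHN')")

    def run_sql_text(sql_text):
        """Execute an SQL statement received as plain text and return the result
        set as lines of tab-delimited ASCII, mimicking a text-in/text-out contract."""
        rows = conn.execute(sql_text).fetchall()
        return "\n".join("\t".join(str(col) for col in row) for row in rows)

    print(run_sql_text("SELECT id, name FROM patient"))
    ```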

  13. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
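
    A minimal version of this kind of load-and-query comparison is sketched below in Python, with the standard-library sqlite3 module standing in for the relational side and MongoDB, via the pymongo driver, for the document side. It assumes a local mongod instance, uses synthetic variant records rather than dbSNP, and is not the benchmark harness used in the study.

    ```python
    import sqlite3
    import time

    from pymongo import MongoClient  # assumes pymongo is installed and mongod runs locally

    # Synthetic variant records standing in for dbSNP annotations.
    records = [{"rsid": f"rs{i}", "chrom": "1", "pos": i, "gene": "GENE1"} for i in range(50_000)]

    # Relational side (sqlite3 as a lightweight stand-in for MySQL/PostgreSQL).
    rdb = sqlite3.connect(":memory:")
    rdb.execute("CREATE TABLE snp (rsid TEXT, chrom TEXT, pos INTEGER, gene TEXT)")
    t0 = time.perf_counter()
    rdb.executemany("INSERT INTO snp VALUES (?, ?, ?, ?)",
                    [(r["rsid"], r["chrom"], r["pos"], r["gene"]) for r in records])
    rdb.execute("CREATE INDEX idx_rsid ON snp (rsid)")
    print(f"relational load+index: {time.perf_counter() - t0:.2f}s")

    # Document side (MongoDB).
    mdb = MongoClient("localhost", 27017)["bench"]
    mdb.snp.drop()
    t0 = time.perf_counter()
    mdb.snp.insert_many(records)
    mdb.snp.create_index("rsid")
    print(f"document load+index: {time.perf_counter() - t0:.2f}s")

    # Point query on the indexed field in each store.
    t0 = time.perf_counter()
    rdb.execute("SELECT * FROM snp WHERE rsid = ?", ("rs42",)).fetchone()
    print(f"relational lookup: {time.perf_counter() - t0:.4f}s")
    t0 = time.perf_counter()
    mdb.snp.find_one({"rsid": "rs42"})
    print(f"document lookup: {time.perf_counter() - t0:.4f}s")
    ```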

  14. Applying the archetype approach to the database of a biobank information management system.

    Späth, Melanie Bettina; Grimson, Jane

    2011-03-01

    The purpose of this study is to investigate the feasibility of applying the openEHR archetype approach to modelling the data in the database of an existing proprietary biobank information management system. A biobank information management system stores the clinical/phenotypic data of the sample donor and sample related information. The clinical/phenotypic data is potentially sourced from the donor's electronic health record (EHR). The study evaluates the reuse of openEHR archetypes that have been developed for the creation of an interoperable EHR in the context of biobanking, and proposes a new set of archetypes specifically for biobanks. The ultimate goal of the research is the development of an interoperable electronic biomedical research record (eBMRR) to support biomedical knowledge discovery. The database of the prostate cancer biobank of the Irish Prostate Cancer Research Consortium (PCRC), which supports the identification of novel biomarkers for prostate cancer, was taken as the basis for the modelling effort. First the database schema of the biobank was analyzed and reorganized into archetype-friendly concepts. Then, archetype repositories were searched for matching archetypes. Some existing archetypes were reused without change, some were modified or specialized, and new archetypes were developed where needed. The fields of the biobank database schema were then mapped to the elements in the archetypes. Finally, the archetypes were arranged into templates specifically to meet the requirements of the PCRC biobank. A set of 47 archetypes was found to cover all the concepts used in the biobank. Of these, 29 (62%) were reused without change, 6 were modified and/or extended, 1 was specialized, and 11 were newly defined. These archetypes were arranged into 8 templates specifically required for this biobank. A number of issues were encountered in this research. Some arose from the immaturity of the archetype approach, such as immature modelling support tools

  15. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of these massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Based on the key-value database, the ND technique makes complete utilization of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of a relational database management system (RDBMS). The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
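
    The complement-set idea behind the negative database can be illustrated with a toy Python sketch over a small, enumerable key space. The real MUSER system is built on a key-value store and handles a far larger key space, so this is a conceptual illustration only.

    ```python
    # Toy "negative database": for an enumerable key space (one expected observation
    # record per second of a day), store only the keys that are ABSENT and derive the
    # present records as the complement.
    KEY_SPACE = set(range(86_400))   # one expected record per second of the day
    missing = {7, 42, 86_399}        # the negative database: absent slots only

    def present_keys():
        """Derive the keys of existing records from the stored complement."""
        return KEY_SPACE - missing

    def record_exists(key):
        return key in KEY_SPACE and key not in missing

    print(len(present_keys()))   # 86397
    print(record_exists(42))     # False: this observation was never written
    ```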

  16. Databases and coordinated research projects at the IAEA on atomic processes in plasmas

    Braams, Bastiaan J.; Chung, Hyun-Kyung [Nuclear Data Section, NAPC Division, International Atomic Energy Agency P. O. Box 100, Vienna International Centre, AT-1400 Vienna (Austria)

    2012-05-25

    The Atomic and Molecular Data Unit at the IAEA works with a network of national data centres to encourage and coordinate production and dissemination of fundamental data for atomic, molecular and plasma-material interaction (A+M/PMI) processes that are relevant to the realization of fusion energy. The Unit maintains numerical and bibliographical databases and has started a Wiki-style knowledge base. The Unit also contributes to A+M database interface standards and provides a search engine that offers a common interface to multiple numerical A+M/PMI databases. Coordinated Research Projects (CRPs) bring together fusion energy researchers and atomic, molecular and surface physicists for joint work towards the development of new data and new methods. The databases and current CRPs on A+M/PMI processes are briefly described here.

  17. Databases and coordinated research projects at the IAEA on atomic processes in plasmas

    Braams, Bastiaan J.; Chung, Hyun-Kyung

    2012-05-01

    The Atomic and Molecular Data Unit at the IAEA works with a network of national data centres to encourage and coordinate production and dissemination of fundamental data for atomic, molecular and plasma-material interaction (A+M/PMI) processes that are relevant to the realization of fusion energy. The Unit maintains numerical and bibliographical databases and has started a Wiki-style knowledge base. The Unit also contributes to A+M database interface standards and provides a search engine that offers a common interface to multiple numerical A+M/PMI databases. Coordinated Research Projects (CRPs) bring together fusion energy researchers and atomic, molecular and surface physicists for joint work towards the development of new data and new methods. The databases and current CRPs on A+M/PMI processes are briefly described here.

  18. Databases and coordinated research projects at the IAEA on atomic processes in plasmas

    Braams, Bastiaan J.; Chung, Hyun-Kyung

    2012-01-01

    The Atomic and Molecular Data Unit at the IAEA works with a network of national data centres to encourage and coordinate production and dissemination of fundamental data for atomic, molecular and plasma-material interaction (A+M/PMI) processes that are relevant to the realization of fusion energy. The Unit maintains numerical and bibliographical databases and has started a Wiki-style knowledge base. The Unit also contributes to A+M database interface standards and provides a search engine that offers a common interface to multiple numerical A+M/PMI databases. Coordinated Research Projects (CRPs) bring together fusion energy researchers and atomic, molecular and surface physicists for joint work towards the development of new data and new methods. The databases and current CRPs on A+M/PMI processes are briefly described here.

  19. The IAEA's Net Enabled Waste Management Database: Overview and current status

    Csullog, G.W.; Bell, M.J.; Pozdniakov, I.; Petison, G.; Kostitsin, V.

    2002-01-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) contains information on national radioactive waste management programmes and organizations, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. The NEWMDB, which was launched on the Internet on 6 July 2001, is the successor to the IAEA's Waste Management Database (WMDB), which was in use during the 1990s. The NEWMDB's first data collection cycle took place from July 2001 to March 2002. This paper provides an overview of the NEWMDB, describes the results of the first data collection cycle, and discusses the way forward for additional data collection cycles. Three companion papers describe (1) the role of the NEWMDB as an international source of information about radioactive waste management, (2) issues related to the variety of waste classification schemes used by IAEA Member States, and (3) the NEWMDB in the context of an indicator of sustainable development for radioactive waste management. (author)

  20. Are Managed Futures Indices Telling Truth? Biases in CTA Databases and Proposals of Potential Enhancements

    Adam Zaremba

    2011-07-01

    Full Text Available Managed futures are an alternative asset class that has recently become considerably popular in the investment industry. However, owing to its characteristics, access to historical performance statistics for managed futures is relatively limited. All available information originates from commercial and academic databases, and reporting to them is entirely voluntary. This situation results in a series of biases that distort managed futures performance in the eyes of investors. The paper consists of two parts. First, the author reviews and describes various biases that influence the reliability of managed futures indices and databases. The second section encompasses the author’s proposals for potential enhancements, which aim to reduce the impact of the biases in order to derive a benchmark that better reflects the characteristics of managed futures investment from the point of view of a potential investor.

  1. The Knowledge Management Research of Agricultural Scientific Research Institution

    2010-01-01

    Based on how experts specializing in different fields, both at home and abroad, perceive knowledge management, the knowledge management of an agricultural scientific research institution can build new platforms, offer new approaches for making explicit and tacit knowledge available, and promote the resilience and innovative ability of the institution. The thesis introduces the functions of knowledge management research in agricultural science. First, it can transform tacit knowledge into explicit knowledge. Second, it can enable all scientific personnel to share knowledge. Third, it is beneficial to the development of a prototype knowledge management system. Fourth, it mainly researches the realization of a knowledge management system. Fifth, it can manage external knowledge via competitive intelligence. Sixth, it can foster knowledge management talent for agricultural scientific research institutions. Seventh, it offers decision-making services to leaders managing scientific programs. The thesis also discusses the content of knowledge management in agricultural scientific research institutions as follows: production and innovation of knowledge; acquisition and organization of knowledge; dissemination and sharing of knowledge; management of human resources; and the construction and management of infrastructure. Corresponding countermeasures are put forward to further reinforce knowledge management research in agricultural scientific research institutions.

  2. Research nuclear reactor operation management

    Preda, M.; Carabulea, A.

    2008-01-01

    Some aspects of reactor operation management are highlighted. The main mission of the operational staff at a testing reactor is to operate it safely and efficiently and to ensure proper conditions for the different research programs that make use of the reactor. For reaching this aim, operating plans were settled for every objective, and procedures and working instructions for staff training were established, both for the start-up and for the safe operation of the reactor. Damage during operation and special situations which can arise at shutdown, start-up or during maintenance procedures were thoroughly considered. While technical skill is considered to be the most important quality of the staff, organising capacity is a must in the operation of any nuclear facility. Staff training aims at gaining both theoretical and practical experience, based on standards for staff qualification at each work level. The 'flow' sheet has to be carefully prepared, setting out clearly the decision responsibility of each person, so that everyone's technical level is matched to the problems that fall under his responsibility. Possible events which may arise in operation, e.g. criticality, irradiation and contamination, and which do not arise in other fields, have to be carefully studied. It is stressed that management based on technical and scientific arguments has to cover, through technical, economical and nuclear safety requirements, a series of interlinked subprograms. Every such subprogram is subject to particular demands through which the entire activity field is coordinated. Hence for any subprogram there are established the objectives to be achieved, the applicable regulations, well-defined responsibilities, the training of the personnel involved, the material and documentation basis required, and activity planning. The follow-up of positive or negative responses generated by experiments and the synthesis of information close the management scope. Important management aspects

  3. Workshop presentation: research guidelines for Construction Management

    Marco Alvise Bragadin

    2013-01-01

    Nowadays the European economic system challenges the construction sector to take part in the industrial recovery of western countries. In co-operation with the Construction Production research group of Tampere University, guidelines for research about construction management tools and methods were identified. Research guidelines: 1) Construction management: tools and methods to manage construction projects 2) environmental impact of construction projects 3) construction management and safety 4) project p...

  4. Review of radioactive waste management research in the Agency

    2002-01-01

    The report presents a concise summary of the Programme of Radioactive Waste Management Research carried out by the Agency in the period 1996 to 2001. It not only provides information, which is relevant to the Agency's responsibilities, but also offers an input to the government's development of a policy for managing solid radioactive waste in the UK. The research projects have included laboratory and field scientific studies, reviews of existing scientific data and understanding, development of assessment methodologies, and development of technical support software and databases. The Agency has participated widely in internationally-supported projects and on jointly-funded projects amongst UK regulators, advisory bodies and industry

  5. Creating a data exchange strategy for radiotherapy research : Towards federated databases and anonymised public datasets

    Skripcak, Tomas; Belka, Claus; Bosch, Walter; Brink, Carsten; Brunner, Thomas; Budach, Volker; Buettner, Daniel; Debus, Juergen; Dekker, Andre; Grau, Cai; Gulliford, Sarah; Hurkmans, Coen; Just, Uwe; Krause, Mechthild; Lambin, Philippe; Langendijk, Johannes A.; Lewensohn, Rolf; Luehr, Armin; Maingon, Philippe; Masucci, Michele; Niyazi, Maximilian; Poortmans, Philip; Simon, Monique; Schmidberger, Heinz; Spezi, Emiliano; Stuschke, Martin; Valentini, Vincenzo; Verheij, Marcel; Whitfield, Gillian; Zackrisson, Bjoern; Zips, Daniel; Baumann, Michael

    2014-01-01

    Disconnected cancer research data management and lack of information exchange about planned and ongoing research are complicating the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate 'translational research

  6. Creating a data exchange strategy for radiotherapy research: Towards federated databases and anonymised public datasets

    Skripcak, T.; Belka, C.; Bosch, W.; Brink, C. Van den; Brunner, T.; Budach, V.; Buttner, D.; Debus, J.; Dekker, A.; Grau, C.; Gulliford, S.; Hurkmans, C.; Just, U.; Krause, M.; Lambin, P.; Langendijk, J.A.; Lewensohn, R.; Luhr, A.; Maingon, P.; Masucci, M.; Niyazi, M.; Poortmans, P.M.P.; Simon, M.; Schmidberger, H.; Spezi, E.; Stuschke, M.; Valentini, V.; Verheij, M.; Whitfield, G.; Zackrisson, B.; Zips, D.; Baumann, M.

    2014-01-01

    Disconnected cancer research data management and lack of information exchange about planned and ongoing research are complicating the utilisation of internationally collected medical information for improving cancer patient care. Rapidly collecting/pooling data can accelerate translational research

  7. Models, Tools, and Databases for Land and Waste Management Research

    These publicly available resources can be used for such tasks as simulating biodegradation or remediation of contaminants such as hydrocarbons, measuring sediment accumulation at superfund sites, or assessing toxicity and risk.

  8. Development of subsurface drainage database system for use in environmental management issues

    Azhar, A.H.; Rafiq, M.; Alam, M.M.

    2007-01-01

    A simple, user-friendly, menu-driven system for database management pertinent to the Impact of Subsurface Drainage Systems on Land and Water Conditions (ISLaW) has been developed for use in environment-management issues of the drainage areas. This database has been developed by integrating four software packages, viz. Microsoft Excel, MS Word, Acrobat and MS Access. The information, in the form of tables and figures, with respect to various drainage projects has been presented in MS Word files. The major data-sets of the various subsurface drainage projects included in the ISLaW database are: i) technical aspects, ii) groundwater and soil-salinity aspects, iii) socio-technical aspects, iv) agro-economic aspects, and v) operation and maintenance aspects. The various ISLaW files can be accessed simply by clicking the menu buttons of the database system. This database not only gives feedback on the functioning of different subsurface drainage projects with respect to the above-mentioned aspects, but also serves as a resource document of these data for future studies on other drainage projects. The developed database system is useful for planners, designers and farmers' organisations for improved operation of existing drainage projects as well as development of future ones. (author)

  9. KALIMER database development (database configuration and design methodology)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor design technology development using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results produced during phase II of the Liquid Metal Reactor Design Technology Development project of the mid-term and long-term nuclear R and D programme. IOC is a linkage control system between sub-projects for sharing and integrating the research results for KALIMER. The 3D CAD Database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members about research cooperation and meetings. Finally, the KALIMER Reserved Documents component was developed to manage collected data and various documents produced since the project's accomplishment. This report describes the features of the hardware and software and the database design methodology for KALIMER

  10. Improving Care And Research Electronic Data Trust Antwerp (iCAREdata): a research database of linked data on out-of-hours primary care.

    Colliers, Annelies; Bartholomeeusen, Stefaan; Remmen, Roy; Coenen, Samuel; Michiels, Barbara; Bastiaens, Hilde; Van Royen, Paul; Verhoeven, Veronique; Holmgren, Philip; De Ruyck, Bernard; Philips, Hilde

    2016-05-04

    Primary out-of-hours care is developing throughout Europe. High-quality databases with linked data from primary health services can help to improve research and future health services. In 2014, a central clinical research database infrastructure was established (iCAREdata: Improving Care And Research Electronic Data Trust Antwerp, www.icaredata.eu ) for primary and interdisciplinary health care at the University of Antwerp, linking data from General Practice Cooperatives, Emergency Departments and Pharmacies during out-of-hours care. Medical data are pseudonymised using the services of a Trusted Third Party, which encodes private information about patients and physicians before data is sent to iCAREdata. iCAREdata provides many new research opportunities in the fields of clinical epidemiology, health care management and quality of care. A key aspect will be to ensure the quality of data registration by all health care providers. This article describes the establishment of a research database and the possibilities of linking data from different primary out-of-hours care providers, with the potential to help to improve research and the quality of health care services.
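
    The pseudonymisation step mentioned above, in which a Trusted Third Party encodes identifiers before data reach the research database, can be sketched with keyed hashing in Python. The key, identifier format and algorithm below are illustrative assumptions, not the actual iCAREdata protocol.

    ```python
    import hashlib
    import hmac

    # Secret key held only by the Trusted Third Party (illustrative placeholder value).
    TTP_KEY = b"replace-with-a-key-kept-by-the-trusted-third-party"

    def pseudonymise(patient_id: str) -> str:
        """Replace a direct patient identifier with a stable pseudonym so that records
        from different care providers can still be linked without exposing the identifier."""
        return hmac.new(TTP_KEY, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

    # The same identifier always maps to the same pseudonym, enabling linkage across
    # general practice cooperative, emergency department and pharmacy records.
    print(pseudonymise("85.01.01-123.45"))
    print(pseudonymise("85.01.01-123.45") == pseudonymise("85.01.01-123.45"))  # True
    ```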

  11. An Institutional Approach to Developing Research Data Management Infrastructure

    James A. J. Wilson

    2011-10-01

    Full Text Available This article outlines the work that the University of Oxford is undertaking to implement a coordinated data management infrastructure. The rationale for the approach being taken by Oxford is presented, with particular attention paid to the role of each service division. This is followed by a consideration of the relative advantages and disadvantages of institutional data repositories, as opposed to national or international data centres. The article then focuses on two ongoing JISC-funded projects, ‘Embedding Institutional Data Curation Services in Research’ (Eidcsr) and ‘Supporting Data Management Infrastructure for the Humanities’ (Sudamih). Both projects are intra-institutional collaborations and involve working with researchers to develop particular aspects of infrastructure, including: University policy, systems for the preservation and documentation of research data, training and support, software tools for the visualisation of large images, and creating and sharing databases via the Web (Database as a Service).

  12. A survey of the use of database management systems in accelerator projects

    Poole, John

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accelerator projects and what they are being used for. Initially introduced to handle equipment builders' data, commercial DBMS are now being used in almost all areas of accelerators from on-line control to personnel data. A variety of commercial systems are being used in conjunction with a diverse selection of application software for data maintenance/manipulation and controls. This paper reviews the database activities known to IADBG.

  13. Power Electronics Thermal Management | Transportation Research | NREL

    Power electronics thermal management research at NREL investigates and develops thermal management strategies for power electronics systems that use wide-bandgap ...

  14. WikiPathways: a multifaceted pathway database bridging metabolomics to other omics research.

    Slenter, Denise N; Kutmon, Martina; Hanspers, Kristina; Riutta, Anders; Windsor, Jacob; Nunes, Nuno; Mélius, Jonathan; Cirillo, Elisa; Coort, Susan L; Digles, Daniela; Ehrhart, Friederike; Giesbertz, Pieter; Kalafati, Marianthi; Martens, Marvin; Miller, Ryan; Nishida, Kozo; Rieswijk, Linda; Waagmeester, Andra; Eijssen, Lars M T; Evelo, Chris T; Pico, Alexander R; Willighagen, Egon L

    2018-01-04

    WikiPathways (wikipathways.org) captures the collective knowledge represented in biological pathways. By providing a database in a curated, machine readable way, omics data analysis and visualization is enabled. WikiPathways and other pathway databases are used to analyze experimental data by research groups in many fields. Due to the open and collaborative nature of the WikiPathways platform, our content keeps growing and is getting more accurate, making WikiPathways a reliable and rich pathway database. Previously, however, the focus was primarily on genes and proteins, leaving many metabolites with only limited annotation. Recent curation efforts focused on improving the annotation of metabolism and metabolic pathways by associating unmapped metabolites with database identifiers and providing more detailed interaction knowledge. Here, we report the outcomes of the continued growth and curation efforts, such as a doubling of the number of annotated metabolite nodes in WikiPathways. Furthermore, we introduce an OpenAPI documentation of our web services and the FAIR (Findable, Accessible, Interoperable and Reusable) annotation of resources to increase the interoperability of the knowledge encoded in these pathways and experimental omics data. New search options, monthly downloads, more links to metabolite databases, and new portals make pathway knowledge more effortlessly accessible to individual researchers and research communities. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
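
    For readers who want to try the web services mentioned above, the Python sketch below sends a text search to the public WikiPathways REST webservice using the requests library. The endpoint path and parameters are recalled from the public service and should be checked against the current OpenAPI documentation, so the code deliberately avoids assuming exact response field names.

    ```python
    import requests

    # Assumed endpoint of the public WikiPathways REST webservice; verify against the
    # service's current OpenAPI documentation before relying on it.
    URL = "https://webservice.wikipathways.org/findPathwaysByText"

    resp = requests.get(URL, params={"query": "cholesterol metabolism", "format": "json"}, timeout=30)
    resp.raise_for_status()
    data = resp.json()

    # Print a compact view of the payload without assuming its exact structure.
    if isinstance(data, dict):
        print("top-level keys:", sorted(data.keys()))
    else:
        print("first items:", data[:3])
    ```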

  15. Time management strategies for research productivity.

    Chase, Jo-Ana D; Topp, Robert; Smith, Carol E; Cohen, Marlene Z; Fahrenwald, Nancy; Zerwic, Julie J; Benefield, Lazelle E; Anderson, Cindy M; Conn, Vicki S

    2013-02-01

    Researchers function in a complex environment and carry multiple role responsibilities. This environment is prone to various distractions that can derail productivity and decrease efficiency. Effective time management allows researchers to maintain focus on their work, contributing to research productivity. Thus, improving time management skills is essential to developing and sustaining a successful program of research. This article presents time management strategies addressing behaviors surrounding time assessment, planning, and monitoring. Herein, the Western Journal of Nursing Research editorial board recommends strategies to enhance time management, including setting realistic goals, prioritizing, and optimizing planning. Involving a team, problem-solving barriers, and early management of potential distractions can facilitate maintaining focus on a research program. Continually evaluating the effectiveness of time management strategies allows researchers to identify areas of improvement and recognize progress.

  16. Nordic research in logistics and supply chain management

    Arlbjørn, Jan Stentoft; Jonsson, Patrik; Johansen, John

    2008-01-01

    Purpose - The purpose of this data-based analysis is to report and reflect on the characteristics of the academic discipline concerned with logistics and supply chain management (SCM) as it is conducted in the Nordic countries (Denmark, Finland, Iceland, Norway and Sweden). The paper further seeks...... returned, the response rate was 41 per cent. Findings - The study did not provide a clear picture of a distinct Nordic research paradigm applying to the study of logistics and SCM. The analysis shows as characteristic of research issues pursued by Nordic researchers the focus on supply chains and networks...... with research in the field and external funding. Research limitations/implications - The research reported here may help individual researchers raise their consciousness about their own research. Originality/value - This is the first empirical study to analyze research paradigms within logistics and SCM...

  17. Drug utilization research and risk management

    Mazzaglia, Giampiero; Mol, Peter G. M.; Elseviers, Monique; Wettermark, Björn; Almarsdóttir, Anna Birna; Andersen, Morten; Benko, Ria; Bennie, Marion; Eriksson, Irene; Godman, Brian; Krska, Janet; Poluzzi, Elisabetta; Taxis, Katja; Vlahovic-Palcevski, Vera; Stichele, Robert Vander

    2016-01-01

    Good risk management requires continuous evaluation and improvement of planned activities. Evaluating the impact of risk management activities requires robust study designs and carefully selected outcome measures. Key learnings and caveats from drug utilization research should be applied to the

  18. Microsoft Enterprise Consortium: A Resource for Teaching Data Warehouse, Business Intelligence and Database Management Systems

    Kreie, Jennifer; Hashemi, Shohreh

    2012-01-01

    Data is a vital resource for businesses; therefore, it is important for businesses to manage and use their data effectively. Because of this, businesses value college graduates with an understanding of and hands-on experience working with databases, data warehouses and data analysis theories and tools. Faculty in many business disciplines try to…

  19. Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.

    Rice, James

    1988-01-01

    Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…

  20. Supporting Telecom Business Processes by means of Workflow Management and Federated Databases

    Nijenhuis, Wim; Jonker, Willem; Grefen, P.W.P.J.

    This report addresses the issues related to the use of workflow management systems and federated databases to support business processes that operate on large and heterogeneous collections of autonomous information systems. We discuss how they can enhance the overall IT-architecture. Starting from

  1. Nailfold capillaroscopy in systemic sclerosis: data from the EULAR scleroderma trials and research (EUSTAR) database.

    Ingegnoli, Francesca; Ardoino, Ilaria; Boracchi, Patrizia; Cutolo, Maurizio

    2013-09-01

    The aims of this study were to obtain cross-sectional data on capillaroscopy in an international multi-center cohort of Systemic Sclerosis (SSc) and to investigate the frequency of the capillaroscopic patterns and their disease-phenotype associations. Data collected between June 2004 and October 2011 in the EULAR Scleroderma Trials and Research (EUSTAR) registry were examined. Patients' profiles based on clinical and laboratory data were obtained by cluster analysis and the association between profiles and capillaroscopy was investigated by multinomial logistic regression. 62 of the 110 EUSTAR centers entered data on capillaroscopy in the EUSTAR database. 376 of the 2754 patients (13.65%) were classified as scleroderma pattern absent, but non-specific capillary abnormalities were noted in 55.48% of the cases. Four major patients' profiles were identified characterized by a progressive severity for skin involvement, as well as an increased number of systemic manifestations. The "early" and "active" scleroderma patterns were generally observed in patients with mild/moderate skin involvement and a low number of disease manifestations, while the "late" scleroderma pattern was found more frequently in the more severe forms of the disease. These data indicate the importance of capillaroscopy in SSc management and that capillaroscopic patterns are directly related to the extent of organ involvement. Copyright © 2013 Elsevier Inc. All rights reserved.
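
    The two analysis steps named in this record, cluster analysis to derive patient profiles followed by multinomial logistic regression relating profiles to capillaroscopic pattern, can be sketched with scikit-learn on synthetic data as below. The feature set, the choice of four clusters and the coding of the outcome are illustrative, and no EUSTAR data are used.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic clinical/laboratory features for 300 patients (no real registry data).
    X = rng.normal(size=(300, 6))
    # Synthetic capillaroscopic pattern: 0 = absent, 1 = early, 2 = active, 3 = late.
    pattern = rng.integers(0, 4, size=300)

    # Step 1: derive patient profiles by cluster analysis (four profiles, as in the abstract).
    profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

    # Step 2: multinomial logistic regression of capillaroscopic pattern on profile membership.
    design = np.eye(4)[profiles][:, 1:]   # dummy-code the profiles, first profile as reference
    model = LogisticRegression(multi_class="multinomial", max_iter=1000).fit(design, pattern)
    print(model.coef_.shape)              # (4 pattern classes, 3 profile dummies)
    ```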

  2. Integration of operational research and environmental management

    Bloemhof - Ruwaard, J.M.

    1996-01-01


    The subject of this thesis is the integration of Operational Research and Environmental Management. Both sciences play an important role in the research of environmental issues. Part I describes a framework for the interactions between Operational Research and Environmental Management.

  3. Has management accounting research been critical?

    Hopper, Trevor; Bui, Binh

    2016-01-01

    This paper examines the contributions Management Accounting Research (MAR) has (and has not) made to social and critical analyses of management accounting in the twenty-five years since its launch. It commences with a personalised account of the first named author’s experiences of behavioural, social and critical accounting in the twenty-five years before MAR appeared. This covers events in the UK, especially the Management Control Workshop, Management Accounting Research conferences at Aston...

  4. Geoscientific (GEO) database of the Andra Meuse / Haute-Marne research center

    Tabani, P.; Hemet, P.; Hermand, G.; Delay, J.; Auriere, C.

    2010-01-01

    Document available in extended abstract form only. The GEO database (geo-scientific database of the Meuse/Haute-Marne Center) is a tool developed by Andra with a view to grouping, in a secure electronic form, all data related to the acquisition of in situ and laboratory measurements made on solid and fluid samples. This database has three main functions: - Acquisition and management of data and computer files related to geological, geomechanical, hydrogeological and geochemical measurements on solid and fluid samples and to in situ measurements (logging, on-sample measurements, geological logs, etc.). - Consultation by the staff on Andra's intranet network for selective viewing of data linked to a borehole and/or a sample and for making computations and graphs on sets of laboratory measurements related to a sample. - Physical management of fluid and solid samples stored in a 'core library' in order to localize a sample, follow up its movement out of the 'core library' to an organization, and carry out regular inventories. The GEO database is a relational Oracle database. It is installed on a data server which stores information and manages the users' transactions. The users can consult, download and exploit data from any computer connected to the Andra network or the Internet. Management of access rights is made through a login/password. Four geo-scientific applications are linked to the GEO database; they are: - The Geosciences portal: The Geosciences portal is a web intranet application accessible from the ANDRA network. It does not require a particular installation on the client and is accessible through the Internet navigator. A SQL Server Express database manages the users and access rights to the application. This application is used for the acquisition of hydrogeological and geochemical data collected in the field and on fluid samples, as well as data related to scientific work carried out at surface level or in drifts

  5. Appropriateness of the food-pics image database for experimental eating and appetite research with adolescents.

    Jensen, Chad D; Duraccio, Kara M; Barnett, Kimberly A; Stevens, Kimberly S

    2016-12-01

    Research examining effects of visual food cues on appetite-related brain processes and eating behavior has proliferated. Recently investigators have developed food image databases for use across experimental studies examining appetite and eating behavior. The food-pics image database represents a standardized, freely available image library originally validated in a large sample primarily comprised of adults. The suitability of the images for use with adolescents has not been investigated. The aim of the present study was to evaluate the appropriateness of the food-pics image library for appetite and eating research with adolescents. Three hundred and seven adolescents (ages 12-17) provided ratings of recognizability, palatability, and desire to eat, for images from the food-pics database. Moreover, participants rated the caloric content (high vs. low) and healthiness (healthy vs. unhealthy) of each image. Adolescents rated approximately 75% of the food images as recognizable. Approximately 65% of recognizable images were correctly categorized as high vs. low calorie and 63% were correctly classified as healthy vs. unhealthy in 80% or more of image ratings. These results suggest that a smaller subset of the food-pics image database is appropriate for use with adolescents. With some modifications to included images, the food-pics image database appears to be appropriate for use in experimental appetite and eating-related research conducted with adolescents. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Managing University Research Microdata Collections

    Woolfrey, Lynn; Fry, Jane

    2015-01-01

    This article examines the management of microdata collections in a university context. It is a cross-country analysis: Collection management at data services in Canada and South Africa are considered. The case studies are of two university sub-contexts: One collection is located in a library; the other at a Faculty-based Data Service. Stages in…

  7. Routine health insurance data for scientific research: potential and limitations of the Agis Health Database.

    Smeets, Hugo M; de Wit, Niek J; Hoes, Arno W

    2011-04-01

    Observational studies performed within routine health care databases have the advantage of their large size and, when the aim is to assess the effect of interventions, can complement randomized controlled trials, which usually have small samples drawn from experimental situations. Institutional Health Insurance Databases (HIDs) are attractive for research because of their large size, their longitudinal perspective, and their practice-based information. As they are based on financial reimbursement, the information is generally reliable. The database of one of the major insurance companies in the Netherlands, the Agis Health Database (AHD), is described in detail. Whether the AHD data sets meet the specific requirements to conduct several types of clinical studies is discussed according to the classification of the four different types of clinical research; that is, diagnostic, etiologic, prognostic, and intervention research. The potential of the AHD for these various types of research is illustrated using examples of studies recently conducted in the AHD. HIDs such as the AHD offer large potential for several types of clinical research, in particular etiologic and intervention studies, but at present the lack of detailed clinical information is an important limitation. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    Owens, John

    2009-01-01

    Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, and interactive HTML pages, or are exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.

  9. Database Foundation For The Configuration Management Of The CERN Accelerator Controls Systems

    Zaharieva, Z; Peryt, M

    2011-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years in order to become nowadays the basis for the Configuration Management of the Controls System for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the Controls System. The configuration items are quite heterogeneous, depicting different areas of the Controls System – ranging from 3000 Front-End Computers, 75 000 software devices allowing remote control of the accelerators, to valid states of the Accelerators Timing System. The article will describe the different areas of the CCDB, their interdependencies and the challenges to establish the data model for such a diverse configuration management database, serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering of change management processes as well as providing status accounting and aud...

  10. GSIMF: a web service based software and database management system for the next generation grids

    Wang, N; Ananthan, B; Gieraltowski, G; May, E; Vaniachine, A

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids

  11. Establishment of database and network for research of stream generator and state of the art technology review

    Choi, Jae Bong; Hur, Nam Su; Moon, Seong In; Seo, Hyeong Won; Park, Bo Kyu; Park, Sung Ho; Kim, Hyung Geun [Sungkyunkwan Univ., Seoul (Korea, Republic of)

    2004-02-15

    Worldwide, a significant number of steam generator tubes are defective and are removed from service or repaired. This widespread damage has been caused by diverse degradation mechanisms, some of which are difficult to detect and predict. For domestic nuclear power plants as well, the growing number of operating plants and their increasing operating periods may lead to more steam generator tube failures, so it is important to carry out an integrity evaluation process to prevent steam generator tube damage. This research has two objectives. The first is to build a database for steam generator research at domestic research institutions; it will increase the efficiency and capability of limited domestic research resources by sharing data and information through a network organization, and it will enhance the current integrity evaluation procedure, which is considerably conservative and can be made more reasonable. The second objective is to establish a standard integrity evaluation procedure for steam generator tubes by reviewing state-of-the-art technology. The research resources related to steam generator tubes are managed by the established web-based database system. The following topics are covered in this project: development of a web-based network for research on steam generator tubes and a review of state-of-the-art technology.

  12. Establishment of database and network for research of stream generator and state of the art technology review

    Choi, Jae Bong; Hur, Nam Su; Moon, Seong In; Seo, Hyeong Won; Park, Bo Kyu; Park, Sung Ho; Kim, Hyung Geun

    2004-02-01

    Worldwide, a significant number of steam generator tubes are defective and are removed from service or repaired. This widespread damage has been caused by diverse degradation mechanisms, some of which are difficult to detect and predict. For domestic nuclear power plants as well, the growing number of operating plants and their increasing operating periods may lead to more steam generator tube failures, so it is important to carry out an integrity evaluation process to prevent steam generator tube damage. This research has two objectives. The first is to build a database for steam generator research at domestic research institutions; it will increase the efficiency and capability of limited domestic research resources by sharing data and information through a network organization, and it will enhance the current integrity evaluation procedure, which is considerably conservative and can be made more reasonable. The second objective is to establish a standard integrity evaluation procedure for steam generator tubes by reviewing state-of-the-art technology. The research resources related to steam generator tubes are managed by the established web-based database system. The following topics are covered in this project: development of a web-based network for research on steam generator tubes and a review of state-of-the-art technology.

  13. Materials and Waste Management Research

    EPA is developing data and tools to reduce waste, manage risks, reuse and conserve natural materials, and optimize energy recovery. Collaboration with states facilitates assessment and utilization of technologies developed by the private sector.

  14. Reactor pressure vessel embrittlement management through EPRI-Developed material property databases

    Rosinski, S.T.; Server, W.L.; Griesbach, T.J.

    1997-01-01

    Uncertainties and variability in U.S. reactor pressure vessel (RPV) material properties have caused the U.S. Nuclear Regulatory Commission (NRC) to request information from all nuclear utilities in order to assess the impact of this data scatter and these uncertainties on compliance with existing regulatory criteria. Resolving the vessel material uncertainty issues requires compiling all available data into a single integrated database to develop a better understanding of irradiated material property behavior. EPRI has developed two comprehensive databases for utility implementation to compile and evaluate available material property and surveillance data. RPVDATA is a comprehensive reactor vessel materials database and data management program that combines data from many different sources into one common database. Searches of the data can be easily performed to identify plants with similar materials, sort through measured test results, compare the "best estimates" for reported chemistries with licensing basis values, quantify variability in measured weld qualification and test data, identify relevant surveillance results for characterizing embrittlement trends, and resolve uncertainties in vessel material properties. PREP4 has been developed to assist utilities in evaluating existing unirradiated and irradiated data for plant surveillance materials; PREP4 evaluations can be used to assess the accuracy of new trend curve predictions. In addition, searches of the data can be easily performed to identify available Charpy shift and upper shelf data, review surveillance material chemistry and fabrication information, review general capsule irradiation information, and identify applicable source reference information. In support of utility evaluations to consider thermal annealing as a viable embrittlement management option, EPRI is also developing a database to evaluate material response to thermal annealing. Efforts are underway to develop an irradiation

  15. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (Southern Germany) is highly prone to landslides. This was apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy and long-lasting rainfalls. The specific climatic situation caused extensive damage with serious impacts on settlements and infrastructure. Knowledge of the spatial distribution, processes and characteristics of landslides is important to evaluate the potential risk posed by mass movements in those areas. Within the framework of two projects, about 400 landslides at the Franconian Alb were mapped and detailed data sets were compiled between 2011 and 2014. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (The Bavarian Environment Agency) within the project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements and should provide the potential for secure data storage and data management, as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL, which provides the possibility to store spatial and geographic objects and to connect to several GIS applications, like GRASS GIS, SAGA GIS, QGIS and GDAL, a geospatial library (Obe et al. 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured by using GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in the R, Python or Perl programming languages. It is possible to work directly with the entire (spatial) data content of the database in R. The inventory of the database includes (amongst others
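
    The record above describes a PostgreSQL/PostGIS inventory that is queried from GIS clients and from R. As a minimal sketch of the kind of spatial query such a setup supports, the snippet below uses psycopg2 against a hypothetical landslides table with a PostGIS geometry column; the connection settings and the table and column names are assumptions for illustration, not taken from the record.

```python
# Minimal sketch: querying a hypothetical PostGIS landslide inventory.
# Table/column names (landslides, geom, slide_type, mapped_on) are assumptions.
import psycopg2

conn = psycopg2.connect(dbname="inventory", user="gis", password="secret", host="localhost")
with conn, conn.cursor() as cur:
    # Count mapped landslides per type inside an area of interest (WGS84 bounding box).
    cur.execute(
        """
        SELECT slide_type, COUNT(*)
        FROM landslides
        WHERE mapped_on BETWEEN %s AND %s
          AND ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
        GROUP BY slide_type
        ORDER BY COUNT(*) DESC;
        """,
        ("2011-01-01", "2014-12-31", 10.8, 49.6, 11.6, 50.1),
    )
    for slide_type, n in cur.fetchall():
        print(slide_type, n)
conn.close()
```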

  16. Interacting with the National Database for Autism Research (NDAR) via the LONI Pipeline workflow environment.

    Torgerson, Carinna M; Quinn, Catherine; Dinov, Ivo; Liu, Zhizhong; Petrosyan, Petros; Pelphrey, Kevin; Haselgrove, Christian; Kennedy, David N; Toga, Arthur W; Van Horn, John Darrell

    2015-03-01

    Under the umbrella of the National Database for Clinical Trials (NDCT) related to mental illnesses, the National Database for Autism Research (NDAR) seeks to gather, curate, and make openly available neuroimaging data from NIH-funded studies of autism spectrum disorder (ASD). NDAR has recently made its database accessible through the LONI Pipeline workflow design and execution environment to enable large-scale analyses of cortical architecture and function via local, cluster, or "cloud"-based computing resources. This presents a unique opportunity to overcome many of the customary limitations to fostering biomedical neuroimaging as a science of discovery. Providing open access to primary neuroimaging data, workflow methods, and high-performance computing will increase uniformity in data collection protocols, encourage greater reliability of published data, results replication, and broaden the range of researchers now able to perform larger studies than ever before. To illustrate the use of NDAR and LONI Pipeline for performing several commonly performed neuroimaging processing steps and analyses, this paper presents example workflows useful for ASD neuroimaging researchers seeking to begin using this valuable combination of online data and computational resources. We discuss the utility of such database and workflow processing interactivity as a motivation for the sharing of additional primary data in ASD research and elsewhere.

  17. Accessing the public MIMIC-II intensive care relational database for clinical research.

    Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G

    2013-01-10

    The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
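
    The record notes that MIMIC-II can be queried through the web-based QueryBuilder or through a locally installed copy in a virtual machine. As a hedged sketch of the second route, the snippet below issues a simple SQL query from Python against a local PostgreSQL copy; the connection parameters and the table and column names are illustrative assumptions, not the official MIMIC-II schema.

```python
# Hedged sketch: querying a local PostgreSQL copy of MIMIC-II from Python.
# Connection parameters and table/column names below are illustrative assumptions.
import psycopg2

conn = psycopg2.connect(dbname="mimic2", user="mimic", password="secret", host="localhost")
with conn, conn.cursor() as cur:
    # Example: number of ICU stays per care unit (hypothetical table/columns).
    cur.execute(
        """
        SELECT first_careunit, COUNT(*) AS n_stays
        FROM icustay_detail
        GROUP BY first_careunit
        ORDER BY n_stays DESC;
        """
    )
    for careunit, n_stays in cur.fetchall():
        print(careunit, n_stays)
conn.close()
```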

  18. Data-Based Decision Making at the Policy, Research, and Practice Levels

    Schildkamp, Kim; Ebbeler, J.

    2015-01-01

    Data-based decision making (DBDM) can lead to school improvement. However, schools struggle with the implementation of DBDM. In this symposium, we will discuss research and the implementation of DBDM at the national and regional policy level and the classroom level. We will discuss policy issues

  19. On the advancement of highly cited research in China: An analysis of the Highly Cited database.

    Li, John Tianci

    2018-01-01

    This study investigates the progress of highly cited research in China from 2001 to 2016 through the analysis of the Highly Cited database. The Highly Cited database, compiled by Clarivate Analytics, comprises the world's most influential researchers in the 22 Essential Science Indicator fields as catalogued by the Web of Science. The database is considered an international standard for the measurement of national and institutional highly cited research output. Overall, we found a consistent and substantial increase in Highly Cited Researchers from China during the timespan. The Chinese institutions with the most Highly Cited Researchers (the Chinese Academy of Sciences, Tsinghua University, Peking University, Zhejiang University, the University of Science and Technology of China, and BGI Shenzhen) are all top-ten universities or primary government research institutions. Further evaluation of separate fields of research and government funding data from the National Natural Science Foundation of China revealed disproportionate growth efficiencies among the separate divisions of the National Natural Science Foundation. The most development occurred in the fields of Chemistry, Materials Sciences, and Engineering, whereas the least development occurred in Economics and Business, Health Sciences, and Life Sciences.

  20. Research on the establishment of the database system for R and D on the innovative technology for the earth; Chikyu kankyo sangyo gijutsu kenkyu kaihatsuyo database system ni kansuru chosa

    NONE

    1994-03-01

    With the aim of building a database system of technical information on global environmental issues, the 'database system for R and D on earth environmental industrial technology' was evaluated in operation, and a study was made on opening it to users and building a database prototype. As the operational evaluation pointed out, usage remains low because of a lack of UNIX experience, the absence of system managers and a shortage of usable listed articles, so the database is not being updated as it should be. A study was therefore made on introducing tools that the originators can use and on opening an information access terminal to researchers at headquarters via the Internet. To let researchers concerned with the global environment obtain information easily, a prototype database was built to support research exchange. The tasks to be undertaken for selecting research fields and compiling common thesauri in Japanese, Western and other languages were clarified. 28 figs., 16 tabs.

  1. Strengthening links between waterfowl research and management

    Roberts, Anthony J.; Eadie, John M.; Howerter, David; Johnson, Fred A.; Nichols, James; Runge, Michael C.; Vrtiska, Mark; Williams, Byron K.

    2018-01-01

    Waterfowl monitoring, research, regulation, and adaptive planning are leading the way in supporting science-informed wildlife management. However, increasing societal demands on natural resources have created a greater need for adaptable and successful linkages between waterfowl science and management. We presented a special session at the 2016 North American Duck Symposium, Annapolis, Maryland, USA on the successes and challenges of linking research and management in waterfowl conservation, and we summarize those thoughts in this commentary. North American waterfowl management includes a diversity of actions including management of harvest and habitat. Decisions for waterfowl management are structured using decision analysis by incorporating stakeholder values into formal objectives, identifying research relevant to objectives, integrating scientific knowledge, and choosing an optimal strategy with respect to objectives. Recently, the consideration of the value of information has been proposed as a means to evaluate the utility of research designed to meet objectives. Despite these advances, the ability to conduct waterfowl research with direct management application may be increasingly difficult in research institutions for several reasons including reduced funding for applied research and the lower perceived value of applied versus theoretical research by some university academics. In addition, coordination between researchers and managers may be logistically constrained, and communication may be ineffective between the 2 groups. Strengthening these links would help develop stronger and more coordinated approaches for the conservation of waterfowl and the wetlands upon which they depend.

  2. Current trends and new challenges of databases and web applications for systems driven biological research

    Pradeep Kumar eSreenivasaiah

    2010-12-01

    The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of computational infrastructure, including databases and web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases (DBs) and web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.

  3. Future Research Themes in Supply Chain Management

    Wieland, Andreas

    2016-01-01

    Guest post on research results published in the article "Mapping the Landscape of Future Research Themes in Supply Chain Management" by Andreas Wieland, Robert Handfield and Christian Durach (Journal of Business Logistics (2016), Vol. 37, no. 3, pp. 205-212).

  4. Management of radiological related equipments. Creating the equipment management database and analysis of the repair and maintenance records

    Eguchi, Megumu; Taguchi, Keiichi; Oota, Takashi; Kajiwara, Hiroki; Ono, Kiyotune; Hagio, Kiyofumi; Uesugi, Ekizo; Kajishima, Tetuo; Ueda, Kenji

    2002-01-01

    In 1997, we established an equipment maintenance and management committee in our department. We designed a database in Microsoft Access in order to classify and register all radiology-related equipment. Managing the condition and cost of each piece of equipment has become easier by keeping the database as the equipment management ledger and by filing the history of repairs and maintenance for each modality. We then tallied the number of repairs, repair costs and downtimes from four years of repair and maintenance records, and re-examined the causal analysis of failures and the content of regular maintenance for the CT and MRI equipment that had shown the higher numbers of repairs. Consequently, we identified improvements to the data registration method and a more economical use of the repair budget. (author)
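
    The record describes an Access database that registers equipment and files its repair and maintenance history so that repair counts, costs and downtimes can be tallied per modality. The sketch below reproduces that general idea with a generic relational schema in SQLite; the table and column names are assumptions, not the authors' actual design.

```python
# Hedged sketch of an equipment/repair-history schema, loosely following the
# idea in the record above. All table and column names are assumptions.
import sqlite3

conn = sqlite3.connect("equipment.db")
cur = conn.cursor()
cur.executescript(
    """
    CREATE TABLE IF NOT EXISTS equipment (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,            -- e.g. 'CT scanner 1'
        modality TEXT NOT NULL         -- e.g. 'CT', 'MRI'
    );
    CREATE TABLE IF NOT EXISTS repair (
        id INTEGER PRIMARY KEY,
        equipment_id INTEGER NOT NULL REFERENCES equipment(id),
        repaired_on TEXT NOT NULL,     -- ISO date
        cost REAL NOT NULL,
        downtime_hours REAL NOT NULL,
        description TEXT
    );
    """
)
# Tally repair count, cost and downtime per modality, as the record describes.
cur.execute(
    """
    SELECT e.modality, COUNT(r.id), SUM(r.cost), SUM(r.downtime_hours)
    FROM equipment e LEFT JOIN repair r ON r.equipment_id = e.id
    GROUP BY e.modality;
    """
)
print(cur.fetchall())
conn.close()
```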

  5. Knowledge management: An abstraction of knowledge base and database management systems

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert systems tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit. Much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  6. Product- and Process Units in the CRITT Translation Process Research Database

    Carl, Michael

    The first version of the "Translation Process Research Database" (TPR DB v1.0) was released in August 2012, containing logging data of more than 400 translation and text production sessions. The current version of the TPR DB (v1.4) contains data from more than 940 sessions, which represents more than 300 hours of text production. The database provides the raw logging data, as well as tables of pre-processed product and processing units. The TPR-DB includes various types of simple and composed product and process units that are intended to support the analysis and modelling of human text…

  7. PeDaB - the personal dosimetry database at the research centre Juelich

    Geisse, C.; Hill, P.; Paschke, M.; Hille, R.; Schlaeger, M.

    1998-01-01

    In May 1997, the mainframe-based registration, processing and archiving of personal monitoring data at the research centre Juelich (FZJ) was transferred to a client-server system. A complex database application was developed. The client user interface is a Windows-based Microsoft ACCESS application which is connected to an ORACLE database via ODBC and TCP/IP. The conversion covered all areas of personal dosimetry, including internal and external exposure as well as administrative areas. A higher degree of flexibility, data security and integrity was achieved. (orig.)

  8. MouldingSandDB – a modern database storing moulding sands properties research results

    J. Jakubski

    2011-01-01

    The complexity of foundry processes requires the use of modern, advanced IT tools for the optimization, storage and analysis of technical data. Properties of moulding and core sands that are collected in research laboratories, at manufacturers, and finally in the foundries are not used later on. It therefore seems important to create a database that allows the stored results to be used, together with the possibility of searching according to set criteria adjusted to casting practice. This paper presents part of the database named "MouldingSandDB", which allows data for synthetic moulding sands to be collected and searched.

  9. Research Outputs of England's Hospital Episode Statistics (HES) Database: Bibliometric Analysis.

    Chaudhry, Zain; Mannan, Fahmida; Gibson-White, Angela; Syed, Usama; Ahmed, Shirin; Majeed, Azeem

    2017-12-06

    Hospital administrative data, such as those provided by the Hospital Episode Statistics (HES) database in England, are increasingly being used for research and quality improvement. To date, no study has tried to quantify and examine trends in the use of HES for research purposes. To examine trends in the use of HES data for research. Publications generated from the use of HES data were extracted from PubMed and analysed. Publications from 1996 to 2014 were then examined further in the Science Citation Index (SCI) of the Thompson Scientific Institute for Science Information (Web of Science) for details of research specialty area. 520 studies, categorised into 44 specialty areas, were extracted from PubMed. The review showed an increase in publications over the 18-year period with an average of 27 publications per year, however with the majority of output observed in the latter part of the study period. The highest number of publications was in the Health Statistics specialty area. The use of HES data for research is becoming more common. Increase in publications over time shows that researchers are beginning to take advantage of the potential of HES data. Although HES is a valuable database, concerns exist over the accuracy and completeness of the data entered. Clinicians need to be more engaged with HES for the full potential of this database to be harnessed.

  10. Preparation of Database for Land use Management in North East of Cairo

    El-Ghawaby, A.M.

    2012-01-01

    Environmental management in urban areas is difficult because of the amount and diversity of data needed for decision making. This amount of data is hard to handle without adequate database systems and modern methodologies. A geo-database for the East Cairo City Area (ECCA) was built to be used in assessing urban land-use suitability, to achieve better performance than the methods usually used. Building this geo-database required the availability of detailed, accurate, up-to-date and geographically referenced data on the terrain's physical characteristics and the environmental hazards that may occur. A smart environmental suitability model for ECCA was developed and implemented using ERDAS IMAGINE 9.2. This model is capable of suggesting the most appropriate urban land use, based on the existing spatial and non-spatial potentials and constraints.

  11. Development of intelligent database program for PSI/ISI data management of nuclear power plant

    Um, Byong Guk; Park, Un Su; Park, Ik Keun; Park, Yun Won; Kang, Suk Chul

    1998-01-01

    An intelligent database program, fully compatible with Windows 95, has been developed for the construction of a total support system and the effective management of Pre-/In-Service Inspection data. Using the database program, analysis and multi-dimensional evaluation of the defects detected during PSI/ISI in the pipes and pressure vessels of nuclear power plants can be carried out. It can also be used to review NDE data from repeated inspections and the corresponding treatment records, and to provide fundamental data for the application of evaluation data related to Fracture Mechanics Analysis (FMA). Furthermore, the loads and material properties in the PSI/ISI database can be utilized to secure a higher degree of safety, integrity and reliability and to support life prediction of components and systems in nuclear power plants.

  12. ALARA database value in future outage work planning and dose management

    Miller, D.W.; Green, W.H.

    1995-01-01

    ALARA database encompassing job-specific duration and man-rem plant specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends emerging based on analysis of three refueling outages at Clinton Power Station. Conclusions reached based on hard data available from a relational database dose-tracking system is a valuable tool for planning of future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more outage comparative data becomes available. Trends over a three outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of outage person-rem projection. The value of the database in projecting 1 and 5 year station person-rem estimates is discussed

  13. ALARA database value in future outage work planning and dose management

    Miller, D.W.; Green, W.H. [Clinton Power Station Illinois Power Co., IL (United States)

    1995-03-01

    ALARA database encompassing job-specific duration and man-rem plant specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends emerging based on analysis of three refueling outages at Clinton Power Station. Conclusions reached based on hard data available from a relational database dose-tracking system is a valuable tool for planning of future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more outage comparative data becomes available. Trends over a three outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of outage person-rem projection. The value of the database in projecting 1 and 5 year station person-rem estimates is discussed.

  14. Water resources management in Tanzania: identifying research ...

    This paper aims to identify research gaps and needs and to recommend a research agenda on water resources management in Tanzania. We reviewed the published literature on water resources management in Tanzania in order to highlight what is currently known, to identify knowledge gaps, and to suggest ...

  15. Management of operational events in research reactor

    Zhong Heping; Yang Shuchun; Peng Xueming

    2001-01-01

    The author describes the process of tracking and managing operational events in a research reactor on the basis of the nuclear safety code, against the background of the research reactor at the Nuclear Power Institute of China. It presents concrete measures for event tracking and sums up the relevant management factors.

  16. The MANAGE database: nutrient load and site characteristic updates and runoff concentration data.

    Harmel, Daren; Qian, Song; Reckhow, Ken; Casebolt, Pamela

    2008-01-01

    The "Measured Annual Nutrient loads from AGricultural Environments" (MANAGE) database was developed to be a readily accessible, easily queried database of site characteristic and field-scale nutrient export data. The original version of MANAGE, which drew heavily from an early 1980s compilation of nutrient export data, created an electronic database with nutrient load data and corresponding site characteristics from 40 studies on agricultural (cultivated and pasture/range) land uses. In the current update, N and P load data from 15 additional studies of agricultural runoff were included along with N and P concentration data for all 55 studies. The database now contains 1677 watershed years of data for various agricultural land uses (703 for pasture/rangeland; 333 for corn; 291 for various crop rotations; 177 for wheat/oats; and 4-33 yr for barley, citrus, vegetables, sorghum, soybeans, cotton, fallow, and peanuts). Across all land uses, annual runoff loads averaged 14.2 kg ha(-1) for total N and 2.2 kg ha(-1) for total P. On average, these losses represented 10 to 25% of applied fertilizer N and 4 to 9% of applied fertilizer P. Although such statistics produce interesting generalities across a wide range of land use, management, and climatic conditions, regional crop-specific analyses should be conducted to guide regulatory and programmatic decisions. With this update, MANAGE contains data from a vast majority of published peer-reviewed N and P export studies on homogeneous agricultural land uses in the USA under natural rainfall-runoff conditions and thus provides necessary data for modeling and decision-making related to agricultural runoff. The current version can be downloaded at http://www.ars.usda.gov/spa/manage-nutrient.

  17. GDR (Genome Database for Rosaceae: integrated web resources for Rosaceae genomics and genetics research

    Ficklin Stephen

    2004-09-01

    Background: Peach is being developed as a model organism for Rosaceae, an economically important family that includes fruits and ornamental plants such as apple, pear, strawberry, cherry, almond and rose. The genomics and genetics data of peach can play a significant role in the gene discovery and the genetic understanding of related species. The effective utilization of these peach resources, however, requires the development of an integrated and centralized database with associated analysis tools. Description: The Genome Database for Rosaceae (GDR) is a curated and integrated web-based relational database. GDR contains comprehensive data of the genetically anchored peach physical map, an annotated peach EST database, Rosaceae maps and markers and all publicly available Rosaceae sequences. Annotations of ESTs include contig assembly, putative function, simple sequence repeats, and anchored position to the peach physical map where applicable. Our integrated map viewer provides a graphical interface to the genetic, transcriptome and physical mapping information. ESTs, BACs and markers can be queried by various categories and the search result sites are linked to the integrated map viewer or to the WebFPC physical map sites. In addition to browsing and querying the database, users can compare their sequences with the annotated GDR sequences via a dedicated sequence similarity server running either the BLAST or FASTA algorithm. To demonstrate the utility of the integrated and fully annotated database and analysis tools, we describe a case study where we anchored Rosaceae sequences to the peach physical and genetic map by sequence similarity. Conclusions: The GDR has been initiated to meet the major deficiency in Rosaceae genomics and genetics research, namely a centralized web database and bioinformatics tools for data storage, analysis and exchange. GDR can be accessed at http://www.genome.clemson.edu/gdr/.

  18. GDR (Genome Database for Rosaceae): integrated web resources for Rosaceae genomics and genetics research.

    Jung, Sook; Jesudurai, Christopher; Staton, Margaret; Du, Zhidian; Ficklin, Stephen; Cho, Ilhyung; Abbott, Albert; Tomkins, Jeffrey; Main, Dorrie

    2004-09-09

    Peach is being developed as a model organism for Rosaceae, an economically important family that includes fruits and ornamental plants such as apple, pear, strawberry, cherry, almond and rose. The genomics and genetics data of peach can play a significant role in the gene discovery and the genetic understanding of related species. The effective utilization of these peach resources, however, requires the development of an integrated and centralized database with associated analysis tools. The Genome Database for Rosaceae (GDR) is a curated and integrated web-based relational database. GDR contains comprehensive data of the genetically anchored peach physical map, an annotated peach EST database, Rosaceae maps and markers and all publicly available Rosaceae sequences. Annotations of ESTs include contig assembly, putative function, simple sequence repeats, and anchored position to the peach physical map where applicable. Our integrated map viewer provides graphical interface to the genetic, transcriptome and physical mapping information. ESTs, BACs and markers can be queried by various categories and the search result sites are linked to the integrated map viewer or to the WebFPC physical map sites. In addition to browsing and querying the database, users can compare their sequences with the annotated GDR sequences via a dedicated sequence similarity server running either the BLAST or FASTA algorithm. To demonstrate the utility of the integrated and fully annotated database and analysis tools, we describe a case study where we anchored Rosaceae sequences to the peach physical and genetic map by sequence similarity. The GDR has been initiated to meet the major deficiency in Rosaceae genomics and genetics research, namely a centralized web database and bioinformatics tools for data storage, analysis and exchange. GDR can be accessed at http://www.genome.clemson.edu/gdr/.

  19. Drug residues in urban water: A database for ecotoxicological risk management.

    Destrieux, Doriane; Laurent, François; Budzinski, Hélène; Pedelucq, Julie; Vervier, Philippe; Gerino, Magali

    2017-12-31

    Human-use drug residues (DR) are only partially eliminated by waste water treatment plants (WWTPs), so that residual amounts can reach natural waters and cause environmental hazards. In order to properly manage these hazards in the aquatic environment, a database is made available that integrates the concentration ranges of DR that cause adverse effects in aquatic organisms, along with the temporal variations of the ecotoxicological risks. To implement this database for ecotoxicological risk assessment (ERA database), the required information for each DR is the predicted no effect concentration (PNEC), along with the predicted environmental concentration (PEC). The risk assessment is based on the ratio between the PNECs and the PECs. Adverse effect data or PNECs have been found in the publicly available literature for 45 substances. These ecotoxicity test data have been extracted from 125 different sources. The ERA database contains 1157 adverse effect data and 287 PNECs. The efficiency of the ERA database was tested with a data set coming from a simultaneous survey of WWTPs and the natural environment. In this data set, 26 DR were searched for in two WWTPs and in the river. On five sampling dates, concentrations measured in the river for 10 DR could pose environmental problems, of which 7 were measured only downstream of WWTP outlets. Compiling data from the scientific literature and from measurements into a single database, with unit homogenisation, facilitates the actual ecotoxicological risk assessment and may be useful for assessing further risks from data arising from future field surveys. Moreover, the accumulation of a large ecotoxicity data set in a single database should not only improve knowledge of higher-risk molecules but also supply an objective tool to help the rapid and efficient evaluation of the risk. Copyright © 2017 Elsevier B.V. All rights reserved.
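
    The record states that the risk assessment rests on the ratio between the PNEC and the PEC. A common way to write this, taken here as an assumption since the record does not spell out the formula, is the risk quotient below; whether the ratio is written as PEC/PNEC or its inverse, the threshold value of 1 plays the same role.

```latex
% Risk quotient as commonly defined in ecotoxicological risk assessment
% (assumed formulation; the record states only that a PEC/PNEC ratio is used).
\[
  \mathrm{RQ} \;=\; \frac{\mathrm{PEC}}{\mathrm{PNEC}},
  \qquad \mathrm{RQ} \ge 1 \;\Rightarrow\; \text{potential ecotoxicological risk}
\]
```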

  20. A Spatio-Temporal Building Exposure Database and Information Life-Cycle Management Solution

    Marc Wieland

    2017-04-01

    With an ever-increasing volume and complexity of data collected from a variety of sources, the efficient management of geospatial information becomes a key topic in disaster risk management. For example, the representation of assets exposed to natural disasters is subject to changes throughout the different phases of risk management, ranging from pre-disaster mitigation to the response after an event and the long-term recovery of affected assets. Spatio-temporal changes need to be integrated into a sound conceptual and technological framework able to deal with data coming from different sources, at varying scales, and changing in space and time. In particular, managing the information life-cycle, integrating heterogeneous information, and the distributed versioning and release of geospatial information are important topics that need to become essential parts of modern exposure modelling solutions. The main purpose of this study is to provide a conceptual and technological framework to tackle the requirements implied by disaster risk management for describing exposed assets in space and time. An information life-cycle management solution is proposed, based on a relational spatio-temporal database model coupled with Git and GeoGig repositories for distributed versioning. Two application scenarios focusing on the modelling of residential building stocks are presented to show the capabilities of the implemented solution. A prototype database model is shared on GitHub along with the necessary scenario data.
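
    The record couples a relational spatio-temporal database with Git/GeoGig repositories for distributed versioning. As a minimal sketch of the spatio-temporal part only, the snippet below models a building table whose rows carry a validity interval, so that the state of the exposure model can be reconstructed for any point in time; the schema and column names are assumptions for illustration, not the authors' prototype.

```python
# Hedged sketch: a building-exposure table with per-row validity intervals,
# so the stock can be queried "as of" a given date. Names are assumptions.
import sqlite3

conn = sqlite3.connect("exposure.db")
cur = conn.cursor()
cur.execute(
    """
    CREATE TABLE IF NOT EXISTS building_version (
        building_id TEXT NOT NULL,
        occupancy TEXT,                -- e.g. residential
        structural_type TEXT,          -- e.g. masonry, RC frame
        valid_from TEXT NOT NULL,      -- ISO date the record became valid
        valid_to TEXT                  -- NULL while the record is current
    );
    """
)
# Reconstruct the exposure model as it stood on a given date.
as_of = "2016-06-01"
cur.execute(
    """
    SELECT building_id, occupancy, structural_type
    FROM building_version
    WHERE valid_from <= ? AND (valid_to IS NULL OR valid_to > ?);
    """,
    (as_of, as_of),
)
print(cur.fetchall())
conn.close()
```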

  1. Air Traffic Management Research at NASA Ames

    Davis, Thomas J.

    2012-01-01

    The Aviation Systems Division at the NASA Ames Research Center conducts leading edge research in air traffic management concepts and technologies. This overview will present concepts and simulation results for research in traffic flow management, safe and efficient airport surface operations, super density terminal area operations, separation assurance and system wide modeling and simulation. A brief review of the ongoing air traffic management technology demonstration (ATD-1) will also be presented. A panel discussion, with Mr. Davis serving as a panelist, on air traffic research will follow the briefing.

  2. Preliminary study for unified management of CANDU safety codes and construction of database system

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

    A graphical user interface (GUI) for the unified management of CANDU safety codes needs to be developed, and a database system needs to be constructed for the validation of the safety codes; a preliminary study of both is done in the first stage of the present work. The input and output structures and the data flow of CATHENA and PRESCON2 are investigated, and the interactions of the variables between CATHENA and PRESCON2 are identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their calculation results with those from an HP workstation or from the FSAR (Final Safety Analysis Report). A preliminary study on the GUI for the safety codes in the unified management system is done, and a sample of GUI programming is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments carried out inside and outside the country are collected and classified following the structure of the database system, for which two types are considered for the final web-based database system. The preliminary GUI programming for the database system is demonstrated and will be updated in future work

  3. The FP4026 Research Database on the fundamental period of RC infilled frame structures.

    Asteris, Panagiotis G

    2016-12-01

    The fundamental period of vibration appears to be one of the most critical parameters for the seismic design of buildings because it strongly affects the destructive impact of the seismic forces. In this article, important research data (entitled FP4026 Research Database (Fundamental Period-4026 cases of infilled frames)) based on detailed and in-depth analytical research on the fundamental period of reinforced concrete structures is presented. In particular, the values of the fundamental period which have been analytically determined are presented, taking into account the majority of the involved parameters. This database can be extremely valuable for the development of new code proposals for the estimation of the fundamental period of reinforced concrete structures fully or partially infilled with masonry walls.

  4. The FP4026 Research Database on the fundamental period of RC infilled frame structures

    Panagiotis G. Asteris

    2016-12-01

    The fundamental period of vibration appears to be one of the most critical parameters for the seismic design of buildings because it strongly affects the destructive impact of the seismic forces. In this article, important research data (entitled FP4026 Research Database (Fundamental Period-4026 cases of infilled frames)) based on detailed and in-depth analytical research on the fundamental period of reinforced concrete structures is presented. In particular, the values of the fundamental period which have been analytically determined are presented, taking into account the majority of the involved parameters. This database can be extremely valuable for the development of new code proposals for the estimation of the fundamental period of reinforced concrete structures fully or partially infilled with masonry walls.

  5. Accreditation to manage research programs

    Miramand, Pierre

    1993-01-01

    In this report for an accreditation to supervise research, the author gives an overview of a study of the transfer of vanadium to benthic organisms (i.e. the toxicity of vanadium for coastal marine organisms) and of studies of the transfer of transuranic elements from sediment to marine benthic species. He presents current research and perspectives: a study of the levels of metallic pollutants and the physical-chemical characteristics of coastal waters in northern Cotentin, research in the Seine Bay, and a study of biological indicators of pollution. Numerous articles are provided in the appendix.

  6. Astronomy Education Research Observations from the iSTAR international Study of Astronomical Reasoning Database

    Tatge, C. B.; Slater, S. J.; Slater, T. F.; Schleigh, S.; McKinnon, D.

    2016-12-01

    Historically, an important part of the scientific research cycle is to situate any research project within the landscape of the existing scientific literature. In the field of discipline-based astronomy education research, grappling with the existing literature base has proven difficult because of the challenge of obtaining research reports from around the world, particularly early ones. In order to better survey and efficiently utilize the wide and fractured range and domain of astronomy education research methods and results, the iSTAR international Study of Astronomical Reasoning database project was initiated. The project aims to host a living, online repository of dissertations, theses, journal articles, and grey literature resources to serve the world's discipline-based astronomy education research community. The first domain of research artifacts ingested into the iSTAR database was doctoral dissertations. To the authors' great surprise, nearly 300 astronomy education research dissertations were found from the last 100 years. Few, if any, of the literature reviews from recent astronomy education dissertations surveyed even come close to summarizing this many dissertations, most of which have not been published in traditional journals, as re-publishing one's dissertation research as a journal article was not a widespread custom in the education research community until recently. A survey of the iSTAR database dissertations reveals that the vast majority of work was largely quantitative in nature until the last decade. We also observe that modern-era astronomy education research writing reaches as far back as 1923 and that the majority of dissertations come from the same eight institutions. Moreover, most of the astronomy education research work has been done covering learners' grasp of broad knowledge of astronomy rather than delving into specific learning targets, which has been more in vogue during the last two decades. The surprisingly wide breadth

  7. Respiratory infections research in Afghanistan: bibliometric analysis with the database PubMed

    Pilsezek, F.H.

    2015-01-01

    Infectious diseases research in a low-income country like Afghanistan is important. Methods: In this study the Internet-based database PubMed was used for bibliometric analysis of infectious diseases research activity. Research publication entries in PubMed were analysed according to number of publications, topic, publication type, and country of investigators. Results: Between 2002-2011, 226 (77.7%) publications with the following research topics were identified: respiratory infections 3 (1.3%); parasites 8 (3.5%); diarrhoea 10 (4.4%); tuberculosis 10 (4.4%); human immunodeficiency virus (HIV) 11 (4.9%); multi-drug resistant bacteria (MDR) 18 (8.0%); polio 31 (13.7%); leishmania 31 (13.7%); malaria 46 (20.4%). From 2002-2011, 11 (4.9%) publications were basic science laboratory-based research studies. Between 2002-2011, 8 (3.5%) publications from Afghan institutions were identified. Conclusion: The Internet-based database PubMed can be consulted to collect data for guidance of infectious diseases research activity in low-income countries. The presented data suggest that infectious diseases research in Afghanistan is limited with respect to respiratory infections, includes few studies conducted by Afghan institutions, and has limited laboratory-based research contributions. (author)

  8. RESPIRATORY INFECTIONS RESEARCH IN AFGHANISTAN: BIBLIOMETRIC ANALYSIS WITH THE DATABASE PUBMED.

    Pilsczek, Florian H

    2015-01-01

    Infectious diseases research in a low-income country like Afghanistan is important. In this study the Internet-based database PubMed was used for bibliometric analysis of infectious diseases research activity. Research publication entries in PubMed were analysed according to number of publications, topic, publication type, and country of investigators. Between 2002-2011, 226 (77.7%) publications with the following research topics were identified: respiratory infections 3 (1.3%); parasites 8 (3.5%); diarrhoea 10 (4.4%); tuberculosis 10 (4.4%); human immunodeficiency virus (HIV) 11 (4.9%); multi-drug resistant bacteria (MDR) 18 (8.0%); polio 31 (13.7%); leishmania 31 (13.7%); malaria 46 (20.4%). From 2002-2011, 11 (4.9%) publications were basic science laboratory-based research studies. Between 2002-2011, 8 (3.5%) publications from Afghan institutions were identified. In conclusion, the Internet-based database PubMed can be consulted to collect data for guidance of infectious diseases research activity in low-income countries. The presented data suggest that infectious diseases research in Afghanistan is limited with respect to respiratory infections, includes few studies conducted by Afghan institutions, and has limited laboratory-based research contributions.

  9. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values for approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.

  10. The Kepler DB: a database management system for arrays, sparse arrays, and binary data

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-07-01

    The Kepler Science Operations Center stores pixel values for approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database management system (Kepler DB) was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.
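
    As an illustration of the array-keyed storage model described in these two abstracts (one-dimensional series held per pixel rather than as relational rows), the following is a minimal sketch; the ArrayStore class, its methods and the pixel key are hypothetical and do not reflect the actual Kepler DB API.

```python
import numpy as np

class ArrayStore:
    """Minimal sketch of a non-relational array store: each key maps to a
    one-dimensional time series, mimicking per-pixel storage (hypothetical
    interface, not the Kepler DB API)."""

    def __init__(self):
        self._data = {}

    def append(self, key, values):
        # Append a new batch of cadence values to the series held under `key`.
        existing = self._data.get(key, np.empty(0))
        self._data[key] = np.concatenate([existing, np.asarray(values, dtype=float)])

    def read(self, key, start=0, stop=None):
        # Return a slice of one stored series without touching other keys.
        return self._data[key][start:stop]

store = ArrayStore()
store.append("pixel_000042", [101.3, 99.8, 100.4])   # one batch of 30-minute cadences
store.append("pixel_000042", [100.9, 101.1])
print(store.read("pixel_000042"))
```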

  11. Managing science developing your research, leadership and management skills

    Peach, Ken

    2017-01-01

    Managing science, which includes managing scientific research and, implicitly, managing scientists, has much in common with managing any enterprise, and most of these issues (e.g. annual budget planning and reporting) form the background. Equally, much scientific research is carried out in universities ancient and modern, which have their own mores, ranging from professorial autocracy to democratic plurality, as well as in national and international institutions with their own missions and styles. But science has issues that require a somewhat different approach if it is to prosper and succeed. Society now expects science, whether publicly or privately funded, to deliver benefits, yet the definition of science presumes no such benefit. Managing the expectations of the scientist with those of society is the challenge of the manager of science. The book addresses some issues around science and the organizations that do science. It then deals with leadership, management and communication, team building, recruitment, motivation, managin...

  12. Database system for management of health physics and industrial hygiene records

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection

  13. Modified Delphi study to determine optimal data elements for inclusion in an emergency management database system

    A. Jabar

    2012-03-01

    Conclusion: The use of a modified Expert Delphi study achieved consensus on aspects of hospital institutional capacity that can be translated into practical recommendations for implementation in the local emergency management database system. Additionally, areas of non-consensus have been identified where further work is required. The purpose of this study is to contribute to and aid in the development of this new system.

  14. Experience of MAPS in monitoring of personnel movement with on-line database management system

    Rajendran, T.S.; Anand, S.D.

    1992-01-01

    As a part of the physical protection system, an access control system has been installed in Madras Atomic Power Station (MAPS) to monitor and regulate the movement of persons within MAPS. The present system in its original form was meant only for security monitoring. A PC-based database management system was added to this to computerize the availability of the workforce for actual work. (author). 2 annexures

  15. IAEA Coordinated Research Project on the Establishment of a Material Properties Database for Irradiated Core Structural Components for Continued Safe Operation and Lifetime Extension of Ageing Research Reactors

    Borio Di Tigliole, A.; Schaaf, Van Der; Barnea, Y.; Bradley, E.; Morris, C.; Rao, D. V. H. [Research Reactor Section, Vienna (Austria); Shokr, A. [Research Reactor Safety Section, Vienna (Austria); Zeman, A. [International Atomic Energy Agency, Vienna (Austria)

    2013-07-01

    Today more than 50% of operating Research Reactors (RRs) are over 45 years old. Thus, ageing management is one of the most important issues to face in order to ensure the availability (including life extension), reliability and safe operation of these facilities in the future. Management of the ageing process requires, amongst others, predictions of the behavior of the structural materials of primary components subjected to irradiation, such as the reactor vessel and core support structures, many of which are extremely difficult or impossible to replace. In fact, age-related material degradation mechanisms have resulted in high-profile, unplanned and lengthy shutdowns and unique regulatory processes of relicensing the facilities in recent years. These could likely have been prevented by utilizing available data for the implementation of appropriate maintenance and surveillance programmes. This IAEA Coordinated Research Project (CRP) will provide an international forum to establish a material properties Database for irradiated core structural materials and components. It is expected that this Database will be used by research reactor operators and regulators to help predict ageing-related degradation. This would be useful to minimize unpredicted outages due to ageing processes of primary components and to mitigate lengthy and costly shutdowns. The Database will be a compilation of data from RR operators' inputs, comprehensive literature reviews and experimental data from RRs. Moreover, the CRP will specify further activities needed to bridge the gaps in the newly created Database, for potential follow-on activities. As of today, 13 Member States (MS) have confirmed their agreement to contribute to the development of the Database, covering a wide range of materials and properties. The present publication incorporates two parts: the first part includes details on the pre-CRP Questionnaire, including the conclusions drawn from the answers received from

  16. IAEA Coordinated Research Project on the Establishment of a Material Properties Database for Irradiated Core Structural Components for Continued Safe Operation and Lifetime Extension of Ageing Research Reactors

    Borio Di Tigliole, A.; Schaaf, Van Der; Barnea, Y.; Bradley, E.; Morris, C.; Rao, D. V. H.; Shokr, A.; Zeman, A.

    2013-01-01

    Today more than 50% of operating Research Reactors (RRs) are over 45 years old. Thus, ageing management is one of the most important issues to face in order to ensure the availability (including life extension), reliability and safe operation of these facilities in the future. Management of the ageing process requires, amongst others, predictions of the behavior of the structural materials of primary components subjected to irradiation, such as the reactor vessel and core support structures, many of which are extremely difficult or impossible to replace. In fact, age-related material degradation mechanisms have resulted in high-profile, unplanned and lengthy shutdowns and unique regulatory processes of relicensing the facilities in recent years. These could likely have been prevented by utilizing available data for the implementation of appropriate maintenance and surveillance programmes. This IAEA Coordinated Research Project (CRP) will provide an international forum to establish a material properties Database for irradiated core structural materials and components. It is expected that this Database will be used by research reactor operators and regulators to help predict ageing-related degradation. This would be useful to minimize unpredicted outages due to ageing processes of primary components and to mitigate lengthy and costly shutdowns. The Database will be a compilation of data from RR operators' inputs, comprehensive literature reviews and experimental data from RRs. Moreover, the CRP will specify further activities needed to bridge the gaps in the newly created Database, for potential follow-on activities. As of today, 13 Member States (MS) have confirmed their agreement to contribute to the development of the Database, covering a wide range of materials and properties. The present publication incorporates two parts: the first part includes details on the pre-CRP Questionnaire, including the conclusions drawn from the answers received from the MS

  17. MIPS PlantsDB: a database framework for comparative plant genome research.

    Nussbaumer, Thomas; Martis, Mihaela M; Roessner, Stephan K; Pfeifer, Matthias; Bader, Kai C; Sharma, Sapna; Gundlach, Heidrun; Spannagl, Manuel

    2013-01-01

    The rapidly increasing amount of plant genome (sequence) data enables powerful comparative analyses and integrative approaches and also requires structured and comprehensive information resources. Databases are needed for both model and crop plant organisms, and both intuitive search/browse views and comparative genomics tools should communicate the data to researchers and help them interpret it. MIPS PlantsDB (http://mips.helmholtz-muenchen.de/plant/genomes.jsp) was initially described in NAR in 2007 [Spannagl, M., Noubibou, O., Haase, D., Yang, L., Gundlach, H., Hindemitt, T., Klee, K., Haberer, G., Schoof, H. and Mayer, K.F. (2007) MIPSPlantsDB-plant database resource for integrative and comparative plant genome research. Nucleic Acids Res., 35, D834-D840] and was set up from the start to provide data and information resources for individual plant species as well as a framework for integrative and comparative plant genome research. PlantsDB comprises database instances for tomato, Medicago, Arabidopsis, Brachypodium, Sorghum, maize, rice, barley and wheat. Building on that, state-of-the-art comparative genomics tools such as CrowsNest are integrated to visualize and investigate syntenic relationships between monocot genomes. Results from novel genome analysis strategies targeting the complex and repetitive genomes of Triticeae species (wheat and barley) are provided and cross-linked with model species. The MIPS Repeat Element Database (mips-REdat) and Catalog (mips-REcat), as well as tight connections to other databases, e.g. via web services, are further important components of PlantsDB.

  18. Establishing the user requirements for the research reactor decommissioning database system

    Park, S. K.; Park, H. S.; Lee, G. W.; Park, J. H.

    2002-01-01

    In general, a great deal of information and data is generated during decommissioning activities, and a systematic electronic system is needed for its management. A database system for managing the decommissioning information and data from the KRR-1 and 2 decommissioning project is now being developed. All information and data will be entered into this database system and retrieved from it. For the development of the DB system, the basic concept and user requirements were established, and a system for categorizing the information and data was then set up. The entities of the tables for data input were identified and categorized and then converted into codes. An ERD (Entity Relationship Diagram) was also drawn up to show their relations. To develop the user interface system for data retrieval, the relation between the input and output data should be analyzed. Through this study, as a result, the items of the output tables were established and categorized according to the requirements of the user interface system for the decommissioning information and data. These tables will be used for designing the prototype and will be refined through several rounds of feedback in establishing the decommissioning database system.

  19. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model, utilizing Structured Query Language (SQL), with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
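
    To make the context-and-content search idea concrete, here is a minimal sketch in Python with SQLite (not the NETMARK/Oracle implementation) of flattening an XML document into rows so that keywords can be matched against both the element path ("context") and the text ("content"); the table, tags and sample document are invented for illustration.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical table: one row per text-bearing XML node.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nodes (doc TEXT, context TEXT, content TEXT)")

def ingest(doc_name, xml_text):
    """Walk the XML tree and store (document, element path, text) rows."""
    root = ET.fromstring(xml_text)
    def walk(elem, path):
        here = f"{path}/{elem.tag}"
        if elem.text and elem.text.strip():
            conn.execute("INSERT INTO nodes VALUES (?, ?, ?)",
                         (doc_name, here, elem.text.strip()))
        for child in elem:
            walk(child, here)
    walk(root, "")

ingest("report-1", "<report><title>Thermal test</title>"
                   "<section><p>Results were nominal.</p></section></report>")

# Keyword search across both context (element path) and content (text).
rows = conn.execute(
    "SELECT doc, context, content FROM nodes "
    "WHERE context LIKE ? OR content LIKE ?", ("%title%", "%nominal%")).fetchall()
print(rows)
```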

  20. SNPpy--database management for SNP data from genome wide association studies.

    Faheem Mitha

    Full Text Available BACKGROUND: We describe SNPpy, a hybrid script database system using the Python SQLAlchemy library coupled with the PostgreSQL database to manage genotype data from Genome-Wide Association Studies (GWAS). This system makes it possible to merge study data with HapMap data and to merge across studies for meta-analyses, including data filtering based on the values of phenotype and Single-Nucleotide Polymorphism (SNP) data. SNPpy and its dependencies are open source software. RESULTS: The current version of SNPpy offers utility functions to import genotype and annotation data from two commercial platforms. We use these to import data from two GWAS studies and the HapMap Project. We then export these individual datasets to standard data format files that can be imported into statistical software for downstream analyses. CONCLUSIONS: By leveraging the power of relational databases, SNPpy offers integrated management and manipulation of genotype and phenotype data from GWAS studies. The analysis of these studies requires merging across GWAS datasets as well as patient and marker selection. To this end, SNPpy enables the user to filter the data and output the results as standardized GWAS file formats. It performs low-level and flexible data validation, including validation of patient data. SNPpy is a practical and extensible solution for investigators who seek to deploy central management of their GWAS data.
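
    The abstract's combination of an ORM layer with relational filtering can be sketched as follows; this is not SNPpy's actual schema or API, the table and column names are hypothetical, SQLite stands in for PostgreSQL to keep the example self-contained, and SQLAlchemy 1.4 or later is assumed.

```python
from sqlalchemy import (Column, Integer, String, ForeignKey,
                        create_engine, select)
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Patient(Base):
    __tablename__ = "patient"
    id = Column(Integer, primary_key=True)
    phenotype = Column(String)          # e.g. "case" / "control"

class Genotype(Base):
    __tablename__ = "genotype"
    id = Column(Integer, primary_key=True)
    patient_id = Column(Integer, ForeignKey("patient.id"))
    snp = Column(String)                # marker name, e.g. "rs123"
    call = Column(String)               # e.g. "AA", "AG", "GG"

# SQLite in memory stands in for PostgreSQL in this sketch.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    p = Patient(phenotype="case")
    session.add(p)
    session.flush()                     # assigns p.id
    session.add(Genotype(patient_id=p.id, snp="rs123", call="AG"))
    session.commit()

    # Filter genotypes by phenotype and marker, as a GWAS export step might.
    stmt = (select(Genotype.snp, Genotype.call)
            .join(Patient, Genotype.patient_id == Patient.id)
            .where(Patient.phenotype == "case", Genotype.snp == "rs123"))
    print(session.execute(stmt).all())
```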

  1. Towards an international authoritative system for coordination and management of a unique recommended k0-NAA database

    De Corte, F.

    2010-01-01

    This paper describes the evolution of the database in k0-standardized neutron activation analysis (k0-NAA), ranging from its full supervision by the founders of the k0-method at the Institute for Nuclear Sciences (INW)/Gent and the Central Research Institute for Physics (KFKI)/Budapest (from about the mid 1970s up to the early 1990s), to the present situation (roughly speaking, starting with the first k0 Users Workshop in 1992) where an increasing number of researchers from institutes all over the world are reporting on experimental work aiming at the improvement and extension of the existing database. Although these individual contributions are undoubtedly commendable, the resulting fragmentary data sets leave behind important questions with respect to interpretation, evaluation, integration and recommendation, as illustrated with the (extreme) example of 131Ba. This situation urgently calls for establishing and managing an international authoritative system for the coordination and quality control of a unique database with recommended data for k0-NAA, considering such parameters as accuracy, traceability and consistency. In the present paper, it is proposed to entrust this task to a standing 'Reference k0-Data Subcommittee' of the k0-ISC (k0 International Scientific Committee).

  2. Relating Performative and Ostensive Management Accounting Research

    Hansen, Allan

    2011-01-01

    Findings – The paper illustrates how the process is a balancing act. On the one hand, it requires performative researchers to relate more closely to aspects decisive for ostensive researchers; yet, on the other, they need to preserve the distinctiveness of the performative approach. Originality/value – This paper exemplifies these issues with reference to management accounting research and contributes by clarifying the methodological implications of moving performative research closer to ostensive research.

  3. Information Management in Creative Engineering Design and Capabilities of Database Transactions

    Jacobsen, Kim; Eastman, C. A.; Jeng, T. S.

    1997-01-01

    This paper examines the information management requirements and sets forth the general criteria for collaboration and concurrency control in creative engineering design. Our work attempts to recognize the full range of concurrency, collaboration and complex transaction structures now practiced in manual and semi-automated design, and the range of capabilities needed as the demands for enhanced but flexible electronic information management unfold. The objective of this paper is to identify new issues that may advance the use of databases to support creative engineering design. We start with a generalized description of the structure of design tasks and how information management in design is dealt with today. After this review, we identify extensions to current information management capabilities that have been realized and/or proposed to support or augment what designers can do now. Given...

  4. Positioning soundscape research and management

    Andringa, Tjeerd C.; Weber, Miriam; Payne, Sarah R.; Krijnders, J. D. (Dirkjan); Dixon, Maxwell N.; v. d. Linden, Robert; de Kock, Eveline G. L.; Lanser, J. Jolie L.

    2013-01-01

    This paper is an outcome of a workshop that addressed the question of how soundscape research can improve its impact at the local level. It addresses a number of topics by complementing existing approaches and practices with possible future approaches and practices. The paper starts with an analysis of

  5. Preliminary Study on Management of Agricultural Scientific Research Projects in the New Situation

    Haiyan LUO; Qingqun YAO; Lizhen CHEN; Yu ZHENG

    2015-01-01

    Project management in agricultural scientific research institutions is an important part of agricultural research plan management and is of great significance for the sustainable development of the research work of these institutions. Based on a series of opinions and notices about scientific and technological system reform issued by the state, and considering the current situation of research project management in such institutions, this paper makes a preliminary study on the management of agricultural scientific research projects under the new situation. Finally, on the basis of the current situation of agricultural research project management, it puts forward pertinent recommendations, including strengthening communication and cooperation and actively applying for projects, strengthening the preliminary planning of projects and establishing a project information database, reinforcing project process management to ensure on-time and high-quality completion of projects, and strengthening learning to improve the quality of management personnel.

  6. The Net Enabled Waste Management Database in the context of an indicator of sustainable development for radioactive waste management

    Csullog, G.W.; Selling, H.; Holmes, R.; Benitez, J.C.

    2002-01-01

    The IAEA was selected by the UN to be the lead agency for the development and implementation of indicators of sustainable development for radioactive waste management (ISD-RW). Starting in late 1999, the UN initiated a program to consolidate a large number of indicators into a smaller set and advised the IAEA that a single ISD-RW was needed. In September 2001, a single indicator was developed by the IAEA and subsequently revised in February 2002. In parallel with its work on the ISD-RW, the IAEA developed and implemented the Net Enabled Waste Management Database (NEWMDB). The NEWMDB is an international database to collect, compile and disseminate information about nationally-based radioactive waste management programmes and waste inventories. The first data collection cycle with the NEWMDB (July 2001 to March 2002) demonstrated that much of the information needed to calculate the ISD-RW could be collected by the IAEA for its international database. However, the first data collection cycle indicated that capacity building, in the area of identifying waste classification schemes used in countries, is required. (author)

  7. A Graduate Class in Research Data Management

    Schmidt, Lawrence; Holles, Joseph

    2018-01-01

    A graduate elective course in Research Data Management (RDM) was developed and taught as a team by a research librarian and a research-active faculty member. Co-teaching allowed each instructor to contribute knowledge in their specialty areas. The goal of this course was to provide graduate students the RDM knowledge necessary to efficiently and…

  8. Managing Research Libraries in Developing Economy | Ahmed ...

    This paper discusses managing research libraries in a developing economy. The concepts of special libraries, the funding of research libraries, the need for training and retraining of library staff, and resource sharing and networking are highlighted. The paper recommends that research institutes need to re-order their priorities ...

  9. Development of the Database for Environmental Sound Research and Application (DESRA: Design, Functionality, and Retrieval Considerations

    Brian Gygi

    2010-01-01

    Full Text Available Theoretical and applied environmental sounds research is gaining prominence but progress has been hampered by the lack of a comprehensive, high quality, accessible database of environmental sounds. An ongoing project to develop such a resource is described, which is based upon experimental evidence as to the way we listen to sounds in the world. The database will include a large number of sounds produced by different sound sources, with a thorough background for each sound file, including experimentally obtained perceptual data. In this way DESRA can contain a wide variety of acoustic, contextual, semantic, and behavioral information related to an individual sound. It will be accessible on the Internet and will be useful to researchers, engineers, sound designers, and musicians.

  10. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 5

    2003-05-01

    The document consists of two parts: Overview and Country Waste Profile Reports for Reporting Year 2000. The first section contains overview reports that provide assessments of the achievements and shortcomings of the Net Enabled Waste Management Database (NEWMDB) during the first two data collection cycles (July 2001 to March 2002 and July 2002 to February 2003). The second part of the report includes a summary and compilation of waste management data submitted by Agency Member States in both the first and second data collection cycles

  11. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 9, May 2008

    2008-05-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2007

  12. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 8, August 2007

    2007-08-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2006

  13. Congestion Quantification Using the National Performance Management Research Data Set

    Virginia P. Sisiopiku

    2017-11-01

    Full Text Available Monitoring of transportation system performance is a key element of any transportation operation and planning strategy. Estimation of dependable performance measures relies on analysis of large amounts of traffic data, which are often expensive and difficult to gather. National databases can assist in this regard, but challenges still remain with respect to data management, accuracy, storage, and use for performance monitoring. In an effort to address such challenges, this paper showcases a process that utilizes the National Performance Management Research Data Set (NPMRDS) for generating performance measures for congestion monitoring applications in the Birmingham region. The capabilities of a relational database management system (RDBMS) are employed to manage the large amounts of NPMRDS data. Powerful visual maps are developed using GIS software and used to illustrate congestion location, extent and severity. Travel time reliability indices are calculated and utilized to quantify congestion, and congestion intensity measures are developed and employed to rank and prioritize congested segments in the study area. The process for managing and using big traffic data described in the Birmingham case study is a great example that can be replicated by small and mid-size Metropolitan Planning Organizations to generate performance-based measures and monitor congestion in their jurisdictions.
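
    As an illustration of the kind of travel time reliability index mentioned above, the sketch below computes a planning time index (95th-percentile travel time divided by free-flow travel time) per segment and ranks segments by it; the segment IDs and travel times are invented, and this is not necessarily the exact index or workflow used in the paper.

```python
import pandas as pd

# Invented probe observations; NPMRDS data would be read from files instead.
obs = pd.DataFrame({
    "segment": ["A", "A", "A", "B", "B", "B"],
    "travel_time_s": [120, 150, 240, 90, 95, 100],
})
free_flow = pd.Series({"A": 110.0, "B": 88.0}, name="free_flow_s")

# Planning time index: 95th-percentile travel time / free-flow travel time.
p95 = obs.groupby("segment")["travel_time_s"].quantile(0.95)
planning_time_index = (p95 / free_flow).rename("planning_time_index")

# Rank segments from most to least congested.
print(planning_time_index.sort_values(ascending=False))
```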

  14. Meta-analysis constrained by data: Recommendations to improve relevance of nutrient management research

    Five research teams received funding through the North American 4R Research Fund to conduct meta-analyses of the air and water quality impacts of on-farm 4R nutrient management practices. In compiling or expanding databases for these analyses on environmental and crop production effects, researchers...

  15. Third research coordination meeting on reference database for neutron activation analysis. Summary report

    Kellett, M.A.

    2009-12-01

    The third meeting of the Co-ordinated Research Project on 'Reference Database for Neutron Activation Analysis' was held at the IAEA, Vienna, from 17-19 November 2008. A summary is given of the presentations made by participants, of the reports on specific tasks and of the subsequent discussions. With the aim of finalising the work of this CRP and in order to meet its initial objectives, outputs were discussed and detailed task assignments were agreed upon. (author)

  16. CmMDb: a versatile database for Cucumis melo microsatellite markers and other horticulture crop research.

    Bhawna; Chaduvula, Pavan K; Bonthala, Venkata S; Manjusha, Verma; Siddiq, Ebrahimali A; Polumetla, Ananda K; Prasad, Gajula M N V

    2015-01-01

    Cucumis melo L., which belongs to the Cucurbitaceae family, ranks among the highest-valued horticultural crops cultivated across the globe. Besides its economic and medicinal importance, Cucumis melo L. is a valuable resource and model system for evolutionary studies of the cucurbit family. However, only a very limited number of molecular markers have been reported for Cucumis melo L. so far, which limits the pace of functional genomic research in melon and other similar horticultural crops. We developed the first whole-genome-based microsatellite DNA marker database of Cucumis melo L. and a comprehensive web resource that aids in variety identification and physical mapping of the Cucurbitaceae family. The Cucumis melo L. microsatellite database (CmMDb: http://65.181.125.102/cmmdb2/index.html) encompasses 39,072 SSR markers along with their motif repeat, motif length, motif sequence, marker ID, motif type and chromosomal locations. The database features a novel automated primer design facility to meet the needs of wet-lab researchers. CmMDb is a freely available web resource that helps researchers select the most appropriate markers for marker-assisted selection in melons and improve breeding strategies.
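
    A marker-selection query over fields like those listed above (marker ID, motif type, motif sequence, motif length, chromosomal location) can be sketched as follows; the schema, identifiers and rows are invented for illustration and are not the CmMDb schema.

```python
import sqlite3

# Hypothetical SSR marker table mirroring the fields named in the abstract.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ssr_marker (
    marker_id TEXT, chromosome TEXT, motif_type TEXT,
    motif_seq TEXT, motif_length INTEGER, position INTEGER)""")
conn.executemany("INSERT INTO ssr_marker VALUES (?, ?, ?, ?, ?, ?)", [
    ("CM-SSR-0001", "chr01", "di",  "AT",  12, 10542),
    ("CM-SSR-0002", "chr01", "tri", "GAA",  7, 98411),
    ("CM-SSR-0003", "chr05", "di",  "AG",  15, 20417),
])

# Select trinucleotide markers on chromosome 1, as a breeder might.
rows = conn.execute(
    "SELECT marker_id, motif_seq, position FROM ssr_marker "
    "WHERE chromosome = 'chr01' AND motif_type = 'tri'").fetchall()
print(rows)
```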

  17. Validating the extract, transform, load process used to populate a large clinical research database.

    Denney, Michael J; Long, Dustin M; Armistead, Matthew G; Anderson, Jamie L; Conway, Baqiyyah N

    2016-10-01

    Informaticians at any institution developing clinical research support infrastructure are tasked with populating research databases with data extracted and transformed from their institution's operational databases, such as electronic health records (EHRs). These data must be properly extracted from the source systems, transformed into a standard data structure, and then loaded into the data warehouse while maintaining the integrity of the data. We validated the correctness of the extract, transform, and load (ETL) process applied to the extracted data of West Virginia Clinical and Translational Science Institute's Integrated Data Repository, a clinical data warehouse that includes data extracted from two EHR systems. Four hundred ninety-eight observations were randomly selected from the integrated data repository and compared with the two source EHR systems. Of the 498 observations, 479 were concordant and 19 were discordant. The discordant observations fell into three general categories: a) design decision differences between the IDR and source EHRs, b) timing differences, and c) user interface settings. After resolving apparent discordances, our integrated data repository was found to be 100% accurate relative to its source EHR systems. Any institution that uses a clinical data warehouse that is developed based on extraction processes from operational databases, such as EHRs, employs some form of an ETL process. As secondary use of EHR data begins to transform the research landscape, the importance of basic validation of the extracted EHR data cannot be overstated and should start with the validation of the extraction process itself. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
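
    The random-sample validation step described above can be sketched as follows; the record keys, values, and the two in-memory dictionaries standing in for the warehouse and the source EHR are all invented for illustration.

```python
import random

# Hypothetical stand-ins: (patient, observation) -> value in the warehouse
# and in the source system it was extracted from.
warehouse = {("pt01", "hba1c"): 6.8, ("pt02", "hba1c"): 7.4, ("pt03", "sbp"): 131}
source    = {("pt01", "hba1c"): 6.8, ("pt02", "hba1c"): 7.5, ("pt03", "sbp"): 131}

# Draw a random sample of warehouse observations and check each against source.
sample = random.sample(sorted(warehouse), k=3)
concordant = [k for k in sample if warehouse[k] == source.get(k)]
discordant = [k for k in sample if warehouse[k] != source.get(k)]

print(f"{len(concordant)} concordant, {len(discordant)} discordant")
for key in discordant:
    # Discordant observations go to manual review, e.g. to classify the cause.
    print("review:", key, warehouse[key], "vs", source.get(key))
```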

  18. A European Flood Database: facilitating comprehensive flood research beyond administrative boundaries

    J. Hall

    2015-06-01

    Full Text Available The current work addresses one of the key building blocks towards an improved understanding of flood processes and associated changes in flood characteristics and regimes in Europe: the development of a comprehensive, extensive European flood database. The presented work results from ongoing cross-border research collaborations initiated with data collection and joint interpretation in mind. A detailed account of the current state, characteristics and spatial and temporal coverage of the European Flood Database is presented. At this stage, the hydrological data collection is still growing and consists of annual maximum and daily mean discharge series from over 7000 hydrometric stations with various data series lengths. Moreover, the database currently comprises data from over 50 different data sources. The time series have been obtained from different national and regional data sources in a collaborative effort of a joint European flood research agreement based on the exchange of data, models and expertise, and from existing international data collections and open source websites. These ongoing efforts are contributing to advancing the understanding of regional flood processes beyond individual country boundaries and to a more coherent flood research in Europe.

  19. An attempt to develop a database for epidemiological research in Semipalatinsk

    Katayama, Hiroaki; Apsalikov, K.N.; Gusev, B.I.; Galich, B.; Madieva, M.; Koshpessova, G.; Abdikarimova, A.; Hoshi, Masaharu

    2006-01-01

    The present paper reports progress and problems in our development of a database for comprehensive epidemiological research in Semipalatinsk, whose ultimate aim is to examine the effects of low dose radiation exposure on the human body. The database was constructed and set up at the Scientific Research Institute of Radiation Medicine and Ecology in 2003, and the number of data entries in the database reached 110,000 on 31 January 2005. However, we face some problems concerning the size, accuracy and reliability of the data which hinder full epidemiological analysis. Firstly, we need fuller, bias-free data. The second task is to establish a committee, composed of statisticians and epidemiologists, to discuss the analysis, conduct a research project from a long-term perspective, and carry out the collection of data effectively along the lines of the project. Due to the insufficiency of data collected so far, our analysis is limited to showing the trends in mortality rates in the high and low dose areas. (author)

  20. Quality in Qualitative Management Accounting Research

    Nørreklit, Hanne

    2014-01-01

    Purpose: The purpose of this article is to demonstrate how the quality of Qualitative Research in Accounting & Management (QRAM) is manifested through the conceptualization of knowledge about functioning actions that are applicable for local management accounting practices. Design ... to the development of a performativity in management accounting topos that integrates facts, possibilities, value and communication. Findings: The analysis documents that the three QRAM articles on inter-organizational cost management make a common contribution to the knowledge related to what to do to make ...; the paper has implications for contemporary discussions on doing research that is relevant for practice. Originality/value: The paper provides novel insight into the analysis of quality in management accounting research. Additionally, it provides a framework for reflecting on the accumulation of practice ...

  1. Research and Development Management System (RDMS)

    Mohd Azidi Abdul Rahman; Abdul Muin Abdul Rahman; Sufian Norazam Mohamed Aris; Saaidi Ismail; Mohamad Safuan Sulaiman; Maizura Ibrahim; Hazizi Omar; Roslan Mohd Ali

    2010-01-01

    Research and Development (R and D) is a main activity carried out at the Malaysian Nuclear Agency, particularly in the physical sciences and the nuclear field. The R and D activity that is carried out needs to be managed more efficiently and systematically. Until now, all research management activities have been carried out manually or semi-electronically, from filling in application forms to project completion. Therefore, a computerized system is needed in order to manage and monitor R and D projects. The system is capable of giving users inside and outside the agency access to information concerning the R and D projects that are carried out. The R and D Management System (RDMS) can increase the capability of the Malaysian Nuclear Agency in managing, researching and developing, innovating and inventing technology, as well as commercializing the R and D produced. (author)

  2. MIMIC II: a massive temporal ICU patient database to support research in intelligent patient monitoring

    Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.

    2002-01-01

    Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data is continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records including over 120 gigabytes of data and is growing. A customized archiving system was used to store continuously up to four waveforms and 30 different parameters from ICU patient monitors. An integrated user-friendly relational database was developed for browsing of patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge-discovery.

  3. Creating a sampling frame for population-based veteran research: representativeness and overlap of VA and Department of Defense databases.

    Washington, Donna L; Sun, Su; Canning, Mark

    2010-01-01

    Most veteran research is conducted in Department of Veterans Affairs (VA) healthcare settings, although most veterans obtain healthcare outside the VA. Our objective was to determine the adequacy and relative contributions of Veterans Health Administration (VHA), Veterans Benefits Administration (VBA), and Department of Defense (DOD) administrative databases for representing the U.S. veteran population, using as an example the creation of a sampling frame for the National Survey of Women Veterans. In 2008, we merged the VHA, VBA, and DOD databases. We identified the number of unique records both overall and from each database. The combined databases yielded 925,946 unique records, representing 51% of the 1,802,000 U.S. women veteran population. The DOD database included 30% of the population (with 8% overlap with other databases). The VHA enrollment database contributed an additional 20% unique women veterans (with 6% overlap with VBA databases). VBA databases contributed an additional 2% unique women veterans (beyond 10% overlap with other databases). Use of VBA and DOD databases substantially expands access to the population of veterans beyond those in VHA databases, regardless of VA use. Adoption of these additional databases would enhance the value and generalizability of a wide range of studies of both male and female veterans.
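
    The unique-record and overlap counts reported above can be illustrated with simple set algebra; the identifiers below are invented, and a real linkage would use the administrative keys shared across the VHA, VBA and DOD files rather than plain strings.

```python
# Invented person identifiers for the three administrative sources.
vha = {"v001", "v002", "v003", "v004"}
vba = {"v003", "v004", "v005"}
dod = {"v004", "v006", "v007"}

# Merge the sources and count unique records for the sampling frame.
combined = vha | vba | dod
print("unique records:", len(combined))

# Contribution of each source beyond the others, and multi-source overlap.
print("VHA only:", len(vha - vba - dod))
print("VBA beyond VHA:", len(vba - vha))
print("DOD beyond VHA and VBA:", len(dod - vha - vba))
print("in more than one source:",
      len((vha & vba) | (vha & dod) | (vba & dod)))
```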

  4. Building a recruitment database for asthma trials: a conceptual framework for the creation of the UK Database of Asthma Research Volunteers.

    Nwaru, Bright I; Soyiri, Ireneous N; Simpson, Colin R; Griffiths, Chris; Sheikh, Aziz

    2016-05-26

    Randomised clinical trials are the 'gold standard' for evaluating the effectiveness of healthcare interventions. However, successful recruitment of participants remains a key challenge for many trialists. In this paper, we present a conceptual framework for creating a digital, population-based database for the recruitment of asthma patients into future asthma trials in the UK. Having set up the database, the goal is to then make it available to support investigators planning asthma clinical trials. The UK Database of Asthma Research Volunteers will comprise a web-based front-end that interactively allows participant registration, and a back-end that houses the database containing participants' key relevant data. The database will be hosted and maintained at a secure server at the Asthma UK Centre for Applied Research based at The University of Edinburgh. Using a range of invitation strategies, key demographic and clinical data will be collected from those pre-consenting to consider participation in clinical trials. These data will, with consent, in due course, be linkable to other healthcare, social, economic, and genetic datasets. To use the database, asthma investigators will send their eligibility criteria for participant recruitment; eligible participants will then be informed about the new trial and asked if they wish to participate. A steering committee will oversee the running of the database, including approval of usage access. Novel communication strategies will be utilised to engage participants who are recruited into the database in order to avoid attrition as a result of waiting time to participation in a suitable trial, and to minimise the risk of their being approached when already enrolled in a trial. The value of this database will be whether it proves useful and usable to researchers in facilitating recruitment into clinical trials on asthma and whether patient privacy and data security are protected in meeting this aim. Successful recruitment is

  5. Integration of the ATLAS tag database with data management and analysis components

    Cranshaw, J; Malon, D; Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C

    2008-01-01

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted

  6. Integration of the ATLAS tag database with data management and analysis components

    Cranshaw, J; Malon, D [Argonne National Laboratory, Argonne, IL 60439 (United States); Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C [Department of Physics and Astronomy, University of Glasgow, Glasgow, G12 8QQ, Scotland (United Kingdom)], E-mail: c.nicholson@physics.gla.ac.uk

    2008-07-15

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted.
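
    To illustrate the kind of first-level cut the two preceding abstracts describe (a cheap relational query over event-level metadata that returns the selected events, and the files holding them, before any bulk data is read), here is a minimal sketch; the column names, cut values and file identifiers are invented and do not reflect the actual ATLAS TAG schema.

```python
import sqlite3

# Hypothetical event-level metadata ("tag") table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tag (
    run INTEGER, event INTEGER, n_muons INTEGER,
    missing_et REAL, file_guid TEXT)""")
conn.executemany("INSERT INTO tag VALUES (?, ?, ?, ?, ?)", [
    (2100, 1, 2, 55.2, "guid-a"),
    (2100, 2, 0, 12.7, "guid-a"),
    (2101, 7, 1, 80.3, "guid-b"),
])

# First-level cut: at least one muon and large missing transverse energy.
selected = conn.execute(
    "SELECT run, event, file_guid FROM tag "
    "WHERE n_muons >= 1 AND missing_et > 50").fetchall()
print(selected)   # the event/file list handed on to the analysis jobs
```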

  7. Waste management research abstracts no. 21

    1992-12-01

    The 21st issue of this publication contains over 700 abstracts from 35 IAEA Member Countries covering various aspects of radioactive waste management. Radioactive waste disposal, processing and storage, geochemical and geological investigations related to waste management, mathematical models and environmental impacts are reviewed. Many programs involve cooperation among several countries and further international cooperation is expected to be promoted through availability of compiled information on research programs, institutions and scientists engaged in waste management

  8. Waste management research abstracts. No. 20

    1990-10-01

    The 20th issue of this publication contains over 700 abstracts from 32 IAEA Member Countries covering various aspects of radioactive waste management. Radioactive waste disposal, processing and storage, geochemical and geological investigations related to waste management, mathematical models and environmental impacts are reviewed. Many programs involve cooperation among several countries and further international cooperation is expected to be promoted through availability of compiled information on research programs, institutions and scientists engaged in waste management

  9. Editorial | Cavagnaro | Research in Hospitality Management

    Research in Hospitality Management, Vol 6, No 1 (2016).

  10. Editorial | Lashley | Research in Hospitality Management

    Research in Hospitality Management, Vol 7, No 2 (2017).

  11. Editorial | Lashley | Research in Hospitality Management

    Research in Hospitality Management, Vol 7, No 1 (2017).

  12. Editorial | Aboagye | African Journal of Management Research

    African Journal of Management Research, Vol 21, No 1 (2013).

  13. Editorial | Lashley | Research in Hospitality Management

    Research in Hospitality Management, Vol 5, No 2 (2015).

  14. Research Methodologies in Supply Chain Management

    Kotzab, Herbert

    While supply chain management has risen to great prominence in recent years, there have hardly been related developments in research methodologies. Yet, as supply chains cover more than one company, one central issue is how to collect and analyse data along the whole or relevant part of the supply chain. Within the 36 chapters, 70 authors bring together a rich selection of theoretical and practical examples of how research methodologies are applied in supply chain management. The book contains papers on theoretical implications as well as papers on a range of key methods, such as modelling, surveys, case studies or action research. It will be of great interest to researchers in the area of supply chain management and logistics, but also to neighbouring fields, such as network management or global operations.

  15. Database Administrator

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  16. Facility management research in the Netherlands

    Thijssen, Thomas; van der Voordt, Theo; Mobach, Mark P.

    This article provides a brief overview of the history and development of facility management research in the Netherlands and indicates future directions. Facility management as a profession has developed from single service to multi-services and integral services over the past 15 years.

  17. Data management of web archive research data

    Zierau, Eld; Jurik, Bolette

    This paper will provide recommendations to overcome various challenges for the data management of web materials. The recommendations are based on results from two independent Danish research projects with different requirements for data management: The first project focuses on high precision on a par...

  18. Closing the gap between research and management

    Deborah M. Finch; Marcia Patton-Mallory

    1993-01-01

    In this paper, we evaluate the reasons for gaps in communication between researchers and natural resource managers and identify methods to close these gaps. Gaps originate from differing patterns of language use, disparities in organizational culture and values, generation of knowledge that is too narrowly-focused to solve complex problems, failure by managers to relay...

  19. Research challenges for energy data management (panel)

    Pedersen, Torben Bach; Lehner, Wolfgang

    2013-01-01

    This panel paper aims at initiating discussion at the Second International Workshop on Energy Data Management (EnDM 2013) about the important research challenges within Energy Data Management. The authors are the panel organizers; extra panelists will be recruited before the workshop

  20. Basic Project Management Methodologies for Survey Researchers.

    Beach, Robert H.

    To be effective, project management requires a heavy dependence on the document, list, and computational capability of a computerized environment. Now that microcomputers are readily available, only the rediscovery of classic project management methodology is required for improved resource allocation in small research projects. This paper provides…

  1. [The research project: financing and management].

    Schena, F P

    2003-01-01

    Basic and clinical research is accomplished through projects. The design of a project is based not only on its scientific content but also on its financing and management. This article aims to illustrate the correct modalities for project financing and project management in a scientific project.

  2. Corrigendum | Gehrels | Research in Hospitality Management

    Li, L. (2016). Are social media applications a facilitator or barrier to learning for tourism and hospitality management students? Research in Hospitality Management, 6 (2), 195–202. https://doi.org/10.1080/22243534.2016.1253289. Acknowledgement — I thank University of Surrey, University of Derby, and Bath Spa ...

  3. The web-enabled database of JRC-EC: a useful tool for managing european gen 4 materials data

    Over, H.H.; Dietz, W.

    2008-01-01

    Materials and document databases are important tools to conserve knowledge and experimental materials data from European R and D projects. A web-enabled application guarantees fast access to these data. In combination with analysis tools, the experimental data are used for, e.g., mechanical design, construction and lifetime predictions of complex components. The effective and efficient handling of large amounts of generic and detailed materials data with regard to properties related to, e.g., fabrication processes, joining techniques, irradiation or aging is one of the basic elements of data management within ongoing nuclear safety and design related European research projects and networks. The paper describes the structure and functionality of Mat-DB and gives examples of how these tools can be used for the management and evaluation of materials data for EURATOM FP7 Generation IV reactor types. (authors)

  4. The image database management system of teaching file using personal computer

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of the teaching file in a radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) with an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modality, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 x 390 to 545 x 414, 256 gray scales) and displayed on a 17-inch flat monitor (1024 x 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply at the reading viewbox. The image quality on the computer's monitor was lower than that of the original film on the viewbox, but generally the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purposes of a teaching file system. Without high-cost equipment, we could complete an image database system for a teaching file using a personal computer by a relatively inexpensive method.
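
    The record structure listed above can be sketched as a simple table with one row per case; the field names, example values and SQLite backend are illustrative only and are not the original FoxPro implementation.

```python
import sqlite3

# Hypothetical teaching-file table with the fields the abstract enumerates.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE teaching_file (
    hospital_no TEXT, name TEXT, sex TEXT, age INTEGER, exam_date TEXT,
    keyword TEXT, modality TEXT, diagnosis TEXT, findings TEXT,
    refs TEXT, image_path TEXT)""")
conn.execute("INSERT INTO teaching_file VALUES (?,?,?,?,?,?,?,?,?,?,?)", (
    "H123456", "anonymised", "F", 54, "1995-03-02", "pneumothorax",
    "chest radiograph", "spontaneous pneumothorax",
    "right apical lucency without lung markings",
    "textbook ch. 4", "images/h123456_01.bmp"))

# Retrieve cases by keyword, the main access path for a teaching file.
hits = conn.execute(
    "SELECT hospital_no, diagnosis, image_path FROM teaching_file "
    "WHERE keyword LIKE ?", ("%pneumothorax%",)).fetchall()
print(hits)
```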

  5. Budgeting, funding, and managing clinical research projects.

    Hatfield, Elizabeth; Dicks, Elizabeth; Parfrey, Patrick

    2009-01-01

    Large, integrated multidisciplinary teams have become recognized as an efficient means of driving innovation and discovery in clinical research. This chapter describes how to budget and fund these large studies and how to manage effectively the large, often dispersed teams involved. Sources of funding are identified, and budget development, justification, reporting, financial governance, and accountability are described, in addition to the creation and management of the multidisciplinary team that will implement the research plan.

  6. A partnership approach to research data management

    Brown, Mark L.; White, Wendy

    2013-01-01

    This work outlines developments to support and enhance research data management policy and practice at the University of Southampton. It details a research-led approach to identifying institutional challenges and priorities, and the use of this evidence base to inform the creation of a 10-year roadmap and policy framework. The particular issues relating to workflow, storage, security and archiving are discussed, and examples are given of both pilot and embedded services, including data management planning s...

  7. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel approach to the application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance.

  8. A Systematic Review of Coding Systems Used in Pharmacoepidemiology and Database Research.

    Chen, Yong; Zivkovic, Marko; Wang, Tongtong; Su, Su; Lee, Jianyi; Bortnichak, Edward A

    2018-02-01

    Clinical coding systems have been developed to translate real-world healthcare information such as prescriptions, diagnoses and procedures into standardized codes appropriate for use in large healthcare datasets. Due to the lack of information on coding system characteristics and insufficient uniformity in coding practices, there is a growing need for a better understanding of coding systems and their use in pharmacoepidemiology and observational real-world data research. The objectives were to determine: 1) the number of available coding systems and their characteristics, 2) which pharmacoepidemiology databases they are adopted in, 3) what outcomes and exposures can be identified from each coding system, and 4) how robust they are with respect to consistency and validity in pharmacoepidemiology and observational database studies. Electronic literature database and unpublished literature searches, as well as hand searching of relevant journals, were conducted to identify eligible articles discussing the characteristics and applications of coding systems in use and published in English between 1986 and 2016. Characteristics considered included the type of information captured by codes, clinical setting(s) of use, adoption by a pharmacoepidemiology database, region, and available mappings. Application articles describing the use and validity of specific codes, code lists, or algorithms were also included. Data extraction was performed independently by two reviewers and a narrative synthesis was performed. A total of 897 unique articles and 57 coding systems were identified, 17% of which included country-specific modifications or multiple versions. Procedures (55%), diagnoses (36%), drugs (38%), and site of disease (39%) were most commonly and directly captured by these coding systems. The systems were used to capture information from the following clinical settings: inpatient (63%), ambulatory (55%), emergency department (ED, 34%), and pharmacy (13%). More than half of all coding

  9. Chess databases as a research vehicle in psychology: Modeling large data.

    Vaci, Nemanja; Bilalić, Merim

    2017-08-01

    The game of chess has often been used for psychological investigations, particularly in cognitive science. The clear-cut rules and well-defined environment of chess provide a model for investigations of basic cognitive processes, such as perception, memory, and problem solving, while the precise rating system for the measurement of skill has enabled investigations of individual differences and expertise-related effects. In the present study, we focus on another appealing feature of chess-namely, the large archive databases associated with the game. The German national chess database presented in this study represents a fruitful ground for the investigation of multiple longitudinal research questions, since it collects the data of over 130,000 players and spans over 25 years. The German chess database collects the data of all players, including hobby players, and all tournaments played. This results in a rich and complete collection of the skill, age, and activity of the whole population of chess players in Germany. The database therefore complements the commonly used expertise approach in cognitive science by opening up new possibilities for the investigation of multiple factors that underlie expertise and skill acquisition. Since large datasets are not common in psychology, their introduction also raises the question of optimal and efficient statistical analysis. We offer the database for download and illustrate how it can be used by providing concrete examples and a step-by-step tutorial using different statistical analyses on a range of topics, including skill development over the lifetime, birth cohort effects, effects of activity and inactivity on skill, and gender differences.
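
    Because the record above emphasizes the statistical analysis of a large longitudinal dataset, a brief sketch of the kind of analysis it describes may help. The column names (player_id, age, rating) and the simple quadratic fit are assumptions for illustration; they are not the actual layout of the German chess database or the models used in the cited tutorial.

      # Illustrative sketch of a lifetime skill-development curve; column names
      # are assumed, not taken from the actual German chess database export.
      import numpy as np
      import pandas as pd

      # Hypothetical longitudinal ratings table: one row per player per year.
      ratings = pd.DataFrame({
          "player_id": [1, 1, 1, 2, 2, 2, 3, 3],
          "age":       [18, 25, 40, 20, 30, 45, 22, 35],
          "rating":    [1850, 2010, 1980, 1600, 1720, 1690, 2100, 2205],
      })

      # Average rating per age gives a crude cross-sectional skill curve.
      curve = ratings.groupby("age")["rating"].mean()

      # A quadratic fit is a simple way to summarize rise-and-decline over age;
      # the tutorial described above uses more appropriate longitudinal models.
      coeffs = np.polyfit(curve.index.to_numpy(), curve.to_numpy(), deg=2)
      print("quadratic age-curve coefficients:", coeffs)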

  10. Development of an integrated database management system to evaluate integrity of flawed components of nuclear power plant

    Mun, H. L.; Choi, S. N.; Jang, K. S.; Hong, S. Y.; Choi, J. B.; Kim, Y. J.

    2001-01-01

    The objective of this paper is to develop an NPP-IDBMS (Integrated DataBase Management System for Nuclear Power Plants) for evaluating the integrity of components of nuclear power plants using a relational data model. This paper describes the relational data model, structure and development strategy for the proposed NPP-IDBMS. The NPP-IDBMS consists of a database, a database management system and an interface part. The database part consists of plant, shape, operating condition, material properties and stress databases, which are required for the integrity evaluation of each component in nuclear power plants. For the development of the stress database, an extensive finite element analysis was performed for various components considering operational transients. The developed NPP-IDBMS will provide an efficient and accurate way to evaluate the integrity of flawed components.
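
    To make the database structure described above concrete, the following minimal Python sketch links hypothetical component, material and stress tables with a join, gathering the inputs a flaw-evaluation routine would need. The table and column names are illustrative assumptions, not the published NPP-IDBMS schema.

      # Illustrative relational sketch; tables and columns are assumptions,
      # not the published NPP-IDBMS schema.
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.executescript("""
          CREATE TABLE component (id INTEGER PRIMARY KEY, plant TEXT, name TEXT,
                                  material_id INTEGER);
          CREATE TABLE material  (id INTEGER PRIMARY KEY, grade TEXT,
                                  yield_mpa REAL, toughness_mpa_m REAL);
          CREATE TABLE stress    (component_id INTEGER, transient TEXT,
                                  membrane_mpa REAL, bending_mpa REAL);
      """)
      db.execute("INSERT INTO material VALUES (1, 'SA508', 345.0, 200.0)")
      db.execute("INSERT INTO component VALUES (1, 'Unit 1', 'RPV nozzle', 1)")
      db.execute("INSERT INTO stress VALUES (1, 'heat-up', 120.0, 45.0)")

      # Gather, in one query, the inputs a flaw-evaluation routine would need.
      row = db.execute("""
          SELECT c.name, m.grade, m.yield_mpa, m.toughness_mpa_m,
                 s.transient, s.membrane_mpa, s.bending_mpa
          FROM component c
          JOIN material  m ON m.id = c.material_id
          JOIN stress    s ON s.component_id = c.id
      """).fetchone()
      print(row)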

  11. REALIZING BUSINESS PROCESS MANAGEMENT BY HELP OF A PROCESS MAPPING DATABASE TOOL

    Vergili, Ceren

    2016-01-01

    In a typical business sector, processes are the building blocks of achievement, and a considerable share of them are business processes. This means that business sectors need a management discipline. Business Process Management (BPM) is a discipline that combines the modelling, automation, execution, control, measurement and optimization of processes, taking into account enterprise goals and spanning systems, employees, customers and partners. CERN’s EN – HE – HM section wishes to apply the BPM discipline to improve the technical, administrative and managerial actions needed to supply and maintain appropriate CERN industrial transport, handling and lifting equipment. For this reason, a Process Mapping Database Tool was created to develop a common understanding of how the section members can visualize their processes, agree on quality standards and decide how to improve. It provides management support by establishing Process Charts...

  12. Integrated Storage and Management of Vector and Raster Data Based on Oracle Database

    WU Zheng

    2017-05-01

    At present, there are many problems in the storage and management of multi-source heterogeneous spatial data, such as difficult data transfer, the lack of unified storage and low efficiency. By combining relational database and spatial data engine technology, an approach for the integrated storage and management of vector and raster data on the basis of Oracle is proposed in this paper. This approach first establishes an integrated storage model for vector and raster data and optimizes the retrieval mechanism, then designs a framework for seamless data transfer, and finally realizes the unified storage and efficient management of multi-source heterogeneous data. Comparison of experimental results with ArcSDE, a leading comparable spatial data engine, shows that the proposed approach has higher data transfer performance and better query retrieval efficiency.
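
    As a simplified illustration of integrated vector and raster storage, the sketch below keeps a vector geometry as WKT text and a raster tile as a binary blob in a single relational store. The paper's implementation is Oracle-based with a spatial data engine; the sqlite3 backend, WKT text column and raw blob used here are stand-ins for illustration only.

      # Simplified illustration of unified vector/raster storage in one
      # relational store; not the Oracle-based implementation evaluated above.
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.executescript("""
          CREATE TABLE vector_layer (id INTEGER PRIMARY KEY, name TEXT,
                                     geometry_wkt TEXT);         -- vector data
          CREATE TABLE raster_tile  (id INTEGER PRIMARY KEY, layer TEXT,
                                     tile_row INTEGER, tile_col INTEGER,
                                     tile BLOB);                  -- raster data
      """)
      db.execute("INSERT INTO vector_layer VALUES (1, 'roads', "
                 "'LINESTRING(0 0, 10 5, 20 5)')")
      db.execute("INSERT INTO raster_tile VALUES (1, 'dem', 0, 0, ?)",
                 (bytes(256),))  # placeholder 256-byte tile payload

      # Both data types can now be retrieved through the same SQL interface.
      print(db.execute("SELECT name, geometry_wkt FROM vector_layer").fetchall())
      print(db.execute("SELECT layer, length(tile) FROM raster_tile").fetchone())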

  13. Study on Mandatory Access Control in a Secure Database Management System

    2001-01-01

    This paper proposes a security policy model for mandatory access control in a class B1 database management system in which labeling is at the tuple level. The relation-hierarchical data model is extended to a multilevel relation-hierarchical data model. Based on this model, the concept of upper-lower layer relational integrity is presented after the covert channels caused by database integrity constraints are analyzed and eliminated. Two SQL statements are extended to process polyinstantiation in the multilevel secure environment. The system is based on the multilevel relation-hierarchical data model and is capable of integratively storing and manipulating multilevel complicated objects (e.g., multilevel spatial data) and multilevel conventional data (e.g., integer, real number and character string).
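
    To illustrate tuple-level mandatory access control as discussed above, here is a minimal Python sketch of a Bell-LaPadula-style "no read up" check over labeled tuples. The label lattice, tuple layout and example data are illustrative assumptions, not the model defined in the paper.

      # Minimal sketch of tuple-level mandatory access control ("no read up");
      # the label lattice and tuple layout are illustrative assumptions only.
      from dataclasses import dataclass

      LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

      @dataclass
      class LabeledTuple:
          key: str
          value: str
          label: str  # sensitivity label attached to the whole tuple

      TABLE = [
          LabeledTuple("site-A", "coolant flow nominal", "UNCLASSIFIED"),
          LabeledTuple("site-A", "revised flow schedule", "SECRET"),  # polyinstantiated key
          LabeledTuple("site-B", "inspection overdue", "CONFIDENTIAL"),
      ]

      def select(table, subject_clearance):
          """Return only tuples the subject is allowed to read (no read up)."""
          limit = LEVELS[subject_clearance]
          return [t for t in table if LEVELS[t.label] <= limit]

      # A CONFIDENTIAL subject sees the low rows but not the SECRET instance.
      for t in select(TABLE, "CONFIDENTIAL"):
          print(t.key, "->", t.value, f"[{t.label}]")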

  14. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Each frame transform is updated by its owner: FM updates the site and saved frames in the surface tree, and as the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients, including ARM and RSM (Remote Sensing Mast), update the rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query the transform between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculation of coordinate entries for commands and thus simplifies flight operations significantly.
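
    The following minimal Python sketch shows the general idea of a frame tree that answers transform queries between any two frames by composing transforms up through a common root. The frame names, the 2D homogeneous-transform representation and the tree layout are illustrative assumptions, not the MSL flight software design.

      # Illustrative frame-tree sketch (2D homogeneous transforms for brevity);
      # frame names and structure are assumptions, not the MSL FM design.
      import numpy as np

      def pose(x, y, theta):
          """Homogeneous transform of a child frame expressed in its parent."""
          c, s = np.cos(theta), np.sin(theta)
          return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

      # Parent name and child-in-parent transform for each frame; "site" is root.
      TREE = {
          "site":   (None,    np.eye(3)),
          "rover":  ("site",  pose(4.0, 2.0, np.deg2rad(30))),
          "mast":   ("rover", pose(0.5, 0.0, 0.0)),
          "target": ("site",  pose(10.0, 6.0, 0.0)),
      }

      def to_root(frame):
          """Compose transforms from a frame up to the root of the tree."""
          T = np.eye(3)
          while frame is not None:
              parent, local = TREE[frame]
              T = local @ T
              frame = parent
          return T

      def transform(src, dst):
          """Transform taking coordinates in src to coordinates in dst."""
          return np.linalg.inv(to_root(dst)) @ to_root(src)

      # Where is the target expressed in the mast frame (e.g. camera pointing)?
      p_target_in_mast = transform("target", "mast") @ np.array([0.0, 0.0, 1.0])
      print(p_target_in_mast[:2])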

  15. Open-access MIMIC-II database for intensive care research.

    Lee, Joon; Scott, Daniel J; Villarroel, Mauricio; Clifford, Gari D; Saeed, Mohammed; Mark, Roger G

    2011-01-01

    The critical state of intensive care unit (ICU) patients demands close monitoring, and as a result a large volume of multi-parameter data is collected continuously. This represents a unique opportunity for researchers interested in clinical data mining. We sought to foster a more transparent and efficient intensive care research community by building a publicly available ICU database, namely Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II). The data harnessed in MIMIC-II were collected from the ICUs of Beth Israel Deaconess Medical Center from 2001 to 2008 and represent 26,870 adult hospital admissions (version 2.6). MIMIC-II consists of two major components: clinical data and physiological waveforms. The clinical data, which include patient demographics, intravenous medication drip rates, and laboratory test results, were organized into a relational database. The physiological waveforms, including 125 Hz signals recorded at bedside and corresponding vital signs, were stored in an open-source format. MIMIC-II data were also deidentified in order to remove protected health information. Any interested researcher can gain access to MIMIC-II free of charge after signing a data use agreement and completing human subjects training. MIMIC-II can support a wide variety of research studies, ranging from the development of clinical decision support algorithms to retrospective clinical studies. We anticipate that MIMIC-II will be an invaluable resource for intensive care research by stimulating fair comparisons among different studies.
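
    As a simplified illustration of the retrospective queries a clinical database like MIMIC-II supports, the sketch below runs a cohort-style query over a toy lab-results table. The table and column names are assumptions for illustration, not the actual MIMIC-II schema, and real access requires the data use agreement and training described in the record above.

      # Illustrative sketch of a retrospective query over MIMIC-II-style
      # clinical data; table and column names are assumed, not the real schema.
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.execute("""CREATE TABLE lab_events (
                        admission_id INTEGER, lab_name TEXT,
                        value REAL, charted_at TEXT)""")
      db.executemany(
          "INSERT INTO lab_events VALUES (?,?,?,?)",
          [(101, "lactate", 1.2, "2008-05-01 06:00"),
           (101, "lactate", 3.8, "2008-05-01 12:00"),
           (102, "lactate", 0.9, "2008-05-02 08:00")],
      )

      # Example cohort question: which admissions ever had lactate > 2 mmol/L?
      rows = db.execute(
          "SELECT admission_id, MAX(value) FROM lab_events "
          "WHERE lab_name = 'lactate' "
          "GROUP BY admission_id HAVING MAX(value) > 2.0"
      ).fetchall()
      print(rows)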

  16. A database for reproducible manipulation research: CapriDB – Capture, Print, Innovate

    Florian T. Pokorny

    2017-04-01

    We present a novel approach and database which combine the inexpensive generation of 3D object models via monocular or RGB-D camera images with 3D printing and a state-of-the-art object tracking algorithm. Unlike recent efforts towards the creation of 3D object databases for robotics, our approach does not require expensive and controlled 3D scanning setups and aims to enable anyone with a camera to scan, print and track complex objects for manipulation research. The proposed approach results in detailed textured mesh models whose 3D printed replicas provide close approximations of the originals. A key motivation for using 3D printed objects is the ability to precisely control and vary object properties such as size, material properties and mass distribution in the 3D printing process to obtain reproducible conditions for robotic manipulation research. We present CapriDB – an extensible database resulting from this approach, initially containing 40 textured and 3D printable mesh models together with tracking features to facilitate the adoption of the proposed approach.
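
    As a small illustration of working with printable mesh models such as those described above, the Python sketch below reads vertex positions from a Wavefront OBJ file and reports the model's bounding box before printing. Treating the database's mesh models as OBJ files, and the file name used, are assumptions for illustration; only the geometry lines are parsed.

      # Minimal sketch: read vertices from a Wavefront OBJ mesh and report its
      # bounding box before 3D printing. Treating the models as OBJ files is an
      # assumption; only geometry ("v") lines are parsed here.
      def obj_bounding_box(path):
          mins = [float("inf")] * 3
          maxs = [float("-inf")] * 3
          with open(path) as f:
              for line in f:
                  if line.startswith("v "):          # vertex position line
                      x, y, z = map(float, line.split()[1:4])
                      for i, v in enumerate((x, y, z)):
                          mins[i] = min(mins[i], v)
                          maxs[i] = max(maxs[i], v)
          return mins, maxs

      if __name__ == "__main__":
          lo, hi = obj_bounding_box("model.obj")     # hypothetical mesh file
          size = [b - a for a, b in zip(lo, hi)]
          print("model extents (same units as the mesh):", size)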

  17. Radioactive waste management profiles. Compilation from the Waste Management Database. No. 3

    2000-07-01

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB), primarily to establish a mechanism for the collection, archiving and dissemination of information about radioactive waste management in Member States. The current report is a summary and compilation of waste management information collected from Member States from February 1998 to December 1999 in response to the Agency's 1997/98 WMDB Questionnaire. Member States were asked to report waste accumulations up to the end of 1996 and to predict waste accumulations up to the end of 2014.

  18. The Starkey habitat database for ungulate research: construction, documentation, and use.

    Mary M. Rowland; Priscilla K. Coe; Rosemary J. Stussy; [and others].

    1998-01-01

    The Starkey Project, a large-scale, multidisciplinary research venture, began in 1987 in the Starkey Experimental Forest and Range in northeast Oregon. Researchers are studying effects of forest management on interactions and habitat use of mule deer (Odocoileus hemionus hemionus), elk (Cervus elaphus nelsoni), and cattle. A...

  19. Safety management in research and development organisation

    Nivedha, T.

    2016-01-01

    Health and safety is one of the most important aspects of an organization's smooth and effective functioning. It depends on safety management, health management, motivation, leadership and training, welfare facilities, accident statistics, policy, organization and administration, hazard control and risk analysis, monitoring, statistics and reporting. Workplace accidents are increasingly common; their main causes are untidiness, noise, excessively hot or cold environments, old or poorly maintained machines, and lack of training or carelessness of employees. One of the biggest issues facing employers today is the safety of their employees. This study aims to analyze occupational health and safety in a research organization, the Indira Gandhi Centre for Atomic Research, by gathering information on health management, safety management, motivation, leadership and training, welfare facilities, accident statistics, organization and administration, hazard control and risk analysis, monitoring, statistics and reporting. Data were collected using questionnaires developed on the health and safety management system. (author)

  20. Medical research using governments' health claims databases: with or without patients' consent?

    Tsai, Feng-Jen; Junod, Valérie

    2018-03-01

    Taking advantage of its single-payer, universal insurance system, Taiwan has leveraged its exhaustive database of health claims data for research purposes. Researchers can apply to receive access to pseudonymized (coded) medical data about insured patients, notably their diagnoses, health status and treatments. In view of the strict safeguards implemented, the Taiwanese government considers that this research use does not require patients' consent (either in the form of an opt-in or in the form of an opt-out). A group of non-governmental organizations has challenged this view in the Taiwanese Courts, but to no avail. The present article reviews the arguments both against and in favor of patients' consent for re-use of their data in research. It concludes that offering patients an opt-out would be appropriate as it would best balance the important interests at issue.