WorldWideScience

Sample records for object database management

  1. Managing the BABAR Object Oriented Database

    International Nuclear Information System (INIS)

    Hasan, Adil

    2002-01-01

    The BaBar experiment stores its data in an Object Oriented federated database supplied by Objectivity/DB(tm). This database is currently 350TB in size and is expected to increase considerably as the experiment matures. Management of this database requires careful planning and specialized tools in order to make the data available to physicists in an efficient and timely manner. We discuss the operational issues and management tools that were developed during the previous run to deal with this vast quantity of data at SLAC

  2. Computer Application Of Object Oriented Database Management ...

    African Journals Online (AJOL)

    Object Oriented Systems (OOS) have been widely adopted in software engineering because of their superiority with respect to data extensibility. The present trend in the software engineering process (SEP) towards concurrent computing raises novel concerns for the facilities and technology available in database ...

  3. The Network Configuration of an Object Relational Database Management System

    Science.gov (United States)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  4. Managing the BaBar object oriented database

    International Nuclear Information System (INIS)

    Hasan, A.; Trunov, A.

    2001-01-01

    The BaBar experiment stores its data in an Object Oriented federated database supplied by Objectivity/DB(tm). This database is currently 350TB in size and is expected to increase considerably as the experiment matures. Management of this database requires careful planning and specialized tools in order to make the data available to physicists in an efficient and timely manner. The authors discuss the operational issues and management tools that were developed during the previous run to deal with this vast quantity of data at SLAC

  5. An object-oriented framework for managing cooperating legacy databases

    NARCIS (Netherlands)

    Balsters, H; de Brock, EO

    2003-01-01

    We describe a general semantic framework for precise specification of so-called database federations. A database federation provides for tight coupling of a collection of heterogeneous legacy databases into a global integrated system. Our approach to database federation is based on the UML/OCL data

  6. Managing XML Data to optimize Performance into Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    Iuliana BOTHA

    2011-06-01

    Full Text Available This paper proposes some possibilities for managing XML data in order to optimize performance in object-relational databases. It details the option of storing XML data in such databases, using an Oracle database for exemplification, and tests some optimization techniques for queries over XMLType tables, such as indexing and table partitioning.
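
    The indexing technique mentioned in this abstract can be illustrated outside Oracle. The following Python sketch (table, column, and element names are invented for illustration; the paper itself works with Oracle XMLType tables) stores XML documents as text and indexes one value extracted at insert time, so equality queries avoid reparsing every stored document:

```python
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, xml TEXT, author TEXT)")
# The index on the extracted attribute plays the role of an XMLType index.
conn.execute("CREATE INDEX idx_docs_author ON docs (author)")

def insert_doc(xml_text):
    # Extract the indexed value once, at insert time.
    author = ET.fromstring(xml_text).findtext("author")
    conn.execute("INSERT INTO docs (xml, author) VALUES (?, ?)", (xml_text, author))

insert_doc("<paper><author>Botha</author><title>XML storage</title></paper>")
insert_doc("<paper><author>Smith</author><title>Other topic</title></paper>")

# The query uses the index instead of parsing every stored document.
rows = conn.execute("SELECT xml FROM docs WHERE author = ?", ("Botha",)).fetchall()
```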

  7. On Modeling the Behavior of Comparators for Complex Fuzzy Objects in a Fuzzy Object-Relational Database Management System

    Directory of Open Access Journals (Sweden)

    Juan M. Medina

    2012-08-01

    Full Text Available This paper proposes a parameterized definition for fuzzy comparators on complex fuzzy datatypes like fuzzy collections with conjunctive semantics and fuzzy objects. This definition and its implementation on a Fuzzy Object-Relational Database Management System (FORDBMS) provide the designer with a powerful tool to adapt the behavior of these operators to the semantics of the considered application.
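
    As a rough illustration of the idea (not the paper's actual definition), a parameterized fuzzy comparator can be sketched in Python. The tolerance parameter stands in for the adaptable behavior the abstract describes, and conjunctive collection semantics take the weakest best match:

```python
def feq(a, b, tolerance=2.0):
    """Parameterized fuzzy equality: degree 1.0 when values are equal,
    decreasing linearly to 0.0 at the given tolerance."""
    return max(0.0, 1.0 - abs(a - b) / tolerance)

def feq_collection(xs, ys, tolerance=2.0):
    """Conjunctive semantics: every element of xs must match some element
    of ys; the overall degree is the weakest of the best matches."""
    return min(max(feq(x, y, tolerance) for y in ys) for x in xs)

d1 = feq(10.0, 10.0)                      # identical values -> degree 1.0
d2 = feq(10.0, 11.0)                      # within tolerance -> partial degree
d3 = feq_collection([1.0, 2.0], [1.0, 2.0])
```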

  8. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    Science.gov (United States)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model, utilizing Structured Query Language (SQL), with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards, such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.

  9. Applying AN Object-Oriented Database Model to a Scientific Database Problem: Managing Experimental Data at Cebaf.

    Science.gov (United States)

    Ehlmann, Bryon K.

    Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.

  10. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    Science.gov (United States)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object implementations (BLOBs) in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and brings the concomitant requirement for managing large numbers of files. Storing these sub-files as blobs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these blobs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available
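
    The two storage strategies compared in this abstract can be sketched with SQLite standing in for the shared-nothing DBMS and a local directory standing in for the external filesystem or HDFS (table names and tile sizes are illustrative only):

```python
import os
import sqlite3
import tempfile

conn = sqlite3.connect(":memory:")
tile = b"\x00\x01" * 512  # stand-in for one partition of a large raster

# Strategy 1: store the sub-file itself as a BLOB; the DBMS manages storage.
conn.execute("CREATE TABLE tiles_blob (id INTEGER PRIMARY KEY, data BLOB)")
conn.execute("INSERT INTO tiles_blob (data) VALUES (?)", (tile,))

# Strategy 2: store only a pointer; a filesystem (or HDFS) holds the bytes.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "tile_0.bin")
with open(path, "wb") as f:
    f.write(tile)
conn.execute("CREATE TABLE tiles_ptr (id INTEGER PRIMARY KEY, path TEXT)")
conn.execute("INSERT INTO tiles_ptr (path) VALUES (?)", (path,))

# Either route recovers the same bytes; the trade-offs are operational.
blob = conn.execute("SELECT data FROM tiles_blob").fetchone()[0]
with open(conn.execute("SELECT path FROM tiles_ptr").fetchone()[0], "rb") as f:
    from_file = f.read()
```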

  11. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.

  12. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

    Introduction to Database Systems; Functions of a Database; Database Management System; Database Components; Database Development Process; Conceptual Design and Data Modeling; Introduction to the Database Design Process; Understanding the Business Process; Entity-Relationship Data Model; Representing the Business Process with the Entity-Relationship Model; Table Structure and Normalization; Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Enforcing Constraints; Creating the Database for the Business Process; Physical Design and Database

  13. Passenger baggage object database (PBOD)

    Science.gov (United States)

    Gittinger, Jaxon M.; Suknot, April N.; Jimenez, Edward S.; Spaulding, Terry W.; Wenrich, Steve A.

    2018-04-01

    Detection of anomalies of interest in x-ray images is an ever-evolving problem that requires the rapid development of automatic detection algorithms. Automatic detection algorithms are developed using machine learning techniques, which would require developers to obtain the x-ray machine that was used to create the training images and to compile all associated metadata for those images by hand. The Passenger Baggage Object Database (PBOD) and its data acquisition application were designed and developed for acquiring and persisting 2-D and 3-D x-ray image data and associated metadata. PBOD was specifically created to capture simulated airline passenger "stream of commerce" luggage data, but could be applied to other areas of x-ray imaging that utilize machine-learning methods.

  14. Database functionality for learning objects

    NARCIS (Netherlands)

    Sessink, O.D.T.; Beeftink, H.H.; Hartog, R.J.M.

    2005-01-01

    The development of student-activating digital learning material in six research projects revealed several shortcomings in the current learning management systems. Once the SCORM 2004 and the IMS Sharable State Persistence specifications are implemented in learning management systems, some of these

  15. Records Management Database

    Data.gov (United States)

    US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  16. O-ODM Framework for Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Rombaldo Jr

    2012-09-01

    Full Text Available Object-Relational Databases introduce new features which allow manipulating objects in databases. At present, many DBMSs offer resources to manipulate objects in the database, but most application developers just map classes to relational tables, failing to exploit the O-R model's strength. The lack of tools that aid database design contributes to this situation. This work presents O-ODM (Object-Object Database Mapping), a persistence framework that maps objects from OO applications to database objects. Persistence frameworks have been used to aid developers, managing all access to the DBMS. This kind of tool allows developers to persist objects without solid knowledge of DBMSs and their specific languages, improving developers' productivity, mainly when a different DBMS is used. The results of some experiments using O-ODM are shown.
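
    A persistence framework of this kind can be sketched minimally in Python. The `MiniMapper` class below is a hypothetical illustration, not O-ODM's API, but it shows the core idea of hiding SQL behind object save/load operations:

```python
import sqlite3

class Sensor:
    def __init__(self, name, reading):
        self.name = name
        self.reading = reading

class MiniMapper:
    """Hypothetical minimal persistence layer: derives a table from a class
    and persists/loads instances, so the caller never writes SQL."""
    def __init__(self, conn, cls, columns):
        self.conn, self.cls, self.columns = conn, cls, columns
        cols = ", ".join(f"{c} TEXT" for c in columns)
        conn.execute(f"CREATE TABLE {cls.__name__} ({cols})")

    def save(self, obj):
        placeholders = ", ".join("?" for _ in self.columns)
        values = [str(getattr(obj, c)) for c in self.columns]
        self.conn.execute(
            f"INSERT INTO {self.cls.__name__} VALUES ({placeholders})", values)

    def load_all(self):
        rows = self.conn.execute(f"SELECT * FROM {self.cls.__name__}").fetchall()
        return [self.cls(*row) for row in rows]

conn = sqlite3.connect(":memory:")
mapper = MiniMapper(conn, Sensor, ["name", "reading"])
mapper.save(Sensor("s1", "42"))
loaded = mapper.load_all()
```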

  17. Object-relational database design-exploiting object orientation at the ...

    African Journals Online (AJOL)

    This paper applies the object-relational database paradigm in the design of a Health Management Information System. The class design, mapping of object classes to relational tables, the representation of inheritance hierarchies, and the appropriate database schema are all examined. Keywords: object relational ...

  18. A Database Interface for Complex Objects

    NARCIS (Netherlands)

    Holsheimer, Marcel; de By, Rolf A.; de By, R.A.; Ait-Kaci, Hassan

    We describe a formal design for a logical query language using psi-terms as data structures to interact effectively and efficiently with a relational database. The structure of psi-terms provides an adequate representation for so-called complex objects. They generalize conventional terms used in

  19. Nuclear database management systems

    International Nuclear Information System (INIS)

    Stone, C.; Sutton, R.

    1996-01-01

    The authors are developing software tools for accessing and visualizing nuclear data. MacNuclide was the first software application produced by their group. This application incorporates novel database management and visualization tools into an intuitive interface. The nuclide chart is used to access properties and to display results of searches. Selecting a nuclide in the chart displays a level scheme with tables of basic, radioactive decay, and other properties. All level schemes are interactive, allowing the user to modify the display, move between nuclides, and display entire daughter decay chains

  20. Ageing Management Program Database

    International Nuclear Information System (INIS)

    Basic, I.; Vrbanic, I.; Zabric, I.; Savli, S.

    2008-01-01

    The aspects of plant ageing management (AM) have gained increasing attention over the last ten years. Numerous technical studies have been performed to study the impact of ageing mechanisms on the safe and reliable operation of nuclear power plants. National research activities have been initiated or are in progress to provide the technical basis for decision making processes. The long-term operation of nuclear power plants is influenced by economic considerations, the socio-economic environment including public acceptance, developments in research and the regulatory framework, the availability of technical infrastructure to maintain and service the systems, structures and components, as well as qualified personnel. Besides national activities there are a number of international activities, in particular under the umbrella of the IAEA, the OECD and the EU. The paper discusses the process, procedure and database developed for the Slovenian Nuclear Safety Administration (SNSA) surveillance of the ageing process of the Nuclear Power Plant Krsko. (author)

  1. Learning Ontology from Object-Relational Database

    Directory of Open Access Journals (Sweden)

    Kaulins Andrejs

    2015-12-01

    Full Text Available This article describes a method for transforming an object-relational model into an ontology. The proposed method uses learning rules for such complex data types as object tables and collections – variable-size arrays as well as nested tables. Object types and their transformation into ontologies are insufficiently considered in the scientific literature; this fact served as motivation for the authors to investigate the issue and to write an article on the matter. In the beginning, we acquaint the reader with complex data types and object-oriented databases. Then we describe an algorithm for transforming complex data types into ontologies. At the end of the article, some examples of ontologies described in the OWL language are given.

  2. Development of Information Technology of Object-relational Databases Design

    Directory of Open Access Journals (Sweden)

    Valentyn A. Filatov

    2012-12-01

    Full Text Available The article is concerned with the development of information technology for object-relational database design and with the study of object features of infological and logical database schemas, their entities and connections.

  3. Database Independent Migration of Objects into an Object-Relational Database

    CERN Document Server

    Ali, A; Munir, K; Waseem-Hassan, M; Willers, I

    2002-01-01

    CERN's (European Organization for Nuclear Research) WISDOM project [1] deals with the replication of data between homogeneous sources in a Wide Area Network (WAN) using the extensible Markup Language (XML). The last phase of the WISDOM (Wide-area, database Independent Serialization of Distributed Objects for data Migration) project [2] indicates that the future direction of this work is to incorporate heterogeneous sources, as compared to the homogeneous sources described by [3]. This work will become essential for the CERN community once the need arises to transfer their legacy data to some source other than Objectivity [4]. Oracle 9i - an Object-Relational Database (including support for abstract data types, ADTs) - appears to be a potential candidate for the physics event store in the CERN CMS experiment, as suggested by [4] & [5]. Consequently this database has been selected for study. As a result of this work the HEP community will get a tool for migrating their data from Objectivity to Oracle9i.

  4. Radioactive Waste Management Objectives

    International Nuclear Information System (INIS)

    2011-01-01

    considered and the specific goals to be achieved at different stages of implementation, all of which are consistent with the Basic Principles. The four Objectives publications include Nuclear General Objectives, Nuclear Power Objectives, Nuclear Fuel Cycle Objectives, and Radioactive Waste Management and Decommissioning Objectives. This publication sets out the objectives that need to be achieved in the area of radioactive waste management, including decommissioning and environmental remediation, to ensure that the Nuclear Energy Basic Principles are satisfied.

  5. Modeling Spatial Data within Object Relational-Databases

    Directory of Open Access Journals (Sweden)

    Iuliana BOTHA

    2011-03-01

    Full Text Available Spatial data can refer to elements that help place a certain object in a certain area. These elements are latitude, longitude, points, geometric figures represented by points, etc. However, when translating these elements into data that can be stored in a computer, it all comes down to numbers. The interesting part that requires attention is how to store them in order to support fast and varied spatial queries. This is where the DBMS (Data Base Management System) that contains the database comes in. In this paper, we analyze and compare two object-relational DBMSs that work with spatial data: Oracle and PostgreSQL.
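
    A minimal illustration of the "it all comes down to numbers" point: coordinates stored as plain numeric columns already support a simple spatial query. SQLite is used here instead of Oracle or PostgreSQL, and the table and place names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE places (name TEXT, lat REAL, lon REAL)")
conn.executemany("INSERT INTO places VALUES (?, ?, ?)", [
    ("A", 44.43, 26.10),
    ("B", 44.50, 26.20),
    ("C", 40.00, 20.00),
])

def in_bbox(min_lat, max_lat, min_lon, max_lon):
    """Bounding-box query: the simplest spatial predicate, expressible
    with plain numeric comparisons over the stored coordinates."""
    return [row[0] for row in conn.execute(
        "SELECT name FROM places "
        "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (min_lat, max_lat, min_lon, max_lon))]

found = in_bbox(44.0, 45.0, 26.0, 27.0)
```

    Real spatial extensions (Oracle Spatial, PostGIS) add dedicated geometry types and spatial indexes, but the underlying storage question is exactly the one the paper raises.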

  6. Aging management database

    International Nuclear Information System (INIS)

    Vidican, Dan

    2003-01-01

    As operation time is accumulated, the overall safety and performance of an NPP tend to decrease. The reasons for potential non-availability of the Structures, Systems and Components (SSC) in operation are various, but in different ways they represent the end result of ageing phenomena. In order to understand the ageing phenomena and to be able to take adequate countermeasures, it is necessary to accumulate a large amount of information, from worldwide experience and also from the own plant. These data have to be organized in a systematic form, easy to retrieve and use. General requirements and structure of an Ageing DataBase: activities related to ageing evaluation have to allow: - identification and evaluation of degradation phenomena, potential malfunction and failure modes of the plant's typical components; - trend analyses (on selected critical components), prediction of future performance and the remaining service life. To perform these activities, it is necessary to have information on the behavior of similar components in different NPPs (in different environments and different operating conditions) and also the results from different pilot studies. The knowledge of worldwide experience is worthwhile. Also, it is necessary to know very well the operating and environmental conditions in the own NPP and to analyze in detail the failure mode and root cause for the components removed from the plant due to extended degradation. Based on the above aspects, a proposal is presented for the structure of an Ageing DataBase. It has three main sections: - Section A: General knowledge about ageing phenomena. It contains all the information collected based on worldwide experience. It could have a general part with raw information and a synthetic one, structured by typical components (if possible by different manufacturers). The synthetic part has to consider different ageing aspects and different monitoring and evaluation methods (e.g. component, function, environment condition, specific

  7. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs).The book first takes a look at ANSI database standards and DBMS applications and components. Discussion focus on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders on logical database, interrogation, and phy

  8. Management system of instrument database

    International Nuclear Information System (INIS)

    Zhang Xin

    1997-01-01

    The author introduces a management system for an instrument database. This system has been developed using FoxPro on a network. The system has such characteristics as a clear structure, easy operation, and flexible and convenient querying, as well as data safety and reliability

  9. A Database Management Assessment Instrument

    Science.gov (United States)

    Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr.

    2013-01-01

    This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…

  10. Meeting Objectives and Relevant NDS Databases

    International Nuclear Information System (INIS)

    Simakov, S.P.

    2012-01-01

    The purpose of the meeting was to find ways to overcome the drawbacks of the NRT standard and to benefit from recent developments in primary radiation damage simulations. The Technical Meeting had the objectives to discuss: - revisiting the NRT standard with the purpose of improving it by the evaluation of uncertainties connected with recoil spectra and the energy partitioning model; - a proposal for a new upgraded standard that will capture the annealing of defects in the recoil cascade on the basis of MD, BCA and other models. As an outcome of the discussions, the definition of objectives and participating organisations for a new Coordinated Research Project (CRP) on this topic is expected

  11. Interconnecting heterogeneous database management systems

    Science.gov (United States)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  12. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  13. Geospatial Database for Strata Objects Based on Land Administration Domain Model (ladm)

    Science.gov (United States)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

    Recently in our country, the construction of buildings has become more complex, and strata objects databases have become more important for registering the real world, as people now own and use multiple levels of space. Furthermore, strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration, and it allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. This paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future. It also attempts to develop a strata objects database using a standard data model (LADM) and to analyze the developed database against the LADM data model. The current cadastre system in Malaysia, including strata titles, is discussed in this paper. The problems of the 2D geospatial database are listed, and the need for a future 3D geospatial database is also discussed. The stages of designing a strata objects database are conceptual, logical and physical database design. The strata objects database will allow us to find both non-spatial and spatial strata title information and thus show the location of a strata unit. This development of a strata objects database may help in handling strata titles and information.

  14. Query Processing and Interlinking of Fuzzy Object-Oriented Database

    OpenAIRE

    Shweta Dwivedi; Santosh Kumar

    2017-01-01

    Due to the many limitations and poor data handling of existing relational databases, software professionals and researchers have moved towards object-oriented databases, which have a much better capability of handling real and complex real-world data, i.e. clear and crisp data, and are also able to perform some large and complex queries in an effective manner. On the other hand, a new approach in databases has been introduced, named the Fuzzy Object-Oriented Database (FOOD); it has all the ...

  15. RTDB: A memory resident real-time object database

    International Nuclear Information System (INIS)

    Nogiec, Jerzy M.; Desavouret, Eugene

    2003-01-01

    RTDB is a fast, memory-resident object database with built-in support for distribution. It constitutes an attractive alternative for architecting real-time solutions with multiple, possibly distributed, processes or agents sharing data. RTDB offers both direct and navigational access to stored objects, with local and remote random access by object identifiers, and immediate direct access via object indices. The database supports transparent access to objects stored in multiple collaborating dispersed databases and includes a built-in cache mechanism that allows for keeping local copies of remote objects, with specifiable invalidation deadlines. Additional features of RTDB include a trigger mechanism on objects that allows for issuing events or activating handlers when objects are accessed or modified and a very fast, attribute based search/query mechanism. The overall architecture and application of RTDB in a control and monitoring system is presented
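
    The core access paths described here (direct access by object identifier plus an attribute-based index) can be sketched as a toy in-memory store. This is an illustration of the concept only, not RTDB's actual interface; the attribute names are invented:

```python
class ObjectStore:
    """Toy memory-resident object store: direct access by object
    identifier (OID) plus an attribute index for fast searches."""
    def __init__(self):
        self._objects = {}   # OID -> object
        self._index = {}     # indexed key -> set of OIDs
        self._next_oid = 0

    def put(self, obj, key):
        oid = self._next_oid
        self._next_oid += 1
        self._objects[oid] = obj
        self._index.setdefault(key, set()).add(oid)
        return oid

    def get(self, oid):
        # Direct random access by object identifier.
        return self._objects[oid]

    def find(self, key):
        # Immediate access via the attribute index, no full scan.
        return [self._objects[o] for o in self._index.get(key, ())]

store = ObjectStore()
oid = store.put({"magnet": "Q1", "current": 95.0}, key="Q1")
by_oid = store.get(oid)
by_key = store.find("Q1")
```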

  16. Column-oriented database management systems

    OpenAIRE

    Možina, David

    2013-01-01

    In this thesis I present column-oriented databases. Among other things, I answer the question of why there is a need for a column-oriented database. In recent years column-oriented databases have received a lot of attention, even though the existence of columnar database management systems dates back to the early seventies of the last century. I compare both systems for database management – a column-oriented database system and a row-oriented database system ...
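
    The row-versus-column trade-off at the heart of this comparison can be sketched in a few lines (data and names are purely illustrative):

```python
# Row store: each record is kept together; good for whole-record access.
rows = [("a", 1), ("b", 2), ("c", 3)]

# Column store: each attribute is kept together; an aggregate over one
# column touches only that column's contiguous values.
columns = {"name": ["a", "b", "c"], "value": [1, 2, 3]}

row_total = sum(r[1] for r in rows)   # must walk every full record
col_total = sum(columns["value"])     # reads a single column only
```

    Both layouts hold the same data; analytic workloads favor the columnar layout because scans and compression operate on one attribute at a time.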

  17. The management object in risk management approaches

    OpenAIRE

    Christiansen, Ulrik

    2013-01-01

    Using a systematic review of the last 55 years of research within risk management, this paper explores how risk management as a management technology (methodologies, tools and frameworks to mitigate or manage risks) singles out risks as an object for management in order to make action possible. The paper synthesises by developing a framework of how different views on risk management enable and constrain the knowledge about risk and thus frame the possibilities to measure, analyse an...

  18. Exploiting database technology for object based event storage and retrieval

    International Nuclear Information System (INIS)

    Rawat, Anil; Rajan, Alpana; Tomar, Shailendra Singh; Bansal, Anurag

    2005-01-01

    This paper discusses the storage and retrieval of experimental data on relational databases. Physics experiments carried out using reactors and particle accelerators, generate huge amount of data. Also, most of the data analysis and simulation programs are developed using object oriented programming concepts. Hence, one of the most important design features of an experiment related software framework is the way object persistency is handled. We intend to discuss these issues in the light of the module developed by us for storing C++ objects in relational databases like Oracle. This module was developed under the POOL persistency framework being developed for LHC, CERN grid. (author)

  19. The GIOD Project-Globally Interconnected Object Databases

    CERN Document Server

    Bunn, J J; Newman, H B; Wilkinson, R P

    2001-01-01

    The GIOD (Globally Interconnected Object Databases) Project, a joint effort between Caltech and CERN, funded by Hewlett Packard Corporation, has investigated the use of WAN-distributed Object Databases and Mass Storage systems for LHC data. A prototype small- scale LHC data analysis center has been constructed using computing resources at Caltechs Centre for advanced Computing Research (CACR). These resources include a 256 CPU HP Exemplar of ~4600 SPECfp95, a 600 TByte High Performance Storage System (HPSS), and local/wide area links based on OC3 ATM. Using the exemplar, a large number of fully simulated CMS events were produced, and used to populate an object database with a complete schema for raw, reconstructed and analysis objects. The reconstruction software used for this task was based on early codes developed in preparation for the current CMS reconstruction program, ORCA. (6 refs).

  20. RANCANGAN DATABASE SUBSISTEM PRODUKSI DENGAN PENDEKATAN SEMANTIC OBJECT MODEL

    Directory of Open Access Journals (Sweden)

    Oviliani Yenty Yuliana

    2002-01-01

    Full Text Available To compete in the global market, business performer who active in industry fields should have and get information quickly and accurately, so they could make the precise decision. Traditional cost accounting system cannot give sufficient information, so many industries shift to Activity-Based Costing system (ABC. ABC system is more complex and need more data that should be save and process, so it should be applied information technology and database than traditional cost accounting system. The development of the software technology recently makes the construction of application program is not problem again. The primary problem is how to design database that presented information quickly and accurately. For that reason it necessary to make the model first. This paper discusses database modelling with semantic object model approach. This model is easier to use and is generate more normal database design than entity relationship model approach. Abstract in Bahasa Indonesia : Dalam persaingan di pasar bebas, para pelaku bisnis di bidang industri dalam membuat suatu keputusan yang tepat memerlukan informasi secara cepat dan akurat. Sistem akuntansi biaya tradisional tidak dapat menyediakan informasi yang memadai, sehingga banyak perusahaan industri yang beralih ke sistem Activity-Based Costing (ABC. Tetapi, sistem ABC merupakan sistem yang kompleks dan memerlukan banyak data yang harus disimpan dan diolah, sehingga harus menggunakan teknologi informasi dan database. Kemajuan di bidang perangkat lunak mengakibatkan pembuatan aplikasi program bukan masalah lagi. Permasalahan utama adalah bagaimana merancang database, agar dapat menyajikan informasi secara cepat dan akurat. Untuk itu, dalam makalah ini dibahas pemodelan database dengan pendekatan semantic object model. Model data ini lebih mudah digunakan dan menghasilkan transformasi yang lebih normal, jika dibandingkan dengan entity relationship model yang umum digunakan. Kata kunci: Sub Sistem

  1. Reldata - a tool for reliability database management

    International Nuclear Information System (INIS)

    Vinod, Gopika; Saraf, R.K.; Babar, A.K.; Sanyasi Rao, V.V.S.; Tharani, Rajiv

    2000-01-01

    Component failure, repair and maintenance data is a very important element of any Probabilistic Safety Assessment study. The credibility of the results of such study is enhanced if the data used is generated from operating experience of similar power plants. Towards this objective, a computerised database is designed, with fields such as, date and time of failure, component name, failure mode, failure cause, ways of failure detection, reactor operating power status, repair times, down time, etc. This leads to evaluation of plant specific failure rate, and on demand failure probability/unavailability for all components. Systematic data updation can provide a real time component reliability parameter statistics and trend analysis and this helps in planning maintenance strategies. A software package has been developed RELDATA, which incorporates the database management and data analysis methods. This report describes the software features and underlying methodology in detail. (author)

  2. Microcomputer Database Management Systems for Bibliographic Data.

    Science.gov (United States)

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  3. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  4. EMEN2: an object oriented database and electronic lab notebook.

    Science.gov (United States)

    Rees, Ian; Langley, Ed; Chiu, Wah; Ludtke, Steven J

    2013-02-01

    Transmission electron microscopy and associated methods, such as single particle analysis, two-dimensional crystallography, helical reconstruction, and tomography, are highly data-intensive experimental sciences, which also have substantial variability in experimental technique. Object-oriented databases present an attractive alternative to traditional relational databases for situations where the experiments themselves are continually evolving. We present EMEN2, an easy to use object-oriented database with a highly flexible infrastructure originally targeted for transmission electron microscopy and tomography, which has been extended to be adaptable for use in virtually any experimental science. It is a pure object-oriented database designed for easy adoption in diverse laboratory environments and does not require professional database administration. It includes a full featured, dynamic web interface in addition to APIs for programmatic access. EMEN2 installations currently support roughly 800 scientists worldwide with over 1/2 million experimental records and over 20 TB of experimental data. The software is freely available with complete source.

  5. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  6. The representation of manipulable solid objects in a relational database

    Science.gov (United States)

    Bahler, D.

    1984-01-01

    This project is concerned with the interface between database management and solid geometric modeling. The desirability of integrating computer-aided design, manufacture, testing, and management into a coherent system is by now well recognized. One proposed configuration for such a system uses a relational database management system as the central focus; the various other functions are linked through their use of a common data repesentation in the data manager, rather than communicating pairwise to integrate a geometric modeling capability with a generic relational data managemet system in such a way that well-formed questions can be posed and answered about the performance of the system as a whole. One necessary feature of any such system is simplification for purposes of anaysis; this and system performance considerations meant that a paramount goal therefore was that of unity and simplicity of the data structures used.

  7. Space Object Radiometric Modeling for Hardbody Optical Signature Database Generation

    Science.gov (United States)

    2009-09-01

    Introduction This presentation summarizes recent activity in monitoring spacecraft health status using passive remote optical nonimaging ...Approved for public release; distribution is unlimited. Space Object Radiometric Modeling for Hardbody Optical Signature Database Generation...It is beneficial to the observer/analyst to understand the fundamental optical signature variability associated with these detection and

  8. Content And Multimedia Database Management Systems

    NARCIS (Netherlands)

    de Vries, A.P.

    1999-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. The main characteristic of the ‘database approach’ is that it increases the value of data by its emphasis on data

  9. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issues is the effective management of data of many types related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of database and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, the SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. The SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for

  10. Mobile object retrieval in server-based image databases

    Science.gov (United States)

    Manger, D.; Pagel, F.; Widak, H.

    2013-05-01

    The increasing number of mobile phones equipped with powerful cameras leads to huge collections of user-generated images. To utilize the information of the images on site, image retrieval systems are becoming more and more popular to search for similar objects in an own image database. As the computational performance and the memory capacity of mobile devices are constantly increasing, this search can often be performed on the device itself. This is feasible, for example, if the images are represented with global image features or if the search is done using EXIF or textual metadata. However, for larger image databases, if multiple users are meant to contribute to a growing image database or if powerful content-based image retrieval methods with local features are required, a server-based image retrieval backend is needed. In this work, we present a content-based image retrieval system with a client server architecture working with local features. On the server side, the scalability to large image databases is addressed with the popular bag-of-word model with state-of-the-art extensions. The client end of the system focuses on a lightweight user interface presenting the most similar images of the database highlighting the visual information which is common with the query image. Additionally, new images can be added to the database making it a powerful and interactive tool for mobile contentbased image retrieval.

  11. A new weighted fuzzy grammar on object oriented database queries

    Directory of Open Access Journals (Sweden)

    Ali Haroonabadi

    2012-08-01

    Full Text Available The fuzzy object oriented database model is often used to handle the existing imprecise and complicated objects for many real-world applications. The main focus of this paper is on fuzzy queries and tries to analyze a complicated and complex query to get more meaningful and closer responses. The method permits the user to provide the possibility of allocating the weight to various parts of the query, which makes it easier to follow better goals and return the target objects.

  12. Reexamining Operating System Support for Database Management

    OpenAIRE

    Vasil, Tim

    2003-01-01

    In 1981, Michael Stonebraker [21] observed that database management systems written for commodity operating systems could not effectively take advantage of key operating system services, such as buffer pool management and process scheduling, due to expensive overhead and lack of customizability. The “not quite right” fit between these kernel services and the demands of database systems forced database designers to work around such limitations or re-implement some kernel functionality in user ...

  13. Performance Enhancements for Advanced Database Management Systems

    OpenAIRE

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  14. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    Science.gov (United States)

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  15. Comparative performance measures of relational and object-oriented databases using High Energy Physics data

    International Nuclear Information System (INIS)

    Marstaller, J.

    1993-12-01

    The major experiments at the SSC are expected to produce up to 1 Petabyte of data per year. The use of database techniques can significantly reduce the time it takes to access data. The goal of this project was to test which underlying data model, the relational or the object-oriented, would be better suited for archival and accessing high energy data. We describe the relational and the object-oriented data model and their implementation in commercial database management systems. To determine scalability we tested both implementations for 10-MB and 100-MB databases using storage and timing criteria

  16. Insertion algorithms for network model database management systems

    Science.gov (United States)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms partial order. When a database is large and a query comparison is expensive then the efficiency requirement of managing algorithms is minimizing the number of query comparisons. We consider updating operation for network model database management systems. We develop a new sequantial algorithm for updating operation. Also we suggest a distributed version of the algorithm.

  17. Temporal Databases in Network Management

    National Research Council Canada - National Science Library

    Gupta, Ajay

    1998-01-01

    .... This thesis discusses issues involved with performing network management, specifically with means of reducing and storing the large quantity of data that networks management tools and systems generate...

  18. Databases spatiotemporal taxonomy with moving objects. Theme review

    Directory of Open Access Journals (Sweden)

    Sergio Alejandro Rojas Barbosa

    2018-01-01

    Full Text Available Context: In the last decade, databases have evolved so much that we no longer speak only of spatial databases, but also of spatial and temporal databases. This means that the event or record has a spatial or localization variable and a temporality variable, which allows updating the previously stored record. Method: This paper presents a bibliographic review about concepts, spatio-temporal data models, specifically the models of data in movement. Results: Taxonomic considerations of the queries are presented in the models of data in movement, according to the persistence of the query (time, location, movement, object and patterns, as well as the different proposals of indexes and structures. Conclusions: The implementation of model proposals, such as indexes and structures, can lead to standardization problems. This is why it should be standardized under the standards and standards of the OGC (Open Geospatial Consortium.

  19. Building a genome database using an object-oriented approach.

    Science.gov (United States)

    Barbasiewicz, Anna; Liu, Lin; Lang, B Franz; Burger, Gertraud

    2002-01-01

    GOBASE is a relational database that integrates data associated with mitochondria and chloroplasts. The most important data in GOBASE, i. e., molecular sequences and taxonomic information, are obtained from the public sequence data repository at the National Center for Biotechnology Information (NCBI), and are validated by our experts. Maintaining a curated genomic database comes with a towering labor cost, due to the shear volume of available genomic sequences and the plethora of annotation errors and omissions in records retrieved from public repositories. Here we describe our approach to increase automation of the database population process, thereby reducing manual intervention. As a first step, we used Unified Modeling Language (UML) to construct a list of potential errors. Each case was evaluated independently, and an expert solution was devised, and represented as a diagram. Subsequently, the UML diagrams were used as templates for writing object-oriented automation programs in the Java programming language.

  20. Nuclear power plant reliability database management

    International Nuclear Information System (INIS)

    Meslin, Th.; Aufort, P.

    1996-04-01

    In the framework of the development of a probabilistic safety project on site (notion of living PSA), Saint Laurent des Eaux NPP implements a specific EDF reliability database. The main goals of this project at Saint Laurent des Eaux are: to expand risk analysis and to constitute an effective local basis of thinking about operating safety by requiring the participation of all departments of a power plant: analysis of all potential operating transients, unavailability consequences... that means to go further than a simple culture of applying operating rules; to involve nuclear power plant operators in experience feedback and its analysis, especially by following up behaviour of components and of safety functions; to allow plant safety managers to outline their decisions facing safety authorities for notwithstanding, preventive maintenance programme, operating incident evaluation. To hit these goals requires feedback data, tools, techniques and development of skills. The first step is to obtain specific reliability data on the site. Raw data come from plant maintenance management system which processes all maintenance activities and keeps in memory all the records of component failures and maintenance activities. Plant specific reliability data are estimated with a Bayesian model which combines these validated raw data with corporate generic data. This approach allow to provide reliability data for main components modelled in PSA, to check the consistency of the maintenance program (RCM), to verify hypothesis made at the design about component reliability. A number of studies, related to components reliability as well as decision making process of specific incident risk evaluation have been carried out. This paper provides also an overview of the process management set up on site from raw database to specific reliability database in compliance with established corporate objectives. (authors). 4 figs

  1. Nuclear power plant reliability database management

    Energy Technology Data Exchange (ETDEWEB)

    Meslin, Th [Electricite de France (EDF), 41 - Saint-Laurent-des-Eaux (France); Aufort, P

    1996-04-01

    In the framework of the development of a probabilistic safety project on site (notion of living PSA), Saint Laurent des Eaux NPP implements a specific EDF reliability database. The main goals of this project at Saint Laurent des Eaux are: to expand risk analysis and to constitute an effective local basis of thinking about operating safety by requiring the participation of all departments of a power plant: analysis of all potential operating transients, unavailability consequences... that means to go further than a simple culture of applying operating rules; to involve nuclear power plant operators in experience feedback and its analysis, especially by following up behaviour of components and of safety functions; to allow plant safety managers to outline their decisions facing safety authorities for notwithstanding, preventive maintenance programme, operating incident evaluation. To hit these goals requires feedback data, tools, techniques and development of skills. The first step is to obtain specific reliability data on the site. Raw data come from plant maintenance management system which processes all maintenance activities and keeps in memory all the records of component failures and maintenance activities. Plant specific reliability data are estimated with a Bayesian model which combines these validated raw data with corporate generic data. This approach allow to provide reliability data for main components modelled in PSA, to check the consistency of the maintenance program (RCM), to verify hypothesis made at the design about component reliability. A number of studies, related to components reliability as well as decision making process of specific incident risk evaluation have been carried out. This paper provides also an overview of the process management set up on site from raw database to specific reliability database in compliance with established corporate objectives. (authors). 4 figs.

  2. The Management Object in Risk Management Approaches

    DEFF Research Database (Denmark)

    Christiansen, Ulrik

    Using a systematic review of the last 55 years of research within risk management this paper explores how risk management as a management technology (methodologies, tools and frameworks to mitigate or manage risks) singles out risks as an object for management in order to make action possible....... The paper synthesise by developing a framework of how different views on risk management enable and constrain the knowledge about risk and thus frame the possibilities to measure, analyse and calculate uncertainty and risk. Inspired by social studies of finance and accounting, the paper finally develops...... three propositions that illustrate how the framing of risk establishes a boundary for how managers might understand value creation and the possible future and how this impacts the possible responses to risk....

  3. Motivational Objects in Natural Scenes (MONS): A Database of >800 Objects.

    Science.gov (United States)

    Schomaker, Judith; Rau, Elias M; Einhäuser, Wolfgang; Wittmann, Bianca C

    2017-01-01

    In daily life, we are surrounded by objects with pre-existing motivational associations. However, these are rarely controlled for in experiments with natural stimuli. Research on natural stimuli would therefore benefit from stimuli with well-defined motivational properties; in turn, such stimuli also open new paths in research on motivation. Here we introduce a database of Motivational Objects in Natural Scenes (MONS). The database consists of 107 scenes. Each scene contains 2 to 7 objects placed at approximately equal distance from the scene center. Each scene was photographed creating 3 versions, with one object ("critical object") being replaced to vary the overall motivational value of the scene (appetitive, aversive, and neutral), while maintaining high visual similarity between the three versions. Ratings on motivation, valence, arousal and recognizability were obtained using internet-based questionnaires. Since the main objective was to provide stimuli of well-defined motivational value, three motivation scales were used: (1) Desire to own the object; (2) Approach/Avoid; (3) Desire to interact with the object. Three sets of ratings were obtained in independent sets of observers: for all 805 objects presented on a neutral background, for 321 critical objects presented in their scene context, and for the entire scenes. On the basis of the motivational ratings, objects were subdivided into aversive, neutral, and appetitive categories. The MONS database will provide a standardized basis for future studies on motivational value under realistic conditions.

  4. Serialization and persistent objects turning data structures into efficient databases

    CERN Document Server

    Soukup, Jiri

    2014-01-01

    Recently, the pressure for fast processing and efficient storage of large data with complex?relations increased beyond the capability of traditional databases. Typical examples include iPhone applications, computer aided design - both electrical and mechanical, biochemistry applications, and incremental compilers. Serialization, which is sometimes used in such situations is notoriously tedious and error prone.In this book, Jiri Soukup and Petr Macha?ek show in detail how to write programs which store their internal data automatically and transparently to disk. Together with special data structure libraries which treat relations among objects as first-class entities, and with a UML class-diagram generator, the core application code is much simplified. The benchmark chapter shows a typical example where persistent data is faster by the order of magnitude than with a traditional database, in both traversing and accessing the data.The authors explore and exploit advanced features of object-oriented languages in a...

  5. Motivational Objects in Natural Scenes (MONS: A Database of >800 Objects

    Directory of Open Access Journals (Sweden)

    Judith Schomaker

    2017-09-01

    Full Text Available In daily life, we are surrounded by objects with pre-existing motivational associations. However, these are rarely controlled for in experiments with natural stimuli. Research on natural stimuli would therefore benefit from stimuli with well-defined motivational properties; in turn, such stimuli also open new paths in research on motivation. Here we introduce a database of Motivational Objects in Natural Scenes (MONS. The database consists of 107 scenes. Each scene contains 2 to 7 objects placed at approximately equal distance from the scene center. Each scene was photographed creating 3 versions, with one object (“critical object” being replaced to vary the overall motivational value of the scene (appetitive, aversive, and neutral, while maintaining high visual similarity between the three versions. Ratings on motivation, valence, arousal and recognizability were obtained using internet-based questionnaires. Since the main objective was to provide stimuli of well-defined motivational value, three motivation scales were used: (1 Desire to own the object; (2 Approach/Avoid; (3 Desire to interact with the object. Three sets of ratings were obtained in independent sets of observers: for all 805 objects presented on a neutral background, for 321 critical objects presented in their scene context, and for the entire scenes. On the basis of the motivational ratings, objects were subdivided into aversive, neutral, and appetitive categories. The MONS database will provide a standardized basis for future studies on motivational value under realistic conditions.

  6. Accelerator operation management using objects

    Energy Technology Data Exchange (ETDEWEB)

    Nishimura, H.; Timossi, C.; Valdez, M.

    1995-04-01

    Conflicts over control of shared devices or resources in an accelerator control system, and problems that can occur due to applications performing conflicting operations, are usually resolved by accelerator operators. For these conflicts to be detected by the control system, a model of accelerator operation must be available to the system. The authors present a design for an operation management system addressing the issues of operations management using the language of Object-Oriented Design (OOD). A possible implementation using commercially available software tools is also presented.

  7. Accelerator operation management using objects

    International Nuclear Information System (INIS)

    Nishimura, H.; Timossi, C.; Valdez, M.

    1995-01-01

    Conflicts over control of shared devices or resources in an accelerator control system, and problems that can occur due to applications performing conflicting operations, are usually resolved by accelerator operators. For these conflicts to be detected by the control system, a model of accelerator operation must be available to the system. The authors present a design for an operation management system addressing the issues of operations management using the language of Object-Oriented Design (OOD). A possible implementation using commercially available software tools is also presented

  8. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective in such systems is to provide users with capabilities to search, access and play back distributed stored video data in a user-friendly way, just as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the contents of video data are very hard to extract automatically and need to be annotated by humans. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad-hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structure (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We then present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes), and describe experiments using SIRSALE on an archive of news videos and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  9. Management of virtualized infrastructure for physics databases

    International Nuclear Information System (INIS)

    Topurov, Anton; Gallerani, Luigi; Chatal, Francois; Piorkowski, Mariusz

    2012-01-01

    Demands for information storage of physics metadata are rapidly increasing, together with the requirements for its high availability. Most HEP laboratories are struggling to squeeze more from their computer centers, and thus focus on virtualizing available resources. CERN started investigating database virtualization in early 2006, first by testing database performance and stability on native Xen. Since then we have been closely evaluating the constantly evolving functionality of virtualization solutions for the database and middle tier, together with the associated management applications: Oracle's Enterprise Manager and VM Manager. This session will detail our long experience in dealing with virtualized environments, focusing on the newest Oracle OVM 3.0 for x86 and on Oracle Enterprise Manager functionality for efficiently managing a virtualized database infrastructure.

  10. Study on managing EPICS database using ORACLE

    International Nuclear Information System (INIS)

    Liu Shu; Wang Chunhong; Zhao Jijiu

    2007-01-01

    EPICS is used as a development toolkit of the BEPCII control system. The core of EPICS is a distributed database residing in front-end machines. The distributed database is usually created by tools such as VDCT and text editors on the host, then loaded to front-end target IOCs through the network. In the BEPCII control system there are about 20,000 signals, which are distributed over more than 20 IOCs. All the databases are developed by device control engineers using VDCT or text editors; there are no uniform tools providing transparent management. The paper first presents the current status of EPICS database management in many labs. Second, it studies the EPICS database and the interface between ORACLE and the EPICS database. Finally, it introduces the software development and its application in the BEPCII control system. (authors)

  11. Recent developments and object-oriented approach in FTU database

    International Nuclear Information System (INIS)

    Bertocchi, A.; Bracco, G.; Buceti, G.; Centioli, C.; Iannone, F.; Manduchi, G.; Nanni, U.; Panella, M.; Stracuzzi, C.; Vitale, V.

    2001-01-01

    During the last two years, the experimental database of the Frascati Tokamak Upgrade (FTU) has changed from several points of view, particularly: (i) the data and the analysis codes have been moved from the IBM mainframe to Unix platforms, enabling the users to take advantage of the large quantities of commercial and free software available under Unix (Matlab, IDL, etc.); (ii) AFS (Andrew File System) has been chosen as the distributed file system, making the data available on all the nodes and distributing the workload; (iii) a 'one measure/one file' philosophy (vs. the previous 'one pulse/one file') has been adopted, increasing the number of files in the database but, at the same time, allowing the most important data to be available just after the plasma discharge. The client-server architecture has been tested using the signal viewer client jScope. Moreover, an object-oriented data model (OODM) of FTU experimental data has been tried: a generalized model of tokamak experimental data has been developed with typical concepts such as abstraction, encapsulation, inheritance, and polymorphism. The model has been integrated with data coming from different databases, building an object warehouse to extract, with data mining techniques, meaningful trends and patterns from huge amounts of data.

  12. Using Online Databases in Corporate Issues Management.

    Science.gov (United States)

    Thomsen, Steven R.

    1995-01-01

    Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)

  13. Distributed Database Management Systems A Practical Approach

    CERN Document Server

    Rahimi, Saeed K

    2010-01-01

    This book addresses issues related to managing data across a distributed database system. It is unique because it covers traditional database theory and current research, explaining the difficulties in providing a unified user interface and global data dictionary. The book gives implementers guidance on hiding discrepancies across systems and creating the illusion of a single repository for users. It also includes three sample frameworks (implemented using J2SE with JMS, J2EE, and Microsoft .NET) that readers can use to learn how to implement a distributed database management system. IT and

  14. Clinical Views: Object-Oriented Views for Clinical Databases

    Science.gov (United States)

    Portoni, Luisa; Combi, Carlo; Pinciroli, Francesco

    1998-01-01

    We present here a prototype of a clinical information system for the archiving and the management of multimedia and temporally-oriented clinical data related to PTCA patients. The system is based on an object-oriented DBMS and supports multiple views and view schemas on patients' data. Remote data access is supported too.

  15. The ATLAS Distributed Data Management System & Databases

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...

  16. Experience using a distributed object oriented database for a DAQ system

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    To configure the RD13 data acquisition system, we need many parameters which describe the various hardware and software components. Such information has been defined using an entity-relation model and stored in a commercial memory-resident database. During the last year, Itasca, an object-oriented database management system (OODB), was chosen as a replacement database system. We have ported the existing databases (hardware and software configurations, run parameters, etc.) to Itasca and integrated it with the run control system. We believe that it is possible to use an OODB in real-time environments such as DAQ systems. In this paper, we present our experience and impressions: why we wanted to change from an entity-relational approach, some useful features of Itasca, and the issues we met during this project, including integration of the database into an existing distributed environment and factors which influence performance. (author)

  17. ROME (Request Object Management Environment)

    Science.gov (United States)

    Kong, M.; Good, J. C.; Berriman, G. B.

    2005-12-01

    Most current astronomical archive services are based on an HTML/CGI architecture where users submit HTML forms via a browser and CGI programs operating under a web server process the requests. Most services return an HTML result page with URL links to the result files or, for longer jobs, return a message indicating that email will be sent when the job is done. This paradigm has a few serious shortcomings. First, it is all too common for something to go wrong and for the user to never hear about the job again. Second, for long and complicated jobs there is often important intermediate information that would allow the user to adjust the processing. Finally, unless some sort of custom queueing mechanism is used, background jobs are started immediately upon receiving the CGI request. When there are many such requests, the server machine can easily be overloaded and either slow to a crawl or crash. The Request Object Management Environment (ROME) is a collection of middleware components being developed under the National Virtual Observatory project to provide mechanisms for managing long jobs such as computationally intensive statistical analysis requests or the generation of large-scale mosaic images. Written as EJB objects within the open-source JBoss application server, ROME receives processing requests via a servlet interface, stores them in a DBMS using JDBC, distributes the processing (via queueing mechanisms) across multiple machines and environments (including Grid resources), manages real-time messages from the processing modules, and ensures proper user notification. The request processing modules are identical in structure to standard CGI programs -- though they can optionally implement status messaging -- and can be written in any language. ROME will persist these jobs across failures of processing modules, network outages, and even downtime of ROME and the DBMS, restarting them as necessary.
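The persistence-and-restart behaviour described above can be sketched generically. The real ROME uses EJB and JDBC inside JBoss; this self-contained illustration substitutes SQLite, and the table layout and function names are invented:

```python
import sqlite3

# Minimal sketch of ROME-style request persistence. Jobs survive process
# restarts because their state lives in the database, not in memory.
# Schema and names are illustrative assumptions, not ROME's actual design.

def init(conn):
    conn.execute("""CREATE TABLE IF NOT EXISTS requests (
        id INTEGER PRIMARY KEY,
        module TEXT NOT NULL,
        status TEXT NOT NULL DEFAULT 'queued')""")

def submit(conn, module):
    cur = conn.execute("INSERT INTO requests (module) VALUES (?)", (module,))
    conn.commit()
    return cur.lastrowid

def restart_orphans(conn):
    # On startup, requeue jobs that were mid-flight when the server died.
    conn.execute("UPDATE requests SET status='queued' WHERE status='running'")
    conn.commit()

conn = sqlite3.connect(":memory:")
init(conn)
job = submit(conn, "mosaic")
conn.execute("UPDATE requests SET status='running' WHERE id=?", (job,))
restart_orphans(conn)  # simulate recovery after a crash
status = conn.execute("SELECT status FROM requests WHERE id=?", (job,)).fetchone()[0]
```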

  18. A Persistent Object Manager for HEP

    CERN Multimedia

    Arderiu ribera, E; Couet, O; Duellmann, D; Conde gonzalez carrascosa, F J; Shiers, J; Ferrero merlino, B; Baranski, A

    2002-01-01

    RD45 is currently investigating solutions to the problem of storing, managing and accessing the extremely large volumes of data that will be created at the LHC where, given the anticipated lifetime and data rates, a system capable of scaling to approximately 100 PB is required. The project places strong emphasis on the use of industry standard solutions wherever possible, and is currently focussing on the potential use of commercial standards-conforming Object Database Management Systems transparently coupled to Mass Storage Systems. Production use of components of what could eventually become the solution for LHC has already been made by existing experiments at CERN, and it is planned to gradually increase the amount of physics data handled by the system into the multi-TB range and beyond over the next years.

  19. Object-oriented modeling and design of database federations

    NARCIS (Netherlands)

    Balsters, H.

    2003-01-01

    We describe a logical architecture and a general semantic framework for precise specification of so-called database federations. A database federation provides for tight coupling of a collection of heterogeneous component databases into a global integrated system. Our approach to database federation

  20. From document to database: modernizing requirements management

    International Nuclear Information System (INIS)

    Giajnorio, J.; Hamilton, S.

    2007-01-01

    The creation, communication, and management of design requirements are central to the successful completion of any large engineering project, both technically and commercially. Design requirements in the Canadian nuclear industry are typically numbered lists in multiple documents created using word processing software. As an alternative, GE Nuclear Products implemented a central requirements management database for a major project at Bruce Power. The database configured the off-the-shelf software product, Telelogic Doors, to GE's requirements structure. This paper describes the advantages realized by this scheme. Examples include traceability from customer requirements through to test procedures, concurrent engineering, and automated change history. (author)

  1. Plant operation data collection and database management using NIC system

    International Nuclear Information System (INIS)

    Inase, S.

    1990-01-01

    The Nuclear Information Center (NIC), a division of the Central Research Institute of Electric Power Industry, collects nuclear power plant operation and maintenance information both in Japan and abroad and transmits the information to all domestic utilities so that it can be effectively utilized for safe plant operation and reliability enhancement. The collected information is entered into the database system after being keyworded by NIC. The database system, the Nuclear Information database/Communication System (NICS), has been developed by NIC for the storage and management of collected information. The keywords serve for retrieval and for classification by keyword category.

  2. The OKS persistent in-memory object manager

    International Nuclear Information System (INIS)

    Jones, R.; Mapelli, L.; Soloviev, I.

    1998-01-01

    The OKS (Object Kernel Support) is a library to support a simple, active persistent in-memory object manager. It is suitable for applications which need to create persistent structured information with fast access but do not require full database functionality. It can be used as the frame of configuration databases and real-time object managers for Data Acquisition and Detector Control Systems in such fields as setup, diagnostics and general configuration description. OKS is based on an object model that supports objects, classes, associations, methods, inheritance, polymorphism, object identifiers, composite objects, integrity constraints, schema evolution, data migration and active notification. OKS stores the class definitions and their instances in portable ASCII files. It provides query facilities, including indices support. The OKS has a C++ API (Application Program Interface) and includes Motif-based GUI applications to design class schema and to manipulate objects. OKS has been developed on top of the Rogue Wave Tools.h++ C++ class library.
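As a rough analogy to the OKS idea of keeping class schemas and object instances in memory and dumping them to portable ASCII files, one might sketch the following (in Python rather than OKS's C++, with invented names throughout; this is not the OKS API):

```python
import io
import json

# Toy analogy of a persistent in-memory object manager: class schemas and
# instances are held in memory and serialized to a portable ASCII (here
# JSON) file. All class, attribute, and object names are invented.

class ObjectKernel:
    def __init__(self):
        self.schema = {}    # class name -> list of attribute names
        self.objects = {}   # object id -> (class name, attribute dict)

    def define_class(self, name, attributes):
        self.schema[name] = list(attributes)

    def create(self, oid, cls, **attrs):
        # A crude integrity constraint: attributes must match the schema.
        assert set(attrs) <= set(self.schema[cls]), "schema violation"
        self.objects[oid] = (cls, attrs)

    def dump(self, fp):
        json.dump({"schema": self.schema, "objects": self.objects}, fp)

kernel = ObjectKernel()
kernel.define_class("Crate", ["slot_count", "location"])
kernel.create("crate-1", "Crate", slot_count=21, location="USA15")
buf = io.StringIO()
kernel.dump(buf)  # in OKS this would be a portable ASCII file on disk
```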

  3. The Cocoa Shop: A Database Management Case

    Science.gov (United States)

    Pratt, Renée M. E.; Smatt, Cindi T.

    2015-01-01

    This is an example of a real-world applicable case study, which includes background information on a small local business (i.e., TCS), description of functional business requirements, and sample data. Students are asked to design and develop a database to improve the management of the company's customers, products, and purchases by emphasizing…

  4. Database to manage personal dosimetry Hospital Universitario de La Ribera

    International Nuclear Information System (INIS)

    Melchor, M.; Martinez, D.; Asensio, M.; Candela, F.; Camara, A.

    2011-01-01

    For the management of the dosimetry of occupationally exposed personnel, data on the use and return of dosimeters are required. In the Department of Radio Physics and Radiation Protection we have designed and implemented a database for managing the personal dosimetry of the Hospital and of the Area Health Centers. The specific objectives were to easily import dosimetry data from the National Dosimetry Center, to consult dosimetry records in a simple way, to allow rotating dosimeters to be handled, and to obtain reports for different periods of time showing return data for users, services, etc.

  5. Implementation of the Multidimensional Modeling Concepts into Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available A key to survival in the business world is being able to analyze, plan and react to changing business conditions as fast as possible. With multidimensional models the managers can explore information at different levels of granularity and the decision makers at all levels can quickly respond to changes in the business climate: the ultimate goal of business intelligence. This paper focuses on the implementation of the multidimensional concepts into object-relational databases.
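The granularity idea the abstract describes can be shown in miniature; plain Python stands in for an object-relational DBMS, and the dimensions and sales figures below are invented for illustration:

```python
from collections import defaultdict

# Invented fact data: (region, city, amount). Exploring the same facts at
# a coarse (region) or fine (city) level is the multidimensional roll-up.
sales = [
    ("North", "Oslo", 100),
    ("North", "Bergen", 50),
    ("South", "Rome", 70),
]

def roll_up(facts, level):
    """Aggregate the amount over one dimension:
    level=0 groups by region (coarse), level=1 by city (fine)."""
    totals = defaultdict(int)
    for row in facts:
        totals[row[level]] += row[2]
    return dict(totals)

by_region = roll_up(sales, 0)
by_city = roll_up(sales, 1)
```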

  6. An Object-Relational Ifc Storage Model Based on Oracle Database

    Science.gov (United States)

    Li, Hang; Liu, Hua; Liu, Yong; Wang, Yuan

    2016-06-01

    As building models become increasingly complicated, the level of collaboration across professionals attracts more attention in the architecture, engineering and construction (AEC) industry. To adapt to this change, buildingSMART developed the Industry Foundation Classes (IFC) to facilitate interoperability between software platforms. However, IFC data are currently shared in the form of text files, which has drawbacks. In this paper, considering the object-based inheritance hierarchy of IFC and the storage features of different database management systems (DBMS), we propose a novel object-relational storage model that uses an Oracle database to store IFC data. First, we establish the mapping rules between data types in the IFC specification and the Oracle database. Second, we design the IFC database according to the relationships among IFC entities. Third, we parse the IFC file and extract the IFC data. Finally, we store the IFC data into the corresponding tables in the IFC database. In our experiments, three different building models are selected to demonstrate the effectiveness of our storage model. The comparison of experimental statistics shows that IFC data are lossless during data exchange.
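The four-step pipeline listed in the abstract (mapping rules, schema design, parsing, storage) can be hinted at with a compressed sketch. SQLite stands in for Oracle so the example is self-contained, the table layout is an assumption rather than the paper's schema, and real IFC (STEP) parsing is omitted:

```python
import sqlite3

# Sketch of storing IFC entity data in relational tables. The IfcWall
# entity name comes from the IFC specification; the two-column schema is
# an invented simplification of the paper's object-relational model.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE IfcWall (
    global_id TEXT PRIMARY KEY,
    name TEXT)""")

# A parsed STEP line such as
#   #12=IFCWALL('2O2Fr$t4X7Zf8NOew3FL9r',$,'Wall-01',...);
# would be reduced to its attributes and inserted into the matching table:
conn.execute("INSERT INTO IfcWall VALUES (?, ?)",
             ("2O2Fr$t4X7Zf8NOew3FL9r", "Wall-01"))
count = conn.execute("SELECT COUNT(*) FROM IfcWall").fetchone()[0]
```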

  7. Selection of nuclear power information database management system

    International Nuclear Information System (INIS)

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    In the condition of the present database technology, in order to build the Chinese nuclear power information database (NPIDB) in the nuclear industry system efficiently at a high starting point, an important task is to select a proper database management system (DBMS), which is the hinge of the matter to build the database successfully. Therefore, this article explains how to build a practical information database about nuclear power, the functions of different database management systems, the reason of selecting relation database management system (RDBMS), the principles of selecting RDBMS, the recommendation of ORACLE management system as the software to build database and so on

  8. Storage and Database Management for Big Data

    Science.gov (United States)

    2015-07-27

    cloud models that satisfy different problem ... Enterprise Big Data - Interactive - On-demand - Virtualization - Java ... replication. Data loss can only occur if three drives fail prior to any one of the failures being corrected. Hadoop is written in Java and is installed in a ... visible view into a dataset. There are many popular database management systems such as MySQL [4], PostgreSQL [63], and Oracle [5]. Most commonly

  9. SPIRE Data-Base Management System

    Science.gov (United States)

    Fuechsel, C. F.

    1984-01-01

    The Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. Its data bases are typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures to be stored in forms suitable for direct analytical computation. The SPIRE DBMS is designed to support data requests from interactive users as well as from applications programs.

  10. Managed Objects for Infrastructure Data

    DEFF Research Database (Denmark)

    Kjems, Erik; Bodum, Lars; Kolar, Jan

    2009-01-01

    Using data objects to describe features in the real world is a new idea and several approaches have already been shown to match scientific paradigms exceedingly well. Depending on the required level of abstraction, it is possible to represent the world more or less close to reality. In the realm ...

  11. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    Gagliano, L.; MacLeod, T.; Mason, S.; Percy, T.; Prescott, J.

    The Space Asset Management Database (SAM-D) was implemented in order to effectively track known objects in space by ingesting information from a variety of databases and performing calculations to determine the expected position of the object at a specified time. While SAM-D performs this task very well, it is limited by technology and is not available outside of the local user base. Modeling and simulation can be powerful tools to exploit the information contained in SAM-D. However, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. A more capable data management infrastructure would extend SAM-D to support the larger data sets to be generated by the COI. A service-oriented architecture model will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and a user interface for visualizations. Based on a web-centric approach, the entire COI will be able to access the data and related analytics. In addition, tight control of the information sharing policy will increase confidence in the system, which would encourage industry partners to provide commercial data. SIMON is a Government off-the-Shelf information sharing platform in use throughout DoD and DHS information sharing and situational awareness communities. SIMON provides fine-grained control to data owners, allowing them to determine exactly how and when their data are shared. SIMON supports a micro-service approach to system development, meaning M&S and analytic services can be easily built or adapted. It is uniquely positioned to fill this need as an information-sharing platform with a proven track record of successful situational awareness system deployments. Combined with the integration of new and legacy M&S tools, a SIMON-based architecture will provide a robust SA environment for the NASA SA COI that can be extended and expanded indefinitely.

  12. Tools for Managing Repository Objects

    OpenAIRE

    Banker, Rajiv D.; Isakowitz, Tomas; Kauffman, Robert J.; Kumar, Rachna; Zweig, Dani

    1993-01-01

    Working Paper Series: STERN IS-93-46. The past few years have seen the introduction of repository-based computer-aided software engineering (CASE) tools which may finally enable us to develop software which is reliable and affordable. With the new tools come new challenges for management: repository-based CASE changes software development to such an extent that traditional approaches to estimation, performance, and productivity assessment may no longer suffice - if they ever...

  13. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B.; Sánchez, G.

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  14. San Jacinto Tries Management by Objectives

    Science.gov (United States)

    Deegan, William

    1974-01-01

    San Jacinto, California, has adopted a measurable institutional objectives approach to management by objectives. Results reflect not only improved cost effectiveness of community college education but also more effective educational programs for students. (Author/WM)

  15. Accessing OSI Managed Objects from ANSAware

    OpenAIRE

    Genilloud, Guy; Gay, David

    1995-01-01

    This paper presents a mechanism allowing an ODP compliant distributed system, ANSA, to access OSI network management objects as if they were ANSA objects. It defines a mapping from the OSI object model to the ANSA object model, and it specifies how an adapter implements this mapping.

  16. STRATEGIC MANAGEMENT OBJECT AS AN OBJECT OF SCIENTIFIC RESEARCH

    Directory of Open Access Journals (Sweden)

    Mykola Bondar

    2015-11-01

    Full Text Available The purpose of the research is to highlight the main areas of the system of strategic management accounting and to improve the principles on which it operates. The subject of the research is the theoretical and practical aspects of the functioning and development of strategic management accounting. The subject area is focused on strategic management information support for the implementation of the principle of balancing the activity of entities. The objectives of the research are to determine the place and role of strategic management accounting in the creation of an information infrastructure for management in the current economic conditions, and to decompose the problems of, improve the functioning of, and prioritize the development of the system of strategic management accounting. The hypothesis of the research is based on the assumption that effective management of entities, adapted to the needs of the market environment, requires complete, accurate and timely information, which is formed in a properly organized system of strategic management accounting. The methodology is based on the analysis of data from respondents at 125 industrial entities of the Kharkiv region. Data were collected through direct surveys and in the preparation of the Kharkiv Oblast Development Strategy for the period until 2020. Respondents were asked a number of questions that determine the results of the system of information support of strategic management, and the direction of the system of strategic management accounting, in the enterprises employing the respondents. Important sources of information for making strategic management decisions were evaluated by means of expert assessments. The general research methodology is based on a systematic approach. Conclusion. During the research, the role and importance of strategic management accounting information for the purposes of strategic management were confirmed. The results outline the challenges facing the leaders of

  17. Administrative Management by Objectives. Policy 2100.

    Science.gov (United States)

    East Allen County Schools, New Haven, IN.

    Management-by-objectives (MBO) focuses attention on objectives stated as end accomplishments rather than the activities which bring about those accomplishments. MBO identifies eight major areas of management which become involved in the process: (1) planning, (2) performance appraisal, (3) individual motivation, (4) coordination, (5) control, (6)…

  18. Development of operation management database for research reactors

    International Nuclear Information System (INIS)

    Zhang Xinjun; Chen Wei; Yang Jun

    2005-01-01

    An operation database for a pulsed reactor has been developed on the platform of Microsoft Visual C++ 6.0. This database includes four function modules: fuel element management, incident management, experiment management and file management. It is essential for reactor security and information management. (authors)

  19. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  20. Implementation of a database for the management of radioactive sources

    International Nuclear Information System (INIS)

    MOHAMAD, M.

    2012-01-01

    In Madagascar, the application of nuclear technology continues to develop. In order to protect human health and the environment against the harmful effects of ionizing radiation, each user of radioactive sources has to implement a program of nuclear security and safety and to declare their sources to the Regulatory Authority. This Authority must have access to all the information relating to all the sources and their uses. This work is based on the development of software using Python as the programming language and SQLite as the database. It makes it possible to computerize the management of radioactive sources. This application unifies the various existing databases and centralizes the activities of radioactive source management. The objective is to follow the movement of each source in the Malagasy territory in order to avoid the risks related to the use of radioactive sources and illicit traffic. [fr]
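In the spirit of the Python/SQLite design the abstract describes, a minimal source-tracking sketch might look as follows; the schema, field names, and example data are assumptions, since the paper does not publish its actual design:

```python
import sqlite3

# Hypothetical schema for following radioactive source movements. The goal,
# as stated in the abstract, is to track each source across the territory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sources (
    id INTEGER PRIMARY KEY,
    isotope TEXT NOT NULL,
    holder TEXT NOT NULL);
CREATE TABLE movements (
    source_id INTEGER REFERENCES sources(id),
    moved_to TEXT NOT NULL,
    moved_on TEXT NOT NULL);
""")

# Register an example source and record a transfer (invented data).
conn.execute("INSERT INTO sources VALUES (1, 'Co-60', 'Hospital A')")
conn.execute("INSERT INTO movements VALUES (1, 'Hospital B', '2012-03-01')")

# The most recent location of source 1:
last = conn.execute("""SELECT moved_to FROM movements
                       WHERE source_id = 1
                       ORDER BY moved_on DESC""").fetchone()[0]
```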

  1. Automatic Verification of Transactions on an Object-Oriented Database

    NARCIS (Netherlands)

    Spelt, D.; Balsters, H.

    1998-01-01

    In the context of the object-oriented data model, a compile-time approach is given that provides for a significant reduction of the amount of run-time transaction overhead due to integrity constraint checking. The higher-order logic Isabelle theorem prover is used to automatically prove which

  2. NGNP Risk Management Database: A Model for Managing Risk

    International Nuclear Information System (INIS)

    Collins, John

    2009-01-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool's design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  3. NGNP Risk Management Database: A Model for Managing Risk

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  4. A Survey of Object-Oriented Database Technology

    Science.gov (United States)

    1990-05-01

    We now mention briefly the various security and authorization schemes provided by GemStone. 1. Login authorization: there are two ways to log in to GemStone, through the OPAL programming environment or through the GemStone C interface; a user ID and password is required in both cases. 2. Name... A. Black. Object structure in the Emerald system. Proc. 1st Intl. Conf. on Object-Oriented Programming Systems, Languages and Applications, pp

  5. Radioactive waste management - objectives and practices

    International Nuclear Information System (INIS)

    Ali, S.S.

    2002-01-01

    This article deals with the objectives, the legal frameworks, regulations and the regulating authorities in India, and also with the technologies and practices being used for the safe management of radioactive wastes in the country

  6. Strategic Management Accounting Corporate Objective and ...

    African Journals Online (AJOL)

    No organisation operates without a focus, and this focus can be termed an objective or goal, which should be clearly stated. This study therefore looked at strategic management accounting, corporate strategy and production objectives. The study samples are selected manufacturing firms in Port Harcourt. Questionnaires were ...

  7. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  8. Access database application in medical treatment management platform

    International Nuclear Information System (INIS)

    Wu Qingming

    2014-01-01

    For timely, accurate and flexible access to medical expense data, we applied the Microsoft Access 2003 database management software to establish a management platform for medical expenses. With this platform, overall hospital medical expenses can be controlled through real-time monitoring. Using the Access database management platform for medical expenses not only changes the management model, but also promotes a sound management system for medical expenses. (authors)

  9. Computerized nuclear material database management system for power reactors

    International Nuclear Information System (INIS)

    Cheng Binghao; Zhu Rongbao; Liu Daming; Cao Bin; Liu Ling; Tan Yajun; Jiang Jincai

    1994-01-01

    The software packages for nuclear material database management for power reactors are described. The database structure, data flow and the model for management of the database are analysed. Also described are the main functions and characteristics of the software packages, which have been successfully installed and are used at both the Daya Bay Nuclear Power Plant and the Qinshan Nuclear Power Plant for the purpose of handling the nuclear material database automatically

  10. Setting objectives for managing Key deer

    Science.gov (United States)

    Diefenbach, Duane R.; Wagner, Tyler; Stauffer, Glenn E.

    2014-01-01

    The U.S. Fish and Wildlife Service (FWS) is responsible for the protection and management of Key deer (Odocoileus virginianus clavium) because the species is listed as Endangered under the Endangered Species Act (ESA). The purpose of the ESA is to protect and recover imperiled species and the ecosystems upon which they depend. There are a host of actions that could possibly be undertaken to recover the Key deer population, but without a clearly defined problem and stated objectives it can be difficult to compare and evaluate alternative actions. In addition, management goals and the acceptability of alternative management actions are inherently linked to stakeholders, who should be engaged throughout the process of developing a decision framework. The purpose of this project was to engage a representative group of stakeholders to develop a problem statement that captured the management problem the FWS must address with Key deer and identify objectives that, if met, would help solve the problem. In addition, the objectives were organized in a hierarchical manner (i.e., an objectives network) to show how they are linked, and measurable attributes were identified for each objective. We organized a group of people who represented stakeholders interested in and potentially affected by the management of Key deer. These stakeholders included individuals who represented local, state, and federal governments, non-governmental organizations, the general public, and local businesses. This stakeholder group met five full days over the course of an eight-week period to identify objectives that would address the following problem:“As recovery and removal from the Endangered Species list is the purpose of the Endangered Species Act, the U.S. Fish and Wildlife Service needs a management approach that will ensure a sustainable, viable, and healthy Key deer population. 
Urbanization has affected the behavior and population dynamics of the Key deer and the amount and characteristics

  11. Portuguese food composition database quality management system.

    Science.gov (United States)

    Oliveira, L M; Castanheira, I P; Dantas, M A; Porto, A A; Calhau, M A

    2010-11-01

    The harmonisation of food composition databases (FCDB) has been a recognised need among users, producers and stakeholders of food composition data (FCD). To reach harmonisation of FCDBs among the national compiler partners, the European Food Information Resource (EuroFIR) Network of Excellence set up a series of guidelines and quality requirements, together with recommendations to implement quality management systems (QMS) in FCDBs. The Portuguese National Institute of Health (INSA) is the national FCDB compiler in Portugal and is also a EuroFIR partner. INSA's QMS complies with ISO/IEC (International Organization for Standardisation/International Electrotechnical Commission) 17025 requirements. The purpose of this work is to report on the strategy used and progress made for extending INSA's QMS to the Portuguese FCDB in alignment with EuroFIR guidelines. A stepwise approach was used to extend INSA's QMS to the Portuguese FCDB. The approach included selection of reference standards and guides and the collection of relevant quality documents directly or indirectly related to the compilation process; selection of the adequate quality requirements; assessment of adequacy and level of requirement implementation in the current INSA's QMS; implementation of the selected requirements; and EuroFIR's preassessment 'pilot' auditing. The strategy used to design and implement the extension of INSA's QMS to the Portuguese FCDB is reported in this paper. The QMS elements have been established by consensus. ISO/IEC 17025 management requirements (except 4.5) and 5.2 technical requirements, as well as all EuroFIR requirements (including technical guidelines, FCD compilation flowchart and standard operating procedures), have been selected for implementation. The results indicate that the quality management requirements of ISO/IEC 17025 in place in INSA fit the needs for document control, audits, contract review, non-conformity work and corrective actions, and users' (customers

  12. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
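
    The abstract above mentions using SQL transactions to keep the Karst Feature Database consistent. As an illustration only (the KFD schema is not given in the record), the sketch below shows the transactional pattern with Python's `sqlite3`, where a failed statement rolls the whole transaction back:

```python
import sqlite3

# Illustrative karst-feature table; the real Minnesota KFD schema differs.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE karst_features (id INTEGER PRIMARY KEY, type TEXT, county TEXT)")

# A connection used as a context manager commits on success...
with conn:
    conn.execute("INSERT INTO karst_features VALUES (1, 'sinkhole', 'Fillmore')")
    conn.execute("INSERT INTO karst_features VALUES (2, 'spring', 'Olmsted')")

# ...and rolls back on failure, so the partial insert of row 3 is undone.
try:
    with conn:
        conn.execute("INSERT INTO karst_features VALUES (3, 'cave', 'Winona')")
        conn.execute("INSERT INTO karst_features VALUES (1, 'dup', 'x')")  # violates primary key
except sqlite3.IntegrityError:
    pass

count = conn.execute("SELECT COUNT(*) FROM karst_features").fetchone()[0]
```

    This all-or-nothing behaviour is the database-consistency property the paper relies on, independent of the particular DBMS.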

  13. Objective-C memory management essentials

    CERN Document Server

    Tang, Gibson

    2015-01-01

    If you are new to Objective-C or a veteran in iOS application development, this is the book for you. This book will ensure that you can actively learn the methods and concepts in relation to memory management in a more engaging way. Basic knowledge of iOS development is required for this book.

  14. GEOINFORMATION DATABASE OF OBJECTS OF HISTORICAL AND CULTURAL HERITAGE OF CHUVASHIA

    Directory of Open Access Journals (Sweden)

    E. Zhitova

    2015-01-01

    Full Text Available In order to preserve the historical and cultural heritage of the Chuvash Republic, to monitor its status, and to assess its possible use in the tourist industry, the creation of a geographic information database is proposed. The main objects of the geographic information database of historical and cultural heritage are divided into functional and semantic groups of GIS tables.

  15. Management-By-Objectives in Healthcare

    DEFF Research Database (Denmark)

    Traberg, Andreas

    that managers are able to incorporate those indicators they find useful in their department, and thus secure sufficient informational support for the department's decision-making processes. The Performance Account thereby eases the identification of areas suited for corrective actions, and provides the decision......; collectively, however, they pose a significant drawback. The vast selection of self-contained initiatives limits the overview for decision makers and imposes an escalating administrative burden on operational staff members. Contrary to the initial objective, the expanding informational burden limits...... the overview and transparency for healthcare decision makers; as a result, well-documented initiatives fail to become integrated support in operational decision-making processes. This research work has thus striven to design a holistic Management-By-Objectives framework that can enable managers and operational...

  16. An Introduction to the DB Relational Database Management System

    OpenAIRE

    Ward, J.R.

    1982-01-01

    This paper is an introductory guide to using the Db programs to maintain and query a relational database on the UNIX operating system. In the past decade, increasing interest has been shown in the development of relational database management systems. Db is an attempt to incorporate a flexible and powerful relational database system within the user environment presented by the UNIX operating system. The family of Db programs is useful for maintaining a database of information that i...

  17. Semantic-Based Concurrency Control for Object-Oriented Database Systems Supporting Real-Time Applications

    National Research Council Canada - National Science Library

    Lee, Juhnyoung; Son, Sang H

    1994-01-01

    .... This paper investigates major issues in designing semantic-based concurrency control for object-oriented database systems supporting real-time applications, and it describes approaches to solving...

  18. Implementation and Integration of the Object Transaction Service of Corba to a Java Application Database Program

    National Research Council Canada - National Science Library

    Hazir, Yildiray

    2000-01-01

    .... For this reason CORBA has attracted our attention. The OMG(Object Management Group), a consortium of object venders, developed the CORBA standard in the fall of 1990 as a common interconnection bus for distributed objects...

  19. Resource Survey Relational Database Management System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Mississippi Laboratories employ both enterprise and localized data collection systems for recording data. The databases utilized by these applications range from...

  20. Reliability database development for use with an object-oriented fault tree evaluation program

    Science.gov (United States)

    Heger, A. Sharif; Harringtton, Robert J.; Koen, Billy V.; Patterson-Hine, F. Ann

    1989-01-01

    A description is given of the development of a fault-tree analysis method using object-oriented programming. In addition, the authors discuss the programs that have been developed or are under development to connect a fault-tree analysis routine to a reliability database. To assess the performance of the routines, a relational database simulating one of the nuclear power industry databases has been constructed. For a realistic assessment of the results of this project, the use of one of existing nuclear power reliability databases is planned.

  1. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object- oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.

  2. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  3. A user's manual for managing database system of tensile property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kim, D. H.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the tensile database system used for managing tensile property test data. The database, built from the data produced by tensile property tests, can increase the application of the test results. We can also easily retrieve basic data from the database when preparing a new experiment, and can produce better results by comparing with the previous data. To develop the database we must carefully analyze and design the application; after that, we can offer the best quality against customers' various requirements. The tensile database system was developed as a web application using Java, PL/SQL and JSP (Java Server Pages)

  4. Database basic design for safe management radioactive waste

    International Nuclear Information System (INIS)

    Son, D. C.; Ahn, K. I.; Jung, D. J.; Cho, Y. B.

    2003-01-01

    As the amount of radioactive waste and the related information to be managed are increasing, some organizations are carrying out or planning the computerization of radioactive waste management. Considering that information on the safe management of radioactive waste should be used in association with the national radioactive waste management project, standardization of the data format and its protocol is required. The Korea Institute of Nuclear Safety (KINS) will establish and operate a nationwide integrated database in order to effectively manage the large amount of information on national radioactive waste. This database allows not only tracing and managing the trend of radioactive waste generation and storage, but also producing reliable analysis results for the quantities accumulated. Consequently, we can provide the necessary information for national radioactive waste management policy and related industries' planning. This study explains the database design, which is the essential element for information management

  5. MST radar data-base management

    Science.gov (United States)

    Wickwar, V. B.

    1983-01-01

    Data management for Mesospheric-Stratospheric-Tropospheric (MST) radars is addressed. An incoherent-scatter radar database is discussed in terms of the purpose, centralization, scope, and nature of the database management system.

  6. An object-oriented language-database integration model: The composition filters approach

    NARCIS (Netherlands)

    Aksit, Mehmet; Bergmans, Lodewijk; Vural, Sinan; Vural, S.

    1991-01-01

    This paper introduces a new model, based on so-called object-composition filters, that uniformly integrates database-like features into an object-oriented language. The focus is on providing persistent dynamic data structures, data sharing, transactions, multiple views and associative access,

  7. Typed Sets as a Basis for Object-Oriented Database Schemas

    NARCIS (Netherlands)

    Balsters, H.; de By, R.A.; Zicari, R.

    The object-oriented data model TM is a language that is based on the formal theory of FM, a typed language with object-oriented features such as attributes and methods in the presence of subtyping. The general (typed) set constructs of FM allow one to deal with (database) constraints in TM. The

  8. An Object-Oriented Language-Database Integration Model: The Composition-Filters Approach

    NARCIS (Netherlands)

    Aksit, Mehmet; Bergmans, Lodewijk; Vural, S.; Vural, Sinan; Lehrmann Madsen, O.

    1992-01-01

    This paper introduces a new model, based on so-called object-composition filters, that uniformly integrates database-like features into an object-oriented language. The focus is on providing persistent dynamic data structures, data sharing, transactions, multiple views and associative access,

  9. HATCHES - a thermodynamic database and management system

    International Nuclear Information System (INIS)

    Cross, J.E.; Ewart, F.T.

    1990-03-01

    The Nirex Safety Assessment Research Programme has been compiling the thermodynamic data necessary to allow simulations of the aqueous behaviour of the elements important to radioactive waste disposal to be made. These data have been obtained from the literature, when available, and validated for the conditions of interest by experiment. In order to maintain these data in an accessible form and to satisfy quality assurance on all data used for assessments, a database has been constructed which resides on a personal computer operating under MS-DOS using the Ashton-Tate dBase III program. This database contains all the input data fields required by the PHREEQE program and, in addition, a body of text which describes the source of the data and the derivation of the PHREEQE input parameters from the source data. The HATCHES system consists of this database, a suite of programs to facilitate the searching and listing of data and a further suite of programs to convert the dBase III files to PHREEQE database format. (Author)
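
    The HATCHES record above describes programs that convert dBase III files into the PHREEQE database format. As a purely illustrative sketch (the real PHREEQE input format and HATCHES field names are more involved than shown), such a converter is essentially a mapping from tabular records to fixed-width text lines:

```python
# Hypothetical thermodynamic records as they might come out of a tabular
# store; species names and fields are illustrative, not the HATCHES schema.
records = [
    {"species": "UO2+2", "log_k": -0.34, "delta_h": 8.0},
    {"species": "CO3-2", "log_k": 10.33, "delta_h": -3.5},
]

def to_fixed_width(rec):
    # One fixed-width line per species: name left-justified, numeric
    # parameters right-justified in 10-character columns.
    return f"{rec['species']:<10}{rec['log_k']:>10.2f}{rec['delta_h']:>10.2f}"

lines = [to_fixed_width(r) for r in records]
```

    Keeping the source data in a database and generating the program-specific input on demand, as HATCHES does, means the quality-assured values live in one place.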

  10. [The future of clinical laboratory database management system].

    Science.gov (United States)

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  11. Wireless Sensor Networks Database: Data Management and Implementation

    Directory of Open Access Journals (Sweden)

    Ping Liu

    2014-04-01

    Full Text Available As the core application of wireless sensor network technology, data management and processing have become a research hotspot for new databases. This article mainly studied data management in wireless sensor networks. In connection with the characteristics of the data in wireless sensor networks, it discussed wireless sensor network data querying and integration technology in depth, proposed a mobile database structure based on wireless sensor networks, and carried out the overall design and implementation of the data management system. To implement the communication rules of the routing trees, the network manager uses a simple routing-tree maintenance algorithm. The ordinary-node end, the server end of the mobile database at the gathering nodes, and the mobile client end were designed to implement the system, with a focus on the query manager, the storage modules and the synchronization module at the server end of the mobile database at the gathering nodes.
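
    The record above mentions routing trees maintained by the network manager but gives no algorithm. The sketch below is a minimal, assumed model: each node keeps a parent pointer toward the gathering (sink) node, and maintenance means re-attaching a node whose parent has failed. All class and variable names are illustrative.

```python
class Node:
    """A sensor node that forwards data toward the sink via its parent."""
    def __init__(self, node_id, parent=None):
        self.node_id = node_id
        self.parent = parent

    def route_to_sink(self):
        """Return the node ids on the path from this node up to the sink."""
        path, node = [], self
        while node is not None:
            path.append(node.node_id)
            node = node.parent
        return path

sink = Node("sink")
relay = Node("relay", parent=sink)
leaf = Node("leaf", parent=relay)

# Simple maintenance rule: when a parent fails, re-attach the orphaned
# node to another reachable node (here, directly to the sink).
relay_failed = True
if relay_failed:
    leaf.parent = sink

path = leaf.route_to_sink()
```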

  12. Nuclear data processing using a database management system

    International Nuclear Information System (INIS)

    Castilla, V.; Gonzalez, L.

    1991-01-01

    A database management system that permits the design of relational models was used to create an integrated database with experimental and evaluated nuclear data. A system that reduces the time and cost of processing was created for EC-type computers and compatibles. A set of programs for the conversion from calculated nuclear data output format to the EXFOR format was developed. A dictionary to perform retrospective searches in the ENDF database was also created

  13. Cockpit management and Specific Behavioral Objectives (SBOs)

    Science.gov (United States)

    Mudge, R. W.

    1987-01-01

    One of the primary tools used to accomplish the task of effective training is the specific behavioral objective (SBO). An SBO is simply a statement which specifically identifies a small segment of the final behavior sought, and a little more. The key word is specific. The company pinpoints exactly what it is it wants the pilot to do after completing training, and what it should evaluate from the point of view of both the program and the pilot. It tells the junior crewmember exactly, specifically, what he should monitor and support insofar as the management function is concerned. It gives greater meaning to the term second in command. And finally, it tells the supervisory pilot exactly what he should observe, evaluate, and instruct, insofar as the management function is concerned.

  14. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    Science.gov (United States)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated hybrid cooperative approach that combines the best practices of both the relational model, with its SQL queries, and the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
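
    NETMARK's distinguishing feature, per the abstract, is keyword search across both context and content of semi-structured records. The sketch below illustrates that idea on an XML fragment with Python's standard `xml.etree.ElementTree`; it is not NETMARK's actual API or matching algorithm, and the document is invented.

```python
import xml.etree.ElementTree as ET

# An illustrative semi-structured document.
doc = ET.fromstring("""
<report>
  <title>Engine test</title>
  <section name="results">Thrust within nominal limits</section>
</report>
""")

def keyword_search(root, keyword):
    """Return tags of elements matching a keyword in context or content.

    "Context" here means element names and attribute values; "content"
    means element text, loosely mirroring the context/content distinction
    described for NETMARK.
    """
    hits = []
    for elem in root.iter():
        in_context = keyword in elem.tag or keyword in " ".join(elem.attrib.values())
        in_content = elem.text is not None and keyword in elem.text
        if in_context or in_content:
            hits.append(elem.tag)
    return hits

hits = keyword_search(doc, "results")
```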

  15. International database on ageing management and life extension

    International Nuclear Information System (INIS)

    Ianko, L.; Lyssakov, V.; McLachlan, D.; Russell, J.; Mukhametshin, V.

    1995-01-01

    International database on ageing management and life extension for reactor pressure vessel materials (RPVM) is described with the emphasis on the following issues: requirements of the system; design concepts for RPVM database system; data collection, processing and storage; information retrieval and dissemination; RPVM information assessment and evaluation. 1 fig

  16. Managing Multiuser Database Buffers Using Data Mining Techniques

    NARCIS (Netherlands)

    Feng, L.; Lu, H.J.

    2004-01-01

    In this paper, we propose a data-mining-based approach to public buffer management for a multiuser database system, where database buffers are organized into two areas – public and private. While the private buffer areas contain pages to be updated by particular users, the public

  17. Engineering the object-relation database model in O-Raid

    Science.gov (United States)

    Dewan, Prasun; Vikram, Ashish; Bhargava, Bharat

    1989-01-01

    Raid is a distributed database system based on the relational model. O-Raid is an extension of the Raid system that supports complex data objects. The design of O-Raid is evolutionary and retains all features of relational database systems along with those of a general-purpose object-oriented programming language. O-Raid has several novel properties. Objects, classes, and inheritance are supported together with a predicate-based relational query language. O-Raid objects are compatible with C++ objects and may be read and manipulated by a C++ program without any 'impedance mismatch'. Relations and columns within relations may themselves be treated as objects with associated variables and methods. Relations may contain heterogeneous objects, that is, objects of more than one class in a certain column, which can individually evolve by being reclassified. Special facilities are provided to reduce the data search in a relation containing complex objects.
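
    O-Raid itself exposes C++ objects; the Python sketch below only illustrates the concept the abstract describes, a relation whose column holds heterogeneous objects of more than one class, queried uniformly by a predicate over object state. The classes and data are invented.

```python
class Shape:
    """Common superclass so rows can hold objects of different classes."""
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

class Square(Shape):
    def __init__(self, s):
        self.s = s
    def area(self):
        return self.s ** 2

# A "relation" as a list of rows; the second column is heterogeneous,
# holding Circle and Square objects in the same column.
relation = [
    ("lot-1", Circle(1.0)),
    ("lot-2", Square(2.0)),
]

# Predicate-based selection over object state, in the spirit of a
# relational query against complex objects.
big = [name for name, shape in relation if shape.area() > 3.5]
```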

  18. CALCOM Database for managing California Commercial Groundfish sample data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The CALCOM database is used by the California Cooperative Groundfish Survey to store and manage Commercial market sample data. This data is ultimately used to...

  19. Development of environment radiation database management system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun [Daeduk College, Taejon (Korea, Republic of)

    1999-03-15

    In this development, we constructed a database for the efficient processing and operation of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on WWW service policies on the KINS intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  20. Development of environment radiation database management system

    International Nuclear Information System (INIS)

    Kang, Jong Gyu; Chung, Chang Hwa; Ryu, Chan Ho; Lee, Jin Yeong; Kim, Dong Hui; Lee, Hun Sun

    1999-03-01

    In this development, we constructed a database for the efficient processing and operation of radiation-environment related data. We developed the source document retrieval system and the current status printing system that support radiation environment data collection, pre-processing and analysis. We also designed and implemented the user interfaces and DB access routines based on WWW service policies on the KINS intranet. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation

  1. Components of foreign object management as

    Directory of Open Access Journals (Sweden)

    V.Yu. Gordopolov

    2016-12-01

    Full Text Available The article considers the components of foreign economic activity as objects of accounting and management. An analysis of the current legislation and the scientific literature allowed a classification of the forms and types of foreign economic activity that supports the planning process in the enterprise, as well as the construction of an effective system of management, accounting, economic analysis and internal control. As part of the classification, the basic forms and types of foreign trade are described, together with the particulars of their implementation and legal regulation. On the basis of these basic forms and types of foreign trade, a number of problems in the conceptual-categorical apparatus of the applicable law are identified. Nine categories of foreign economic activity operations are fixed in specific legal acts. Operations of foreign economic activity (import transactions, export transactions, international securities transactions, credit and foreign payment transactions, foreign rental operations, international leasing, foreign exchange transactions, foreign investments and operations associated with joint activity) are identified as objects of accounting in foreign trade and are divided according to the types of business activity (operating, financial and investment).

  2. Presidential Libraries Museum Collection Management Database

    Data.gov (United States)

    National Archives and Records Administration — MCMD serves as a descriptive catalog for the Presidential Libraries museum collections, and also supports a full range of museum collections management processes...

  3. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    Energy Technology Data Exchange (ETDEWEB)

    Viegas, F; Nairz, A; Goossens, L [CERN, CH-1211 Geneve 23 (Switzerland); Malon, D; Cranshaw, J [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States); Dimitrov, G [DESY, D-22603 Hamburg (Germany); Nowak, M; Gamboa, C [Brookhaven National Laboratory, PO Box 5000 Upton, NY 11973-5000 (United States); Gallas, E [University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Wong, A [Triumf, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Vinek, E [University of Vienna, Dr.-Karl-Lueger-Ring 1, 1010 Vienna (Austria)

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. These data will be produced at a nominal rate of 200 Hz and uploaded into a relational database for access from websites and other tools. The estimated database volume is 6 TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production make this application a challenge for data and resource management in many respects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; and controlling resource usage of the database, from the user query load to the strategy for cleaning and archiving old TAG data.

  4. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    International Nuclear Information System (INIS)

    Viegas, F; Nairz, A; Goossens, L; Malon, D; Cranshaw, J; Dimitrov, G; Nowak, M; Gamboa, C; Gallas, E; Wong, A; Vinek, E

    2010-01-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. These data will be produced at a nominal rate of 200 Hz and uploaded into a relational database for access from websites and other tools. The estimated database volume is 6 TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production make this application a challenge for data and resource management in many respects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; and controlling resource usage of the database, from the user query load to the strategy for cleaning and archiving old TAG data.

  5. Learning object repositories as knowledge management systems

    Directory of Open Access Journals (Sweden)

    Demetrios G. Sampson

    2013-06-01

    Full Text Available Over the past years, a number of international initiatives that recognize the importance of sharing and reusing digital educational resources among educational communities through the use of Learning Object Repositories (LORs have emerged. Typically, these initiatives focus on collecting digital educational resources that are offered by their creators for open access and potential reuse. Nevertheless, most of the existing LORs are designed more as digital repositories than as Knowledge Management Systems (KMS. Exploiting KMS functionalities in LORs would offer the potential to support the organization and sharing of educational communities’ explicit knowledge (depicted in digital educational resources constructed by teachers and/or instructional designers and tacit knowledge (depicted in teachers’ and students’ experiences and interactions in using the digital educational resources available in LORs. Within this context, in this paper we study the design and the implementation of fourteen operating LORs from the KMS perspective, so as to identify additional functionalities that can support the management of educational communities’ explicit and tacit knowledge. Thus, we propose a list of essential LOR functionalities, which aim to facilitate the organization and sharing of educational communities’ knowledge. Finally, we present the added value of these functionalities by identifying their importance in addressing the current demands of web-facilitated educational communities, as well as the knowledge management activities that they execute.

  6. Report of the SRC working party on databases and database management systems

    International Nuclear Information System (INIS)

    Crennell, K.M.

    1980-10-01

    An SRC working party, set up to consider the subject of support for databases within the SRC, were asked to identify interested individuals and user communities, establish which features of database management systems they felt were desirable, arrange demonstrations of possible systems and then make recommendations for systems, funding and likely manpower requirements. This report describes the activities and lists the recommendations of the working party and contains a list of databases maintained or proposed by those who replied to a questionnaire. (author)

  7. Database management system for large container inspection system

    International Nuclear Information System (INIS)

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    A Large Container Inspection System (LCIS) based on radiation imaging technology is a powerful tool for the Customs to check the contents of a large container without opening it. The author discusses a database application system, as a part of the Signal and Image System (SIS), for the LCIS. A basic requirements analysis was done first. Then the computer hardware, operating system, and database management system were selected according to the available technology and market circumstances. Based on the above considerations, a database application system with central management and distributed operation features has been implemented

  8. ARACHNID: A prototype object-oriented database tool for distributed systems

    Science.gov (United States)

    Younger, Herbert; Oreilly, John; Frogner, Bjorn

    1994-01-01

    This paper discusses the results of a Phase 2 SBIR project sponsored by NASA and performed by MIMD Systems, Inc. A major objective of this project was to develop specific concepts for improved performance in accessing large databases. An object-oriented and distributed approach was used for the general design, while a geographical decomposition was used as a specific solution. The resulting software framework is called ARACHNID. The Faint Source Catalog developed by NASA was the initial database testbed. This is a database of many gigabytes, where an order of magnitude improvement in query speed is being sought. This database contains faint infrared point sources obtained from telescope measurements of the sky. A geographical decomposition of this database is an attractive approach to dividing it into pieces. Each piece can then be searched on individual processors with only a weak data linkage between the processors being required. As a further demonstration of the concepts implemented in ARACHNID, a tourist information system is discussed. This version of ARACHNID is the commercial result of the project. It is a distributed, networked, database application where speed, maintenance, and reliability are important considerations. This paper focuses on the design concepts and technologies that form the basis for ARACHNID.

  9. Development of Krsko Severe Accident Management Database (SAMD)

    International Nuclear Information System (INIS)

    Basic, I.; Kocnar, R.

    1996-01-01

    Severe Accident Management is a framework to identify and implement the Emergency Response Capabilities that can be used to prevent or mitigate severe accidents and their consequences. Krsko Severe Accident Management Database documents the severe accident management activities which are developed in the NPP Krsko, based on the Krsko IPE (Individual Plant Examination) insights and Generic WOG SAMGs (Westinghouse Owners Group Severe Accident Management Guidance). (author)

  10. Relational Information Management Data-Base System

    Science.gov (United States)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    RIM5 is a DBMS with several features particularly useful to scientists and engineers. It interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census data, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  11. TaxMan: a taxonomic database manager

    Directory of Open Access Journals (Sweden)

    Blaxter Mark

    2006-12-01

    Full Text Available Abstract Background Phylogenetic analysis of large, multiple-gene datasets, assembled from public sequence databases, is rapidly becoming a popular way to approach difficult phylogenetic problems. Supermatrices (concatenated multiple sequence alignments of multiple genes can yield more phylogenetic signal than individual genes. However, manually assembling such datasets for a large taxonomic group is time-consuming and error-prone. Additionally, sequence curation, alignment and assessment of the results of phylogenetic analysis are made particularly difficult by the potential for a given gene in a given species to be unrepresented, or to be represented by multiple or partial sequences. We have developed a software package, TaxMan, that largely automates the processes of sequence acquisition, consensus building, alignment and taxon selection to facilitate this type of phylogenetic study. Results TaxMan uses freely available tools to allow rapid assembly, storage and analysis of large, aligned DNA and protein sequence datasets for user-defined sets of species and genes. The user provides GenBank format files and a list of gene names and synonyms for the loci to analyse. Sequences are extracted from the GenBank files on the basis of annotation and sequence similarity. Consensus sequences are built automatically. Alignment is carried out (where possible, at the protein level and aligned sequences are stored in a database. TaxMan can automatically determine the best subset of taxa to examine phylogeny at a given taxonomic level. By using the stored aligned sequences, large concatenated multiple sequence alignments can be generated rapidly for a subset and output in analysis-ready file formats. Trees resulting from phylogenetic analysis can be stored and compared with a reference taxonomy. Conclusion TaxMan allows rapid automated assembly of a multigene datasets of aligned sequences for large taxonomic groups. 
By extracting sequences on the basis of

  12. Loss Database Architecture for Disaster Risk Management

    OpenAIRE

    RIOS DIAZ FRANCISCO; MARIN FERRER MONTSERRAT

    2018-01-01

    The reformed Union civil protection legislation (Decision on a Union Civil Protection Mechanism), which entered into force on 1 January 2014, is paving the way for more resilient communities by including key actions related to disaster prevention such as developing national risk assessments and the refinement of risk management planning. Under the Decision, Member States agreed to “develop risk assessments at national or appropriate sub-national level and make available to the Commission a s...

  13. Development of a Relational Database for Learning Management Systems

    Science.gov (United States)

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-based distance education systems are of great importance. Web-based distance education systems are usually known as Learning Management Systems (LMS). In this article, a database design, which was developed to create an educational institution as a Learning Management System, is described. In this sense, developed Learning…

  14. Database management in the new GANIL control system

    International Nuclear Information System (INIS)

    Lecorche, E.; Lermine, P.

    1993-01-01

    At the start of the new control system design, the decision was made to manage the huge amount of data by means of a database management system. The first implementations, built on the INGRES relational database, are described. Real-time and data management domains are shown, and problems induced by Ada/SQL interfacing are briefly discussed. Database management concerns the whole hardware and software configuration for the GANIL pieces of equipment and the alarm system, either for the alarm configuration or for the alarm logs. Another field of application encompasses beam parameter archiving as a function of the various kinds of beams accelerated at GANIL (ion species, energies, charge states). (author) 3 refs., 4 figs

  15. Software development for managing nuclear material database

    International Nuclear Information System (INIS)

    Tondin, Julio Benedito Marin

    2011-01-01

    In nuclear facilities, the control of nuclear material is one of the most important activities. The Brazilian National Commission of Nuclear Energy (CNEN) and the International Atomic Energy Agency (IAEA), in their routine inspections, regard the data provided as a major safety factor. Having a nuclear material control system that allows the amount and location of the various items to be inspected at any time is a key factor today. The objective of this work was to enhance the existing system on a friendlier development platform, the Visual Basic programming language (Microsoft Corporation), to facilitate the tasks of the IEA-R1 reactor operation team by providing data that enable better and prompter control of the IEA-R1 nuclear material. These data have supported papers presented at national and international conferences and the development of master's dissertations and doctoral theses. The software that is the object of this study was designed to meet the requirements of the CNEN and IAEA safeguards rules, but its functions may be expanded in accordance with future needs. The program can be used in other reactors to be built in the country, since it is very practical and allows effective control of the nuclear material in the facilities. (author)

  16. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

    In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily is inconsistent. It is also important that disconnected locations can operate in a meaningful way in so-called disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has...
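
    The relaxed-ACID idea, each local update commits immediately while compensating updates keep the global state trustworthy, can be sketched as follows. The class and function names are invented for illustration; the paper's own mechanisms are more elaborate.

```python
class RelaxedTransaction:
    """Toy sketch of relaxed-ACID update propagation: each local step
    commits at once, and a compensating action is recorded so earlier
    locations can be semantically rolled back if a later step fails."""

    def __init__(self):
        self._compensations = []

    def run(self, steps):
        """steps: iterable of (do, undo) callables; True on success."""
        for do, undo in steps:
            try:
                do()
            except Exception:
                # Apply compensations in reverse order so the already
                # committed locations return to a trustworthy state.
                for comp in reversed(self._compensations):
                    comp()
                self._compensations.clear()
                return False
            self._compensations.append(undo)
        self._compensations.clear()
        return True

# Two locations modelled as dicts; the second step fails, so the first
# location's committed debit is compensated.
site_a, site_b = {"balance": 100}, {"balance": 50}

def debit_a():
    site_a["balance"] -= 30

def credit_a():
    site_a["balance"] += 30

def fail_b():
    raise ConnectionError("site_b disconnected")

ok = RelaxedTransaction().run([(debit_a, credit_a), (fail_b, lambda: None)])
print(ok, site_a["balance"])  # -> False 100
```

    Between the failure and the compensation the database is temporarily inconsistent, which is exactly the window the relaxed properties ask users to tolerate.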

  17. MM-MDS: a multidimensional scaling database with similarity ratings for 240 object categories from the Massive Memory picture database.

    Directory of Open Access Journals (Sweden)

    Michael C Hout

    Full Text Available Cognitive theories in visual attention and perception, categorization, and memory often critically rely on concepts of similarity among objects, and empirically require measures of "sameness" among their stimuli. For instance, a researcher may require similarity estimates among multiple exemplars of a target category in visual search, or targets and lures in recognition memory. Quantifying similarity, however, is challenging when everyday items are the desired stimulus set, particularly when researchers require several different pictures from the same category. In this article, we document a new multidimensional scaling database with similarity ratings for 240 categories, each containing color photographs of 16-17 exemplar objects. We collected similarity ratings using the spatial arrangement method. Reports include: the multidimensional scaling solutions for each category, up to five dimensions, stress and fit measures, coordinate locations for each stimulus, and two new classifications. For each picture, we categorized the item's prototypicality, indexed by its proximity to other items in the space. We also classified pairs of images along a continuum of similarity, by assessing the overall arrangement of each MDS space. These similarity ratings will be useful to any researcher that wishes to control the similarity of experimental stimuli according to an objective quantification of "sameness."

  18. MM-MDS: a multidimensional scaling database with similarity ratings for 240 object categories from the Massive Memory picture database.

    Science.gov (United States)

    Hout, Michael C; Goldinger, Stephen D; Brady, Kyle J

    2014-01-01

    Cognitive theories in visual attention and perception, categorization, and memory often critically rely on concepts of similarity among objects, and empirically require measures of "sameness" among their stimuli. For instance, a researcher may require similarity estimates among multiple exemplars of a target category in visual search, or targets and lures in recognition memory. Quantifying similarity, however, is challenging when everyday items are the desired stimulus set, particularly when researchers require several different pictures from the same category. In this article, we document a new multidimensional scaling database with similarity ratings for 240 categories, each containing color photographs of 16-17 exemplar objects. We collected similarity ratings using the spatial arrangement method. Reports include: the multidimensional scaling solutions for each category, up to five dimensions, stress and fit measures, coordinate locations for each stimulus, and two new classifications. For each picture, we categorized the item's prototypicality, indexed by its proximity to other items in the space. We also classified pairs of images along a continuum of similarity, by assessing the overall arrangement of each MDS space. These similarity ratings will be useful to any researcher that wishes to control the similarity of experimental stimuli according to an objective quantification of "sameness."
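
    The multidimensional scaling solutions reported in these records place stimuli so that embedded distances approximate the rated dissimilarities. A minimal metric-MDS sketch (plain gradient descent on raw stress; the database itself was built with dedicated MDS software, so this is only an illustration of the technique):

```python
import math
import random

def mds(dissim, dim=2, iters=5000, lr=0.05, seed=0):
    """Metric MDS by gradient descent on the raw stress
    sum_ij (d_ij - delta_ij)^2, where delta_ij is the rated
    dissimilarity and d_ij the Euclidean distance in the embedding."""
    rng = random.Random(seed)
    n = len(dissim)
    pos = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        grad = [[0.0] * dim for _ in range(n)]
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = math.dist(pos[i], pos[j]) or 1e-9
                coef = 2.0 * (d - dissim[i][j]) / d
                for k in range(dim):
                    grad[i][k] += coef * (pos[i][k] - pos[j][k])
        for i in range(n):
            for k in range(dim):
                pos[i][k] -= lr * grad[i][k]
    return pos

# A 3-4-5 right triangle is exactly embeddable in two dimensions, so the
# recovered pairwise distances should approach the input dissimilarities.
pts = mds([[0, 3, 4], [3, 0, 5], [4, 5, 0]])
print([round(math.dist(pts[i], pts[j]), 2) for i, j in ((0, 1), (0, 2), (1, 2))])
```

    Production MDS tools use the SMACOF algorithm and report stress and fit measures, as the abstract describes; the gradient step above is the simplest stand-in for that machinery.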

  19. Establishment of database system for management of KAERI wastes

    International Nuclear Information System (INIS)

    Shon, J. S.; Kim, K. J.; Ahn, S. J.

    2004-07-01

    Radioactive wastes generated by KAERI have various types, nuclides and characteristics. Managing and controlling these kinds of radioactive wastes requires systematic record management, efficient search and quick statistics. Information about the radioactive waste generated and stored by KAERI is the basic element in constructing a rapid information system for the national cooperative management of radioactive waste. In this study, the Radioactive Waste Management Integration System (RAWMIS) was developed. It is aimed at managing records of radioactive wastes, improving the efficiency of management and supporting WACID (Waste Comprehensive Integration Database System), the national radioactive waste integrated safety management system of Korea. The major information of RAWMIS, driven by users' requirements, covers generation, gathering, transfer, treatment and storage for solid waste, liquid waste, gaseous waste and waste related to spent fuel. RAWMIS is composed of a database, software (the interface between user and database) and software for a manager, and it was designed with a client/server structure. RAWMIS will be a useful tool for analyzing radioactive waste management and radiation safety management. The system is also developed to share information with associated companies and can be expected to support research and development in radioactive waste treatment technology

  20. Using a database to manage resolution of comments on standards

    International Nuclear Information System (INIS)

    Holloran, R.W.; Kelley, R.P.

    1995-01-01

    Features of production systems that would enhance the development and implementation of procedures and other standards were first suggested in 1988, in work that described how a database could provide the features sought for managing the content of structured documents such as standards and procedures. This paper describes enhancements of the database that manage the more complex links associated with the resolution of comments. Displaying the linked information on a computer display aids comment resolvers. A hardcopy report generated by the database permits others to independently evaluate the resolution of comments in context with the original text of the standard, the comment, and the revised text of the standard. Because the links are maintained by the database, consistency between the agreed-upon resolutions and the text of the standard can be maintained throughout subsequent reviews of the standard. Each of the links is bidirectional; i.e., the relationship between any two documents can be viewed from the perspective of either document
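
    Bidirectional links of this kind can be modelled with an ordinary association table, which makes both directions of the relationship a simple join. A sqlite3 sketch follows; the table and column names are invented for illustration, not taken from the paper.

```python
import sqlite3

# In-memory database: one link table relates standard sections to the
# comments raised against them, indexed for traversal in both directions.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE section (id INTEGER PRIMARY KEY, text TEXT);
CREATE TABLE comment (id INTEGER PRIMARY KEY, text TEXT, resolution TEXT);
CREATE TABLE link    (section_id INTEGER REFERENCES section(id),
                      comment_id INTEGER REFERENCES comment(id));
CREATE INDEX link_by_section ON link(section_id);
CREATE INDEX link_by_comment ON link(comment_id);
""")
db.execute("INSERT INTO section VALUES (1, 'Valves shall be tagged.')")
db.execute("INSERT INTO comment VALUES "
           "(10, 'Which valves?', 'Scope limited to isolation valves.')")
db.execute("INSERT INTO link VALUES (1, 10)")

# Section -> comments: drives the comment-resolution report.
comments = db.execute("""SELECT c.text, c.resolution FROM comment c
                         JOIN link l ON l.comment_id = c.id
                         WHERE l.section_id = 1""").fetchall()
# Comment -> sections: lets a reviewer see a resolution in context.
sections = db.execute("""SELECT s.text FROM section s
                         JOIN link l ON l.section_id = s.id
                         WHERE l.comment_id = 10""").fetchall()
print(comments, sections)
```

    Because both directions read the same link rows, the agreed-upon resolutions and the standard's text cannot drift apart between reviews.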

  1. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  2. TRENDS: The aeronautical post-test database management system

    Science.gov (United States)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  3. Accessing ANSA Objects from OSI Network Management

    OpenAIRE

    Berrah, Karrim; Gay, David; Genilloud, Guy

    1994-01-01

    OSI network management provides a general framework for the management of OSI systems, and by extension of any distributed system. However, it is not yet possible to tell to what extent the tools developed for network management will be applicable to distributed systems management. This paper assumes that network managers will want to have some control of the distributed infrastructure and applications. It examines how access to some of the ANSA management interfaces can be given to OSI netwo...

  4. Kingfisher: a system for remote sensing image database management

    Science.gov (United States)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing amount of images to be collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application; at present many systems exist for photographic images, such as QBIC and IKONA, but they are not able to extract and describe remote-sensing image content properly. The target database is an archive of images originating from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without losses independently of image resolution. The latter property allows the DBMS (Database Management System) to process a low amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of dimensions (width and height). Local and global content descriptors are compared, during the retrieval phase, with the query image, and the results seem to be very encouraging.
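
    Once each archived image is summarized by a descriptor vector, a content-based query reduces to nearest-neighbour search over those vectors. A toy sketch (the descriptor names, file names and values below are invented for illustration):

```python
import math

def nearest(query_vec, archive, k=3):
    """Rank archive images by Euclidean distance between their texture
    descriptor vectors and the query image's vector."""
    ranked = sorted(archive.items(),
                    key=lambda item: math.dist(query_vec, item[1]))
    return [name for name, _ in ranked[:k]]

# Each image is reduced to a small texture-descriptor vector,
# e.g. (contrast, entropy, homogeneity), extracted off-line.
archive = {
    "scene_a.tif": [0.12, 0.80, 0.33],
    "scene_b.tif": [0.90, 0.10, 0.45],
    "scene_c.tif": [0.15, 0.78, 0.30],
}
print(nearest([0.14, 0.79, 0.31], archive, k=2))
# -> ['scene_c.tif', 'scene_a.tif']
```

    Because the descriptors are resolution-independent, the same comparison works on quick-look images, which keeps the amount of data the DBMS must touch per query small.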

  5. The use of modern databases in managing nuclear material inventories

    International Nuclear Information System (INIS)

    Behrens, R.G.

    1994-01-01

    The need for a useful nuclear materials database to assist in the management of nuclear materials within the Department of Energy (DOE) Weapons Complex is becoming significantly more important as the mission of the DOE Complex changes and both international safeguards and storage issues become drivers in determining how these materials are managed. A well designed nuclear material inventory database can provide the Nuclear Materials Manager with an essential cost effective tool for timely analysis and reporting of inventories. This paper discusses the use of databases as a management tool to meet increasing requirements for accurate and timely information on nuclear material inventories and related information. From the end user perspective, this paper discusses the rationale, philosophy, and technical requirements for an integrated database to meet the needs of a variety of users such as those working in the areas of Safeguards, Materials Control and Accountability (MC&A), Nuclear Materials Management, Waste Management, materials processing, packaging and inspection, and interim/long term storage

  6. Development of the ageing management database of PUSPATI TRIGA reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum [Reactor Technology Centre, Malaysian Nuclear Agency, MOSTI, Bangi, 43000 Kajang, Selangor (Malaysia); Maskin, Mazleha [Science Program, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, Selangor (Malaysia); Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal [Faculty of Petroleum and Renewable Energy Engineering, Universiti Teknologi Malaysia (Malaysia)

    2016-01-22

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have emerged as the prominent issues. To address them, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all RTP major Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  7. Utopia2000: An Online Learning-Object Management Tool.

    Science.gov (United States)

    Aspillaga, Macarena

    2002-01-01

    Describes Utopia2002, a database that contains learning objects that enables faculty to design and develop interactive Web-based instruction. Topics include advanced distributed learning; sharable content objects (SCOs) and sharable content object reference model (SCORM); instructional systems design process; templates; and quality assurance. (LRW)

  8. Community College Management by Objectives: Process, Progress, Problems.

    Science.gov (United States)

    Deegan, William L.; And Others

    The objectives of this book are: (1) to present a theoretical framework for management by objectives in community colleges, (2) to present information about alternative methods for conducting needs assessment and implementing management by objectives, (3) to present a framework for integrating academic and fiscal planning through management by…

  9. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    Science.gov (United States)

    Freeman, Carla; And Others

    In order to understand how database software or online databases functioned in the overall curricula, the use of database management systems (DBMSs) was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  10. Use of Knowledge Bases in Education of Database Management

    Science.gov (United States)

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that aid the teaching of database management. You can follow the order of the course from the beginning, when some topics first appear in elementary school, through the topics covered in secondary…

  11. CPU and cache efficient management of memory-resident databases

    NARCIS (Netherlands)

    Pirk, H.; Funke, F.; Grund, M.; Neumann, T.; Leser, U.; Manegold, S.; Kemper, A.; Kersten, M.L.

    2013-01-01

    Memory-Resident Database Management Systems (MRDBMS) have to be optimized for two resources: CPU cycles and memory bandwidth. To optimize for bandwidth in mixed OLTP/OLAP scenarios, the hybrid or Partially Decomposed Storage Model (PDSM) has been proposed. However, in current implementations,

  12. CPU and Cache Efficient Management of Memory-Resident Databases

    NARCIS (Netherlands)

    H. Pirk (Holger); F. Funke; M. Grund; T. Neumann (Thomas); U. Leser; S. Manegold (Stefan); A. Kemper (Alfons); M.L. Kersten (Martin)

    2013-01-01

Memory-Resident Database Management Systems (MRDBMS) have to be optimized for two resources: CPU cycles and memory bandwidth. To optimize for bandwidth in mixed OLTP/OLAP scenarios, the hybrid or Partially Decomposed Storage Model (PDSM) has been proposed. However, in current

  13. Selecting a Relational Database Management System for Library Automation Systems.

    Science.gov (United States)

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  14. Benefits of a relational database for computerized management

    International Nuclear Information System (INIS)

    Shepherd, W.W.

    1991-01-01

This paper reports on a computerized relational database that is the basis for a comprehensive, effective, flexible and efficient hazardous materials information management system. The system includes product information for Material Safety Data Sheets (MSDSs), labels, shipping, and the environment, and is used in Dowell Schlumberger (DS) operations worldwide for a number of programs including planning, training, emergency response and regulatory compliance

  15. Managing Objects in a Relational Framework

    Science.gov (United States)

    1989-01-01

Database Week, San Jose CA, May 1983, pp. 107-113. [Stonebraker 85] Stonebraker, M. and Rowe, L.: "The Design of POSTGRES," Tech. Report, UC Berkeley, Nov... the latter is equivalent to the definition of an attribute in a POSTGRES relation using the generic Quel facility. Recently, recursive query languages have... utilize rewrite rules. OSQL [Lynl 88] provides a language for associative access. 2. The POSTGRES model [Sto 86] allows Quel and C-procedures as the

  16. Principles and objective of radioactive waste management

    International Nuclear Information System (INIS)

    Warnecke, E.

    1995-01-01

Radioactive waste is generated in various nuclear applications, for example, in the use of radionuclides in medicine, industry and research or in the nuclear fuel cycle. It must be managed in a safe way independent of its very different characteristics. Establishing the basic safety philosophy is an important contribution to promoting and developing international consensus in radioactive waste management. The principles of radioactive waste management were developed with supporting text to provide such a safety philosophy. They cover the protection of human health and the environment now and in the future within and beyond national borders, the legal framework, the generation and management of radioactive wastes, and the safety of facilities. Details of the legal framework are provided by defining the roles and responsibilities of the Member State, the regulatory body and the waste generators and operators of radioactive waste management facilities. These principles and the responsibilities in radioactive waste management are contained in two recently published top level documents of the Radioactive Waste Safety Standards (RADWASS) programme, which is the IAEA's contribution to fostering international consensus in radioactive waste management. As the two documents have to cover all aspects of radioactive waste management, they have to be formulated in a generic way. Details will be provided in other, more specific documents of the RADWASS programme as outlined in the RADWASS publication plan. The RADWASS documents are published in the Agency's Safety Series, which provides recommendations to Member States. Using material from the top level RADWASS documents, a convention on the safety of radioactive waste management is under development to provide internationally binding requirements for radioactive waste management. (author). 12 refs

  17. A database system for enhancing fuel records management capabilities

    International Nuclear Information System (INIS)

    Rieke, Phil; Razvi, Junaid

    1994-01-01

The need to modernize the system of managing a large variety of fuel-related data at the TRIGA Reactors Facility at General Atomics, as well as the need to better meet NRC nuclear material reporting requirements, prompted the development of a database to cover all aspects of fuel records management. The TRIGA Fuel Database replaces (a) an index card system used for recording fuel movements, (b) hand calculations for uranium burnup, and (c) a somewhat aged and cumbersome system of recording fuel inspection results. It was developed using Microsoft Access, a relational database system for Windows. Instead of relying on various sources for element information, users may now review individual element statistics, record inspection results, calculate element burnup and more, all from within a single application. Taking full advantage of the ease-of-use features designed into Windows and Access, the user can enter and extract information easily through a number of customized on-screen forms, with a wide variety of reporting options available. All forms are accessed through a main 'Options' screen, with the options broken down by categories, including 'Elements', 'Special Elements/Devices', 'Control Rods' and 'Areas'. Relational integrity and data validation rules are enforced to help ensure that accurate and meaningful data are entered. Among other items, the database lets the user define: element types (such as FLIP or standard) and subtypes (such as fuel follower, instrumented, etc.), various inspection codes for standardizing inspection results, areas within the facility where elements are located, and the power factors associated with element positions within a reactor. Using fuel moves, power history, power factors and element types, the database tracks uranium burnup and plutonium buildup on a quarterly basis.
The Fuel Database was designed with end-users in mind and does not force an operations-oriented user to learn any programming or relational database theory in
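The quarterly burnup bookkeeping described above (fuel moves, power history and per-position power factors combined per element) can be sketched as follows; all class names, units and numbers are illustrative and are not taken from the TRIGA system:

```python
from dataclasses import dataclass

# Hedged sketch of the burnup accounting the record describes: each residence
# interval of an element in a core position contributes reactor energy output
# weighted by that position's relative power factor.

@dataclass
class FuelElement:
    element_id: str
    element_type: str            # e.g. "standard" or "FLIP"
    burnup_mwd: float = 0.0      # accumulated burnup, MW-days (illustrative unit)

def accrue_burnup(element, days_in_position, avg_power_mw, power_factor):
    """Add the burnup for one residence interval and return the running total."""
    element.burnup_mwd += days_in_position * avg_power_mw * power_factor
    return element.burnup_mwd

el = FuelElement("E-101", "standard")
accrue_burnup(el, days_in_position=90, avg_power_mw=1.0, power_factor=1.2)  # one quarter
accrue_burnup(el, days_in_position=90, avg_power_mw=0.5, power_factor=0.8)
print(round(el.burnup_mwd, 1))
```

Summing such intervals per quarter is all that is needed to keep a running burnup figure per element from the recorded fuel moves.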

  18. Risk Management of Jettisoned Objects in LEO

    Science.gov (United States)

    Bacon, John B.; Gray, Charles

    2011-01-01

The construction and maintenance of the International Space Station (ISS) has led to the release of many objects into its orbital plane, usually during the course of an extra-vehicular activity (EVA). Such releases are often unintentional, but in a growing number of cases the jettison has been intentional, conducted after a careful assessment of the net risk to the partnership and to other objects in space. Since its launch in 1998 the ISS has contributed on average at least one additional debris object that is simultaneously in orbit with the station, although the number varies widely from zero to eight at any one moment. All of these objects present potential risks to other objects in orbit. Whether it comes from known and tracked orbiting objects or from unknown or untrackable objects, collision with orbital debris can have disastrous consequences. Objects greater than 10 cm are generally well documented and tracked, allowing orbiting spacecraft or satellites opportunities to perform evasive maneuvers (commonly known as Debris Avoidance Maneuvers, or DAMs) in the event that imminent collision is predicted. The issue with smaller debris, however, is that it is too numerous to be tracked effectively and yet still poses disastrous consequences if it intercepts a larger object. Due to the immense kinetic energy of any item in orbit, collision with debris as small as 1 cm can have catastrophic consequences for many orbiting satellites or spacecraft. Faced with the growing orbital debris threat and the potentially catastrophic consequences of a collision-generated debris shower originating in an orbit crossing the ISS altitude band, in 2007 the ISS program manager asked program specialists to coordinate a multilateral jettison policy amongst the ISS partners. This policy would define the acceptable risk trade rationale for intentional release of a debris object, and other mandatory constraints on such jettisons to minimize the residual risks whenever a jettison was

  19. Moving objects management models, techniques and applications

    CERN Document Server

    Meng, Xiaofeng; Xu, Jiajie

    2014-01-01

    This book describes the topics of moving objects modeling and location tracking, indexing and querying, clustering, location uncertainty, traffic aware navigation and privacy issues as well as the application to intelligent transportation systems.

  20. Catchment scale multi-objective flood management

    Science.gov (United States)

    Rose, Steve; Worrall, Peter; Rosolova, Zdenka; Hammond, Gene

    2010-05-01

Rural land management is known to affect both the generation and propagation of flooding at the local scale, but there is still a general lack of good evidence that this impact remains significant at the larger catchment scale, given the complexity of physical interactions and climatic variability taking place at this level. The National Trust, in partnership with the Environment Agency, is managing an innovative project on the Holnicote Estate in south west England to demonstrate the benefits of using good rural land management practices to reduce flood risk at both the catchment and sub-catchment scales. The Holnicote Estate is owned by the National Trust and comprises about 5,000 hectares of land, from the uplands of Exmoor to the sea, incorporating most of the catchments of the river Horner and Aller Water. There are nearly 100 houses across three villages that are at risk from flooding, which could potentially benefit from changes in land management practices in the surrounding catchment providing a more sustainable flood attenuation function. In addition to the contribution being made to flood risk management, there is a range of other ecosystem services that will be enhanced through these targeted land management changes. Alterations in land management will create new opportunities for wildlife and habitats and help to improve the local surface water quality. Such improvements will not only create additional wildlife resources locally but also support the landscape's response to climate change effects by creating and enhancing wildlife networks within the region. Land management changes will also restore and sustain landscape heritage resources and provide opportunities for amenity, recreation and tourism. The project delivery team is working with the National Trust from source to sea across the entire Holnicote Estate, to identify and subsequently implement suitable land management techniques to manage local flood risk within the catchments. These

  1. How many plans are needed in an IMRT multi-objective plan database?

    International Nuclear Information System (INIS)

    Craft, David; Bortfeld, Thomas

    2008-01-01

    In multi-objective radiotherapy planning, we are interested in Pareto surfaces of dimensions 2 up to about 10 (for head and neck cases, the number of structures to trade off can be this large). A key question that has not been answered yet is: how many plans does it take to sufficiently represent a high-dimensional Pareto surface? In this paper, we present a method to answer this question, and we show that the number of points needed is modest: 75 plans always controlled the error to within 5%, and in all cases but one, N + 1 plans, where N is the number of objectives, was enough for <15% error. We introduce objective correlation matrices and principal component analysis (PCA) of the beamlet solutions as two methods to understand this. PCA reveals that the feasible beamlet solutions of a Pareto database lie in a narrow, small dimensional subregion of the full beamlet space, which helps explain why the number of plans needed to characterize the database is small
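The PCA finding above — feasible beamlet solutions confined to a narrow, small-dimensional subregion of the full beamlet space — can be illustrated with synthetic data. This is only a sketch: the matrix sizes, variance threshold, and all variable names are invented, not the paper's.

```python
import numpy as np

# Sketch: estimate the effective dimensionality of a set of plan solutions
# with PCA. The synthetic matrix below stands in for a real Pareto database.

def effective_dimension(solutions, var_fraction=0.95):
    """Number of principal components needed to explain var_fraction of the
    variance across plan solutions (rows = plans, columns = beamlets)."""
    centered = solutions - solutions.mean(axis=0)
    svals = np.linalg.svd(centered, compute_uv=False)  # per-component spread
    var = svals**2
    cum = np.cumsum(var) / var.sum()
    return int(np.searchsorted(cum, var_fraction) + 1)

rng = np.random.default_rng(0)
# 75 "plans" with 500 "beamlets" each, but varying in only 3 directions
# plus tiny noise -- a low-dimensional subregion of the full space.
basis = rng.normal(size=(3, 500))
coeffs = rng.normal(size=(75, 3))
plans = coeffs @ basis + 1e-6 * rng.normal(size=(75, 500))

print(effective_dimension(plans))
```

A fully random 75 x 500 matrix would need far more components for the same variance fraction, which is the contrast the paper's PCA argument rests on.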

  2. Databases

    Digital Repository Service at National Institute of Oceanography (India)

    Kunte, P.D.

    Information on bibliographic as well as numeric/textual databases relevant to coastal geomorphology has been included in a tabular form. Databases cover a broad spectrum of related subjects like coastal environment and population aspects, coastline...

  3. Database And Interface Modifications: Change Management Without Affecting The Clients

    CERN Document Server

    Peryt, M; Martin Marquez, M; Zaharieva, Z

    2011-01-01

    The first Oracle®-based Controls Configuration Database (CCDB) was developed in 1986, by which the controls system of CERN’s Proton Synchrotron became data-driven. Since then, this mission-critical system has evolved tremendously going through several generational changes in terms of the increasing complexity of the control system, software technologies and data models. Today, the CCDB covers the whole CERN accelerator complex and satisfies a much wider range of functional requirements. Despite its online usage, everyday operations of the machines must not be disrupted. This paper describes our approach with respect to dealing with change while ensuring continuity. How do we manage the database schema changes? How do we take advantage of the latest web deployed application development frameworks without alienating the users? How do we minimize impact on the dependent systems connected to databases through various APIs? In this paper we will provide our answers to these questions, and to many more.
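One standard technique behind such continuity — offered here as a generic illustration, not necessarily the CCDB's own approach — is to hide a schema change behind a stable view, so clients querying the old interface keep working. The sketch uses SQLite, and all table and column names are invented:

```python
import sqlite3

# Illustrative only: after a column rename in the underlying table, a
# compatibility view preserves the old interface for dependent systems.

conn = sqlite3.connect(":memory:")
# New schema: the column was renamed from "descr" to "description".
conn.execute("CREATE TABLE device_v2 (id INTEGER PRIMARY KEY, description TEXT)")
conn.execute("INSERT INTO device_v2 VALUES (1, 'beam position monitor')")
# Legacy clients still see the old column name through the view.
conn.execute("CREATE VIEW device AS SELECT id, description AS descr FROM device_v2")

print(conn.execute("SELECT descr FROM device WHERE id = 1").fetchone())
```

The schema can then evolve underneath while the view, like an API facade, absorbs the change.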

  4. Effectiveness of alternative management strategies in meeting conservation objectives

    Science.gov (United States)

    Richards S. Holthausen; Carolyn Hull Sieg

    2007-01-01

    This chapter evaluates how well various management strategies meet a variety of conservation objectives, summarizes their effectiveness in meeting objectives for rare or little-known (RLK) species, and proposes ways to combine strategies to meet overall conservation objectives. We address two broad categories of management strategies. Species approaches result in...

  5. The use of database management systems in particle physics

    CERN Document Server

    Stevens, P H; Read, B J; Rittenberg, Alan

    1979-01-01

Examines data-handling needs and problems in particle physics and looks at three very different efforts at resolving these problems, by the Particle Data Group (PDG), the CERN-HERA Group in Geneva, and groups cooperating with ZAED in Germany. The ZAED effort does not use a database management system (DBMS), the CERN-HERA Group uses an existing, limited-capability DBMS, and PDG uses the Berkeley Database Management System (BDMS), which PDG itself designed and implemented with scientific data-handling needs in mind. The range of problems each group tried to resolve was influenced by whether or not a DBMS was available and by what capabilities it had. Only PDG has been able to systematically address all the problems. The authors discuss the BDMS-centered system PDG is now building in some detail. (12 refs).

  6. Use of SQL Databases to Support Human Resource Management

    OpenAIRE

    Zeman, Jan

    2011-01-01

This thesis focuses on the design of an SQL database to support human resources management and its subsequent creation in MS SQL Server.

  7. Concierge: Personal database software for managing digital research resources

    Directory of Open Access Journals (Sweden)

    Hiroyuki Sakai

    2007-11-01

This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility: by installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from the management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).

  8. Software configuration management plan for the Hanford site technical database

    International Nuclear Information System (INIS)

    GRAVES, N.J.

    1999-01-01

    The Hanford Site Technical Database (HSTD) is used as the repository/source for the technical requirements baseline and programmatic data input via the Hanford Site and major Hanford Project Systems Engineering (SE) activities. The Hanford Site SE effort has created an integrated technical baseline for the Hanford Site that supports SE processes at the Site and project levels which is captured in the HSTD. The HSTD has been implemented in Ascent Logic Corporation (ALC) Commercial Off-The-Shelf (COTS) package referred to as the Requirements Driven Design (RDD) software. This Software Configuration Management Plan (SCMP) provides a process and means to control and manage software upgrades to the HSTD system

  9. Using objectives for managing safety and health

    International Nuclear Information System (INIS)

    Shoemaker, D.R.

    1990-01-01

    This morning I am going to talk about the International Mine Safety Rating System of the International Loss Control Institute. At the Questa mine we simply call it the ILCI System. The ILCI System has been in effect at Questa since 1982. Today, I want to offer you an outline of the system and a little bit of our experience with the system at Molycorp. In 1965, Molycorp started large-scale open-pit mining at Questa, New Mexico. In 1978 the decision was made to phase out surface mining and develop a large underground mine. Construction started in 1979, and production commenced in 1983. In 1982, with a work force approaching 900, and a 15-man safety department, we had an accident frequency rate twice the national average. At that point, as we were preparing to start underground production, we decided to become part of the International Safety Rating System. The International Safety Rating System (ISRS) is a modern safety program evaluation system. It provides the means for a systematic analysis of each element of the safety program to determine the extent and quality of management control. Auditing has long been an accepted management practice to ensure that critical business operations are performed in an efficient and profitable manner. Likewise, management has inadequate verification of the effectiveness of a safety program without the kind of audit this rating system provides. Today, largely because of the ILCI system our accident/incident rate has dropped to almost half the national average. Our production costs are nearly half of their historical high. A significant part of the savings has come from decreased expenditures for total accident losses as a result of our lower accident rates

  10. A Support Database System for Integrated System Health Management (ISHM)

    Science.gov (United States)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between

  11. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running piece of software through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a per-user data access policy.
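A minimal sketch of the standardization idea, assuming an invented three-column schema rather than TSDSystem's actual one: once heterogeneous signals share one table and one time scale, synchronizing them is a plain SQL join.

```python
import sqlite3

# Sketch only: signal names, schema and values are invented. One table of
# (signal, timestamp, value) rows lets different sources be queried and
# aligned on a common time scale.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE samples (
    signal TEXT NOT NULL,
    ts     INTEGER NOT NULL,      -- seconds since epoch: the common scale
    value  REAL NOT NULL,
    PRIMARY KEY (signal, ts))""")

rows = [("tremor", 0, 1.1), ("tremor", 60, 1.3),
        ("SO2",    0, 410.0), ("SO2",   60, 425.0)]
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)

# Query two signals over the same range, synchronized by timestamp.
cur = conn.execute("""
    SELECT a.ts, a.value AS tremor, b.value AS so2
    FROM samples a JOIN samples b ON a.ts = b.ts
    WHERE a.signal = 'tremor' AND b.signal = 'SO2' AND a.ts BETWEEN 0 AND 60
    ORDER BY a.ts""")
print(cur.fetchall())
```

In a production system the single table would be partitioned (as the abstract notes) so no one table grows unboundedly, but the query shape stays the same.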

  12. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    Science.gov (United States)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid-cell changes and their relationships to neighboring grid cells through time. The time series data is organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data is downloaded, and all selected data querying occurs transparently on the server side. Moreover, fundamental statistical
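The data rod structure can be sketched in a few lines. This is an illustration under stated assumptions, not the project's implementation: real rods hold multi-spectral values and live in a database, not an in-memory dict.

```python
from collections import defaultdict

# Sketch of the "data rod" idea: instead of storing one 2-D grid per time
# step, keep a vertical column of values per grid cell, so the time series
# at any cell is one lookup rather than N whole-image reads.

class DataRods:
    def __init__(self):
        self._rods = defaultdict(dict)   # (row, col) -> {time: value}

    def ingest_grid(self, time, grid):
        """File one time step of a 2-D grid into the per-cell rods."""
        for r, row in enumerate(grid):
            for c, value in enumerate(row):
                self._rods[(r, c)][time] = value

    def rod(self, cell, t0, t1):
        """Time-centric query: the cell's history on [t0, t1]."""
        return {t: v for t, v in sorted(self._rods[cell].items()) if t0 <= t <= t1}

rods = DataRods()
rods.ingest_grid(2000, [[0.1, 0.2], [0.3, 0.4]])
rods.ingest_grid(2001, [[0.5, 0.6], [0.7, 0.8]])
print(rods.rod((0, 1), 2000, 2001))
```

The sea-ice example in the abstract is exactly this query shape: a few cells, a long time window, no full-image downloads.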

  13. Computerized database management system for breast cancer patients.

    Science.gov (United States)

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL is selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in this system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that can control the MySQL database is developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian. The peak age for breast cancer incidence is from 50 to 59 years old. Results suggest that the chance of developing breast cancer is increased in older women, and reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.

  14. Design and implementation of an XML based object-oriented detector description database for CMS

    International Nuclear Information System (INIS)

    Liendl, M.

    2003-04-01

This thesis deals with the development of a detector description database (DDD) for the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC), located at the European Organization for Nuclear Research (CERN). DDD is a fundamental part of the CMS offline software, whose main applications are simulation and reconstruction. Both need different models of the detector in order to efficiently solve their specific tasks. In the thesis the requirements for a detector description database are analyzed and the chosen solution is described in detail. It comprises the following components: an XML-based detector description language, a runtime system that implements an object-oriented transient representation of the detector, and an application programming interface to be used by client applications. One of the main aspects of the development is the design of the DDD components. The starting point is a domain model capturing concisely the characteristics of the problem domain. The domain model is transformed into several implementation models according to the guidelines of the model driven architecture (MDA). The implementation models and appropriate refinements thereof are the foundation for adequate implementations. Using the MDA approach, a fully functional prototype was realized in C++ and XML. The prototype was successfully tested through seamless integration into both the simulation and the reconstruction framework of CMS. (author)
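As a toy illustration of the general idea — an XML description parsed into an object-oriented transient representation — the following sketch uses Python's standard library. The element names and attributes are invented and are not the DDD language:

```python
import xml.etree.ElementTree as ET

# Illustrative only: a tiny nested "detector description" is turned into a
# transient object tree that client code can traverse.

XML = """
<Detector name="demo">
  <Volume name="barrel" material="silicon">
    <Volume name="layer1" material="silicon"/>
  </Volume>
</Detector>
"""

class Volume:
    def __init__(self, name, material, children):
        self.name, self.material, self.children = name, material, children

def build(node):
    # Recursively mirror the XML nesting as object containment.
    return Volume(node.get("name"), node.get("material"),
                  [build(child) for child in node.findall("Volume")])

root = ET.fromstring(XML)
detector = [build(v) for v in root.findall("Volume")]
print(detector[0].name, detector[0].children[0].name)
```

Simulation and reconstruction clients would then walk such an object tree through an API instead of reparsing the XML.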

  15. Managing ANSA Objects with OSI Network Management Tools

    OpenAIRE

    Polizzi, Marc; Genilloud, Guy

    1995-01-01

    OSI Network Management provides a general framework for the management of OSI systems, and by extension of any distributed system. However, as this model is not well-adapted for the management of software components, distributed programming environments (e.g. DCE, CORBA, ANSAware) essentially ignore the OSI Network Management model. We assume nevertheless that OSI Network managers will want to have some control of a distributed infrastructure and application. We examine how access to some of ...

  16. An object-oriented-database-system to assist control room staff

    Energy Technology Data Exchange (ETDEWEB)

    Schildt, G H [Vienna Univ. of Technology, Vienna (Austria). Inst. for Automation

    1997-12-31

In order to assist control room staff in the event of failure of any electrical or mechanical component, a new object-oriented database system (OODBS) has been developed and installed. Monitoring and diagnostics may be supported by this OODBS within a well-defined response time. The operator gets a report on different levels: for example, at a first level, data about the vendor of a device (like reactor vessel internals, pumps, valves, etc.), the date of installation, and the history of failures since installation; at a second level, e.g. technical data of the device; at a next level, e.g. a scanned photo of the device with its identification number within a certain compartment; and at another level, technical drawings and corresponding part lists presented by a CAD system, in order to assist necessary communication between operator and maintenance technician. (author). 3 refs, 10 figs.

  17. An object-oriented-database-system to assist control room staff

    International Nuclear Information System (INIS)

    Schildt, G.H.

    1996-01-01

In order to assist control room staff in the event of failure of any electrical or mechanical component, a new object-oriented database system (OODBS) has been developed and installed. Monitoring and diagnostics may be supported by this OODBS within a well-defined response time. The operator gets a report on different levels: for example, at a first level, data about the vendor of a device (like reactor vessel internals, pumps, valves, etc.), the date of installation, and the history of failures since installation; at a second level, e.g. technical data of the device; at a next level, e.g. a scanned photo of the device with its identification number within a certain compartment; and at another level, technical drawings and corresponding part lists presented by a CAD system, in order to assist necessary communication between operator and maintenance technician. (author). 3 refs, 10 figs

  18. Process Architecture for Managing Digital Object Identifiers

    Science.gov (United States)

    Wanchoo, L.; James, N.; Stolte, E.

    2014-12-01

    In 2010, NASA's Earth Science Data and Information System (ESDIS) Project implemented a process for registering Digital Object Identifiers (DOIs) for data products distributed by the Earth Observing System Data and Information System (EOSDIS). For the first 3 years, ESDIS evolved the process, involving the data provider community in the development of processes for creating and assigning DOIs and of guidelines for the landing page. To accomplish this, ESDIS established two DOI User Working Groups: one for reviewing the DOI process, whose recommendations were submitted to ESDIS in February 2014; and the other recently tasked to review and further develop DOI landing page guidelines for ESDIS approval by the end of 2014. ESDIS has recently upgraded the DOI system from a manually-driven system to one that largely automates the DOI process. The new automated features include: a) reviewing the DOI metadata, b) assigning an opaque DOI name if the data provider chooses, and c) reserving, registering, and updating DOIs. The flexibility of reserving a DOI allows data providers to embed and test the DOI in the data product metadata before formally registering it with EZID. The DOI update process allows changing any DOI metadata except the DOI name, unless the name has not yet been registered. Currently, ESDIS has processed a total of 557 DOIs, of which 379 are registered with EZID and 178 are reserved with ESDIS. The DOI incorporates several metadata elements that effectively identify the data product and the source of availability. Of these elements, the Uniform Resource Locator (URL) attribute has the very important function of identifying the landing page, which describes the data product. ESDIS, in consultation with data providers in the Earth Science community, is currently developing landing page guidelines that specify the key data product descriptive elements to be included on each data product's landing page. This poster will describe in detail the unique automated process and

  19. Development of an integrated database management system to evaluate integrity of flawed components of nuclear power plant

    International Nuclear Information System (INIS)

    Mun, H. L.; Choi, S. N.; Jang, K. S.; Hong, S. Y.; Choi, J. B.; Kim, Y. J.

    2001-01-01

    The objective of this paper is to develop an NPP-IDBMS (Integrated DataBase Management System for Nuclear Power Plants) for evaluating the integrity of components of nuclear power plants using a relational data model. This paper describes the relational data model, structure and development strategy for the proposed NPP-IDBMS. The NPP-IDBMS consists of a database, a database management system and an interface part. The database part consists of plant, shape, operating condition, material properties and stress databases, which are required for the integrity evaluation of each component in nuclear power plants. For the development of the stress database, an extensive finite element analysis was performed for various components considering operational transients. The developed NPP-IDBMS will provide an efficient and accurate way to evaluate the integrity of flawed components

  20. Computer programme for control and maintenance and object oriented database: application to the realisation of an particle accelerator, the VIVITRON

    International Nuclear Information System (INIS)

    Diaz, A.

    1996-01-01

    The command and control system for the Vivitron, a new-generation electrostatic particle accelerator, has been implemented using workstations and front-end computers based on VME standards, within a UNIX/VxWorks environment. This architecture is distributed over an Ethernet network. Measurements and commands of the different sensors and actuators are concentrated in the front-end computers. The development of a second version of the software, giving better performance and more functionality, is described. X11-based communication is used to transmit all the necessary information from the front-end computers for the display of parameters on the graphic screens. All other communication between processes uses the Remote Procedure Call (RPC) method. The design of the system is based largely on the object-oriented database O2, which integrates a full description of the equipment and the code necessary to manage it. This code is generated by the database. This innovation permits easy maintenance of the system and removes the need for a specialist when adding new equipment. The new version of the command and control system has been progressively installed since August 1995. (author)

  1. Management Guidelines for Database Developers' Teams in Software Development Projects

    Science.gov (United States)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has been growing continually over the last several years. In some companies, DBDs are organized as a special team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team or what practices should be used during DBD work. Therefore, in this paper we have developed a set of management guidelines, which includes 8 fundamental tasks and 17 practices from the software development process, using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could be very useful for other companies that use a DBD team as well, and could contribute towards an increase in the efficiency of these teams in their work on software development projects.

  2. Optimized Database of Higher Education Management Using Data Warehouse

    Directory of Open Access Journals (Sweden)

    Spits Warnars

    2010-04-01

    Full Text Available The emergence of new higher education institutions has created competition in the higher education market, and a data warehouse can be used as an effective technology tool for increasing competitiveness in this market. A data warehouse produces reliable reports for the institution's high-level management in a short time, for faster and better decision making, not only on increasing the admission numbers of students, but also on the possibility of finding extraordinary, unconventional funds for the institution. The efficiency comparison was based on the length and number of processed records, total processed bytes, number of processed tables, time to run the query, and records produced on the OLTP database and the data warehouse. Efficiency percentages were measured by the formula for percentage increase, and the average efficiency percentage of 461,801.04% shows that using a data warehouse is more powerful and efficient than using an OLTP database. The data warehouse was modeled as a hypercube, created from the limited set of high-demand reports usually used by high-level management. In every fact and dimension table, fields are inserted which support constructive-merge loading, where the ETL (Extraction, Transformation and Loading) process is run based on the old and new files.
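    The efficiency comparison described above reduces to the ordinary percentage-increase formula applied per report and then averaged. A minimal sketch of that arithmetic, with invented per-report costs standing in for the paper's measurements (the numbers and names below are illustrative, not the study's data):

```python
def efficiency_gain_percent(oltp_cost, warehouse_cost):
    """Percentage improvement of the warehouse over the OLTP baseline,
    via the standard percentage-increase formula."""
    return (oltp_cost - warehouse_cost) / warehouse_cost * 100.0

# Invented per-report costs (e.g. records scanned) for three reports;
# averaging per-report gains mirrors the paper's average-efficiency figure.
pairs = [(120000, 400), (45000, 300), (9000, 90)]
gains = [efficiency_gain_percent(o, w) for o, w in pairs]
average_gain = sum(gains) / len(gains)
```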

  3. A Method for Resetting the Root-Level Password in the MySQL Relational Database Management System (RDBMS)

    Directory of Open Access Journals (Sweden)

    Taqwa Hariguna

    2011-08-01

    Full Text Available A database is essential for storing data; with a database, an organization gains benefits in several respects, such as access speed and reduced paper use. However, when a database is deployed, it is not uncommon for the database administrator to forget the password in use, which complicates database administration. This study aims to explore how to reset the root-level password in the MySQL relational database management system.

  4. A framework for cross-observatory volcanological database management

    Science.gov (United States)

    Aliotta, Marco Antonio; Amore, Mauro; Cannavò, Flavio; Cassisi, Carmelo; D'Agostino, Marcello; Dolce, Mario; Mastrolia, Andrea; Mangiagli, Salvatore; Messina, Giuseppe; Montalto, Placido; Fabio Pisciotta, Antonino; Prestifilippo, Michele; Rossi, Massimo; Scarpato, Giovanni; Torrisi, Orazio

    2017-04-01

    In the last years, it has been clearly shown that the multiparametric approach is the winning strategy to investigate the complex dynamics of volcanic systems. This involves the use of different sensor networks, each one dedicated to the acquisition of particular data useful for research and monitoring. The increasing interest devoted to the study of volcanological phenomena has led to the establishment of different research organizations and observatories, sometimes covering the same volcanoes, which acquire large amounts of data from sensor networks for multiparametric monitoring. At INGV we developed a framework, hereinafter called TSDSystem (Time Series Database System), which acquires data streams from several geophysical and geochemical permanent sensor networks (also represented by different data sources such as ASCII, ODBC, URL, etc.), located in the main volcanic areas of Southern Italy, and relates them within a relational database management system. Furthermore, spatial data related to the different datasets are managed using a GIS module for sharing and visualization purposes. The standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them on a common space and time scale. In order to share data between INGV observatories, and also with Civil Protection, whose activity concerns the same volcanic districts, we designed a "Master View" system that, starting from a number of instances of the TSDSystem framework (one for each observatory), makes possible the joint interrogation of data, both temporal and spatial, on instances located in different observatories, through the use of web services technology (RESTful, SOAP). Similarly, it provides metadata for equipment using standard schemas (such as FDSN StationXML). The "Master View" is also responsible for managing the data policy through a "who owns what" system, which allows you to associate viewing/download of
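    A core operation enabled by the TSDSystem standardization — synchronizing measures from different networks onto a common time scale — can be sketched as nearest-sample resampling onto a shared grid. This is an illustrative stdlib-only sketch, not TSDSystem code; the series, the grid, and the `max_gap` rule are invented:

```python
from bisect import bisect_left

def resample_nearest(times, values, grid, max_gap):
    """Align a time series onto a common grid by nearest sample,
    emitting None where no sample lies within max_gap seconds."""
    out = []
    for t in grid:
        i = bisect_left(times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda j: abs(times[j] - t))
        out.append(values[j] if abs(times[j] - t) <= max_gap else None)
    return out

# One network sampled at irregular instants (epoch seconds), aligned
# onto a regular 10-second grid shared with other networks.
seismic_t, seismic_v = [0, 11, 19, 32], [1.0, 1.2, 1.1, 0.9]
grid = [0, 10, 20, 30, 40]
aligned = resample_nearest(seismic_t, seismic_v, grid, max_gap=5)
```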

  5. Learning Objectives for Master's theses at DTU Management Engineering

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Rasmussen, Birgitte; Hinz, Hector Nøhr

    2010-01-01

    …different. The DTU Study Handbook states that: "Learning objectives are an integrated part of the supervision", which provides you with the opportunity – naturally in cooperation with your supervisor – to formulate learning objectives for your Master's thesis. There are at least three good reasons for being … that you formulate precise and useful learning objectives for your Master's thesis. These notes of inspiration have been written to help you do exactly this. The notes discuss the requirements for the learning objectives, examples of learning objectives and the assessment criteria defined by DTU Management Engineering, as well as, not least, some useful things to remember concerning your submission and the assessment of the Master's thesis. DTU Management Engineering: Claus Thorp Hansen, Birgitte Rasmussen, Hector Nøhr Hinz. © DTU Management Engineering 2010, ISBN 978-87-90855-94-7. This document…

  6. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    Science.gov (United States)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.

  7. The Kepler DB: a database management system for arrays, sparse arrays, and binary data

    Science.gov (United States)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-07-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database management system (Kepler DB) was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.
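    Kepler DB itself is a custom store, but the data model described above — one-dimensional arrays, sparse arrays and binary large objects, written transactionally under an identifier — can be illustrated with a hedged sketch: a sparse float array is packed into a blob and stored under a series key, with Python's stdlib `sqlite3` standing in for the transactional backend. The schema and helper names are invented for this example:

```python
import sqlite3
import struct

def pack_sparse(indices, values):
    """Encode a sparse 1-D float array as a blob: count, indices, values."""
    n = len(indices)
    return struct.pack(f"<I{n}i{n}d", n, *indices, *values)

def unpack_sparse(blob):
    """Decode a blob produced by pack_sparse back into index/value lists."""
    (n,) = struct.unpack_from("<I", blob, 0)
    indices = struct.unpack_from(f"<{n}i", blob, 4)
    values = struct.unpack_from(f"<{n}d", blob, 4 + 4 * n)
    return list(indices), list(values)

# sqlite3 stands in for the transactional store; one row per series.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE timeseries (series_id TEXT PRIMARY KEY, payload BLOB)")

with db:  # transactional write: committed as a unit, rolled back on error
    blob = pack_sparse([3, 17, 42], [1.5, -2.25, 7.0])
    db.execute("INSERT INTO timeseries VALUES (?, ?)", ("pixel:000123", blob))

row = db.execute("SELECT payload FROM timeseries WHERE series_id = ?",
                 ("pixel:000123",)).fetchone()
indices, values = unpack_sparse(row[0])
```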

  8. Advances in probabilistic databases for uncertain information management

    CERN Document Server

    Yan, Li

    2013-01-01

    This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management. It aims to provide a single account of current studies in probabilistic data management. The objective of the book is to provide state-of-the-art information to researchers, practitioners, and graduate students in the information technology of intelligent information processing, while at the same time serving the information technology professional faced with non-traditional applications that make the application of conventional approaches difficult or impossible.

  9. Analysis of disease-associated objects at the Rat Genome Database

    Science.gov (United States)

    Wang, Shur-Jen; Laulederkind, Stanley J. F.; Hayman, G. T.; Smith, Jennifer R.; Petri, Victoria; Lowry, Timothy F.; Nigam, Rajni; Dwinell, Melinda R.; Worthey, Elizabeth A.; Munzenmaier, Diane H.; Shimoyama, Mary; Jacob, Howard J.

    2013-01-01

    The Rat Genome Database (RGD) is the premier resource for genetic, genomic and phenotype data for the laboratory rat, Rattus norvegicus. In addition to organizing biological data from rats, the RGD team focuses on manual curation of gene–disease associations for rat, human and mouse. In this work, we have analyzed disease-associated strains, quantitative trait loci (QTL) and genes from rats. These disease objects form the basis for seven disease portals. Among disease portals, the cardiovascular disease and obesity/metabolic syndrome portals have the highest number of rat strains and QTL. These two portals share 398 rat QTL, and these shared QTL are highly concentrated on rat chromosomes 1 and 2. For disease-associated genes, we performed gene ontology (GO) enrichment analysis across portals using RatMine enrichment widgets. Fifteen GO terms, five from each GO aspect, were selected to profile enrichment patterns of each portal. Of the selected biological process (BP) terms, ‘regulation of programmed cell death’ was the top enriched term across all disease portals except in the obesity/metabolic syndrome portal where ‘lipid metabolic process’ was the most enriched term. ‘Cytosol’ and ‘nucleus’ were common cellular component (CC) annotations for disease genes, but only the cancer portal genes were highly enriched with ‘nucleus’ annotations. Similar enrichment patterns were observed in a parallel analysis using the DAVID functional annotation tool. The relationship between the preselected 15 GO terms and disease terms was examined reciprocally by retrieving rat genes annotated with these preselected terms. The individual GO term–annotated gene list showed enrichment in physiologically related diseases. For example, the ‘regulation of blood pressure’ genes were enriched with cardiovascular disease annotations, and the ‘lipid metabolic process’ genes with obesity annotations. Furthermore, we were able to enhance enrichment of neurological
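    The GO enrichment analyses mentioned above (RatMine widgets, DAVID) rest on a hypergeometric over-representation test: given N annotated genes of which K carry a term, how surprising is it to see at least k carriers in a disease gene list of size n? A stdlib-only sketch; the gene counts below are invented, and this is not the RatMine or DAVID implementation:

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance of drawing
    at least k term-annotated genes in a list of n, under no enrichment."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Invented numbers: 20,000 rat genes, 400 annotated with
# 'lipid metabolic process', and a 150-gene obesity portal
# list containing 12 of them (expected ~3 by chance).
p = hypergeom_enrichment_p(20000, 400, 150, 12)
```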

  10. Identifying marine pelagic ecosystem management objectives and indicators

    DEFF Research Database (Denmark)

    Trenkel, Verena M.; Hintzen, Niels T.; Farnsworth, Keith D.

    2015-01-01

    International policy frameworks such as the Common Fisheries Policy and the European Marine Strategy Framework Directive define high-level strategic goals for marine ecosystems. Strategic goals are addressed via general and operational management objectives. To add credibility and legitimacy … Overall, 26 objectives were proposed, with 58% agreement in proposed objectives between two workshops. Based on published evidence for pressure-state links, examples of operational objectives and suitable indicators for each of the 26 objectives were then selected. It is argued that given the strong … scale in some cases. In the evidence-based approach used in this study, the selection of species- or region-specific operational objectives and indicators was based on demonstrated pressure-state links. Hence observed changes in indicators can reliably inform appropriate management measures. (C) 2015…

  11. Food safety objective: an integral part of food chain management

    NARCIS (Netherlands)

    Gorris, L.G.M.

    2005-01-01

    The concept of food safety objective has been proposed to provide a target for operational food safety management, leaving flexibility in the way equivalent food safety levels are achieved by different food chains. The concept helps to better relate operational food safety management to public

  12. Management by Objectives: When and How Does it Work?

    Science.gov (United States)

    DeFee, Dallas T.

    1977-01-01

    According to the author, management by objectives (formal goal setting and review) depends a great deal on the kinds of goals the organization has and the commitment of top management to the process. He discusses its potential advantages and disadvantages, conditions for adopting it, and successful implementation. (JT)

  13. Representing clinical communication knowledge through database management system integration.

    Science.gov (United States)

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

    Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels pose a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough literature review of methodologies to identify the strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study the clinical communication activities among Intensive Care Unit (ICU) clinical teams, with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument, and we introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository.

  14. Information Management in Creative Engineering Design and Capabilities of Database Transactions

    DEFF Research Database (Denmark)

    Jacobsen, Kim; Eastman, C. A.; Jeng, T. S.

    1997-01-01

    This paper examines the information management requirements and sets forth the general criteria for collaboration and concurrency control in creative engineering design. Our work attempts to recognize the full range of concurrency, collaboration and complex transaction structures now practiced in manual and semi-automated design, and the range of capabilities needed as the demand for enhanced but flexible electronic information management unfolds. The objective of this paper is to identify new issues that may advance the use of databases to support creative engineering design. We start with a generalized description of the structure of design tasks and how information management in design is dealt with today. After this review, we identify extensions to current information management capabilities that have been realized and/or proposed to support/augment what designers can do now. Given…

  15. Managing Database Services: An Approach Based in Information Technology Services Availabilty and Continuity Management

    Directory of Open Access Journals (Sweden)

    Leonardo Bastos Pontes

    2017-01-01

    Full Text Available This paper is set in the context of information technology service management, drawing on ideas from information technology governance, and proposes a hybrid model to manage the services of a database in a supplementary health operator, based on the principles of information technology service management. The approach draws on fundamental nuances of service management guides such as CMMI for Services, COBIT, ISO 20000, ITIL and MPS.BR for Services; like most of these guides, it studies Availability Management and Continuity Management together. This work is important because it keeps a good flow in the database and improves the agility of the systems in the clinics accredited to the health plan.

  16. Objectively measured physical activity and sedentary time in youth: the International children's accelerometry database (ICAD).

    Science.gov (United States)

    Cooper, Ashley R; Goodman, Anna; Page, Angie S; Sherar, Lauren B; Esliger, Dale W; van Sluijs, Esther M F; Andersen, Lars Bo; Anderssen, Sigmund; Cardon, Greet; Davey, Rachel; Froberg, Karsten; Hallal, Pedro; Janz, Kathleen F; Kordas, Katarzyna; Kreimler, Susi; Pate, Russ R; Puder, Jardena J; Reilly, John J; Salmon, Jo; Sardinha, Luis B; Timperio, Anna; Ekelund, Ulf

    2015-09-17

    Physical activity and sedentary behaviour in youth have been reported to vary by sex, age, weight status and country. However, supporting data are often self-reported and/or do not encompass a wide range of ages or geographical locations. This study aimed to describe objectively-measured physical activity and sedentary time patterns in youth. The International Children's Accelerometry Database (ICAD) consists of ActiGraph accelerometer data from 20 studies in ten countries, processed using common data reduction procedures. Analyses were conducted on 27,637 participants (2.8-18.4 years) who provided at least three days of valid accelerometer data. Linear regression was used to examine associations between age, sex, weight status, country and physical activity outcomes. Boys were less sedentary and more active than girls at all ages. After 5 years of age there was an average cross-sectional decrease of 4.2% in total physical activity with each additional year of age, due mainly to lower levels of light-intensity physical activity and greater time spent sedentary. Physical activity did not differ by weight status in the youngest children, but from age seven onwards, overweight/obese participants were less active than their normal weight counterparts. Physical activity varied between samples from different countries, with a 15-20% difference between the highest and lowest countries at age 9-10 and a 26-28% difference at age 12-13. Physical activity differed between samples from different countries, but the associations between demographic characteristics and physical activity were consistently observed. Further research is needed to explore environmental and sociocultural explanations for these differences.
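    The common data-reduction procedures ICAD applies to ActiGraph data typically classify each epoch by count cut-points and then total time per intensity band. A minimal sketch of that step; the cut-point values and the example day below are illustrative placeholders, not ICAD's actual thresholds or data:

```python
def classify_epochs(counts_per_min, sedentary_max=100, light_max=2295):
    """Label each 1-minute epoch by activity intensity using count
    cut-points. The default thresholds are illustrative placeholders."""
    labels = []
    for c in counts_per_min:
        if c <= sedentary_max:
            labels.append("sedentary")
        elif c <= light_max:
            labels.append("light")
        else:
            labels.append("mvpa")  # moderate-to-vigorous physical activity
    return labels

def minutes_by_intensity(labels):
    """Total minutes spent in each intensity band."""
    summary = {"sedentary": 0, "light": 0, "mvpa": 0}
    for lab in labels:
        summary[lab] += 1
    return summary

# Invented eight-minute stretch of accelerometer counts.
day = [0, 50, 150, 800, 3000, 5000, 40, 2296]
summary = minutes_by_intensity(classify_epochs(day))
```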

  17. Enhanced DIII-D Data Management Through a Relational Database

    Science.gov (United States)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Documentation on the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
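    The cross-shot data mining described above is plain SQL; here is a hedged sketch with Python's stdlib `sqlite3` standing in for the DIII-D relational server, and with the table and column names invented for illustration:

```python
import sqlite3

# In-memory sqlite3 stands in for the experiment's relational database.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE shot_summary (
    shot INTEGER PRIMARY KEY,  -- discharge number
    ip_max REAL,               -- peak plasma current (invented units)
    beta_n REAL)               -- normalized beta (invented values)""")
rows = [(131001, 1.2, 1.8), (131002, 1.5, 2.4),
        (131003, 0.9, 1.1), (131004, 1.6, 2.9)]
db.executemany("INSERT INTO shot_summary VALUES (?, ?, ?)", rows)

# A query across multiple shots: every discharge whose normalized beta
# exceeded 2.0, ordered by peak plasma current.
high_beta = db.execute(
    "SELECT shot FROM shot_summary WHERE beta_n > 2.0 ORDER BY ip_max"
).fetchall()
shots = [r[0] for r in high_beta]
```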

  18. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  19. National Levee Database: monitoring, vulnerability assessment and management in Italy

    Science.gov (United States)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    Italian levees and historical breach failures to be exploited in the framework of an operational procedure addressed to the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. By its structure, INLED is a dynamic geospatial database, with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database aims to provide the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al. 2014); vi) management, control and maintenance; vii) flood hazard maps, developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, Central Italy. References: Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6, (2), 149-162. Camici S., Barbetta S., Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press. Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994. H.R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19, (4), 717-731.

  20. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain databases for the organizations. An organization's departments can be located at different sites and can be connected in an intranet environment. In such an environment, maintenance of database records becomes a task of considerable complexity which needs to be resolved. In this paper an intranet application is designed an...

  1. The Net Enabled Waste Management Database as an international source of radioactive waste management information

    International Nuclear Information System (INIS)

    Csullog, G.W.; Friedrich, V.; Miaw, S.T.W.; Tonkay, D.; Petoe, A.

    2002-01-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an integral part of the IAEA's policies and strategy related to the collection and dissemination of information, both internal to the IAEA in support of its activities and external to the IAEA (publicly available). The paper highlights the NEWMDB's role in relation to the routine reporting of status and trends in radioactive waste management, in assessing the development and implementation of national systems for radioactive waste management, in support of a newly developed indicator of sustainable development for radioactive waste management, in support of reporting requirements for the Joint Convention on the Safety of Spent Fuel Management and on the Safety of Radioactive Waste Management, in support of IAEA activities related to the harmonization of waste management information at the national and international levels and in relation to the management of spent/disused sealed radioactive sources. (author)

  2. Drug residues in urban water: A database for ecotoxicological risk management.

    Science.gov (United States)

    Destrieux, Doriane; Laurent, François; Budzinski, Hélène; Pedelucq, Julie; Vervier, Philippe; Gerino, Magali

    2017-12-31

    Human-use drug residues (DR) are only partially eliminated by waste water treatment plants (WWTPs), so that residual amounts can reach natural waters and cause environmental hazards. In order to properly manage these hazards in the aquatic environment, a database is made available that integrates the concentration ranges of DR which cause adverse effects in aquatic organisms, and the temporal variations of the ecotoxicological risks. To implement this database for ecotoxicological risk assessment (the ERA database), the required information for each DR is the predicted no-effect concentration (PNEC), along with the predicted environmental concentration (PEC). The risk assessment is based on the ratio between the PECs and the PNECs. Adverse effect data or PNECs have been found in the publicly available literature for 45 substances. These ecotoxicity test data have been extracted from 125 different sources. The ERA database contains 1157 adverse effect data and 287 PNECs. The efficiency of the ERA database was tested with a data set coming from a simultaneous survey of WWTPs and the natural environment. In this data set, 26 DR were searched for in two WWTPs and in the river. On five sampling dates, concentrations measured in the river for 10 DR could pose environmental problems, of which 7 were measured only downstream of WWTP outlets. Integrating literature data and measurements, with unit homogenisation, in a single database facilitates the actual ecotoxicological risk assessment and may be useful for assessing risks from data arising from future field surveys. Moreover, the accumulation of a large ecotoxicity data set in a single database should not only improve knowledge of higher-risk molecules but also supply an objective tool to help the rapid and efficient evaluation of risk. Copyright © 2017 Elsevier B.V. All rights reserved.
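    The screening comparison at the heart of such an ERA database is the standard risk quotient, RQ = PEC / PNEC, with RQ ≥ 1 flagging a potential risk. A minimal sketch; the drug names and concentration values below are invented for illustration, not taken from the survey:

```python
def risk_quotient(pec_ng_l, pnec_ng_l):
    """Risk quotient: predicted (or measured) environmental concentration
    over the predicted no-effect concentration. RQ >= 1 flags a risk."""
    return pec_ng_l / pnec_ng_l

# Invented example values (ng/L) for two drug residues downstream of a WWTP.
survey = {"carbamazepine": (120.0, 2500.0), "diclofenac": (180.0, 100.0)}
flagged = sorted(drug for drug, (pec, pnec) in survey.items()
                 if risk_quotient(pec, pnec) >= 1.0)
```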

  3. Application of database management software to probabilistic risk assessment calculations

    International Nuclear Information System (INIS)

    Wyss, G.D.

    1993-01-01

    Probabilistic risk assessment (PRA) calculations require the management and processing of large amounts of information. For example, a commercial nuclear power plant PRA study makes use of plant blueprints and system schematics, formal plant safety analysis reports, incident reports, letters, memos, handwritten notes from plant visits, and even the analyst's "engineering judgment". This information must be documented and cross-referenced in order to properly execute and substantiate the models used in a PRA study. The numerical data normally fall into two general categories. The first category is composed of raw data that are accumulated from equipment testing and operational experiences. These data describe the equipment, its service or testing conditions, its failure mode, and its performance history. The second category is composed of statistical distributions. These distributions can represent probabilities, frequencies, or values of important parameters that are not time-related. Probability and frequency distributions are often obtained by fitting raw data to an appropriate statistical distribution. Database management software is used to store both types of data so that they can be readily queried, manipulated, and archived. This paper provides an overview of the information models used for storing PRA data and illustrates the implementation of these models using examples from current PRA software packages.
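
    The two data categories described above can be sketched as a toy schema; the table and column names, and the simple point-estimate fit, are illustrative assumptions rather than the layout of any actual PRA package:

```python
import sqlite3

# Minimal sketch of the two-category PRA data model: raw operational data
# vs. fitted statistical distributions. Names are invented for illustration.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE raw_events (
    component    TEXT,
    failure_mode TEXT,
    op_hours     REAL,
    n_failures   INTEGER
);
CREATE TABLE distributions (
    parameter TEXT,   -- e.g. failure rate of a pump
    dist_type TEXT,   -- e.g. 'exponential'
    rate      REAL    -- fitted parameter
);
""")
db.execute("INSERT INTO raw_events VALUES ('pump-A', 'fail-to-start', 20000, 4)")

# "Fit" the raw data to a distribution (crude point estimate: failures / hours).
hours, failures = db.execute(
    "SELECT op_hours, n_failures FROM raw_events WHERE component='pump-A'"
).fetchone()
db.execute("INSERT INTO distributions VALUES ('pump-A rate', 'exponential', ?)",
           (failures / hours,))
rate, = db.execute("SELECT rate FROM distributions").fetchone()
print(rate)  # 4 failures over 20000 h = 2e-4 per hour
```

    Keeping the raw evidence and the fitted distribution in separate, cross-referenced tables is what lets the PRA database answer both "where did this number come from?" and "what number do I use?".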

  4. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    Science.gov (United States)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm consisting of four steps. In conclusion, we analyze password encryption in the Oracle Database.

  5. Accomplishing the objectives of NEA in radioactive waste management

    International Nuclear Information System (INIS)

    Olivier, J.P.; Stadie, K.B.

    1984-01-01

    The objectives of the Nuclear Energy Agency of the OECD in the area of radioactive waste management are to promote studies and improve the data available to support national programmes, to co-ordinate national activities, to promote international projects, and to improve the general level of understanding of waste management issues. The NEA programme concentrates on the disposal of waste and responds to objectives at three levels: sharing of information and organization of joint analytical studies through expert meetings, preparation of technical reports and analysis and dissemination of data; establishment of joint research and development projects designed to support national programmes; and discussion of current issues and strategies, particularly through the Radioactive Waste Management Committee acting as a specialized international forum. The paper discusses, through various specific examples, how the objectives are met. In addition, the paper describes current NEA activities which have not been reported in other papers during the Conference. (author)

  6. Variability in perceived satisfaction of reservoir management objectives

    Science.gov (United States)

    Owen, W.J.; Gates, T.K.; Flug, M.

    1997-01-01

    Fuzzy set theory provides a useful model to address imprecision in interpreting linguistically described objectives for reservoir management. Fuzzy membership functions can be used to represent degrees of objective satisfaction for different values of management variables. However, lack of background information, differing experiences and qualifications, and complex interactions of influencing factors can contribute to significant variability among membership functions derived from surveys of multiple experts. In the present study, probabilistic membership functions are used to model variability in experts' perceptions of satisfaction of objectives for hydropower generation, fish habitat, kayaking, rafting, and scenery preservation on the Green River through operations of Flaming Gorge Dam. Degree of variability in experts' perceptions differed among objectives but resulted in substantial uncertainty in estimation of optimal reservoir releases.
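
    A fuzzy membership function of the kind elicited from the experts can be sketched as a trapezoid over a management variable; the release values below are hypothetical, not numbers from the Green River study:

```python
# Sketch of a trapezoidal fuzzy membership function expressing "degree of
# objective satisfaction" as a function of reservoir release (cfs).
# Breakpoints are invented for illustration.

def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Degree of satisfaction in [0, 1]; full satisfaction on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (d - x) / (d - c)       # falling edge

# e.g. suppose kayaking is fully satisfied for releases of 1500-2500 cfs
print(trapezoid(2000, 800, 1500, 2500, 4000))  # 1.0
print(trapezoid(1150, 800, 1500, 2500, 4000))  # 0.5
```

    Variability among experts can then be modelled by treating the breakpoints a, b, c, d themselves as random variables, which is the spirit of the probabilistic membership functions used in the study.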

  7. The main objectives of lifetime management of reactor unit components

    International Nuclear Information System (INIS)

    Dragunov, Y.; Kurakov, Y.

    1998-01-01

    The main objectives of the work concerned with life management of reactor components in Russian Federation are as follows: development of regulations in the field of NPP components ageing and lifetime management; investigations of ageing processes; residual life evaluation taking into account the actual state of NPP systems, real loading conditions and number of load cycles, results of in-service inspections; development and implementation of measures for maintaining/enhancing the NPP safety

  8. Study on Mandatory Access Control in a Secure Database Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper proposes a security policy model for mandatory access control in a class B1 database management system whose level of labeling is the tuple. The relation-hierarchical data model is extended to a multilevel relation-hierarchical data model. Based on the multilevel relation-hierarchical data model, the concept of upper-lower layer relational integrity is presented after we analyze and eliminate the covert channels caused by the database integrity. Two SQL statements are extended to process polyinstantiation in the multilevel secure environment. The system is based on the multilevel relation-hierarchical data model and is capable of integratively storing and manipulating multilevel complicated objects (e.g., multilevel spatial data) and multilevel conventional data (e.g., integer, real number and character string).
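
    Tuple-level label checking of this general kind can be sketched with a classic Bell-LaPadula dominance test; the levels, categories and tuples below are illustrative, and the paper's multilevel relation-hierarchical model is considerably richer than this:

```python
# Generic sketch of tuple-level mandatory access control: a subject may
# read a tuple only if its clearance dominates the tuple's label
# ("no read up"). Labels and data are invented for illustration.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

def dominates(subject, obj):
    """Clearance (level, categories) dominates a label if its level is at
    least as high and its category set is a superset."""
    s_level, s_cats = subject
    o_level, o_cats = obj
    return LEVELS[s_level] >= LEVELS[o_level] and o_cats <= s_cats

tuples = [
    ({"id": 1, "city": "Springfield"}, ("unclassified", set())),
    ({"id": 2, "city": "Area-51"},     ("secret", {"geo"})),
]
clearance = ("secret", {"geo", "ops"})
visible = [row for row, label in tuples if dominates(clearance, label)]
print([row["id"] for row in visible])  # [1, 2]
```

    Polyinstantiation arises when subjects at different levels must each see their own version of the "same" tuple, which is why the paper extends two SQL statements rather than relying on filtering alone.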

  9. The Aalborg Model and management by objectives and resources

    DEFF Research Database (Denmark)

    Qvist, Palle; Spliid, Claus Monrad

    2010-01-01

    Whether the Aalborg Model is successful has never been the subject of a scientific study. An educational program in an HEI (Higher Education Institution) can be seen and understood as a system managed by objectives (MBO) within a given resource frame and based on an “agreement” between the student and the study board. … The student must achieve the objectives decided by the study board, and that achievement is then documented with an exam. The study board supports the student with resources which help them to fulfill the objectives. When the resources are divided into human, material and methodological resources …

  10. Military Personnel: DOD Has Processes for Operating and Managing Its Sexual Assault Incident Database

    Science.gov (United States)

    2017-01-01

    MILITARY PERSONNEL: DOD Has Processes for Operating and Managing Its Sexual Assault Incident Database. Report to … DSAID’s system speed and ease of use; interfaces with MCIO databases; utility as a case management tool; and users’ ability to query data … What GAO Found: As of October 2013, the Department of Defense’s (DOD) Defense Sexual Assault Incident …

  11. Effective Use of Java Data Objects in Developing Database Applications; Advantages and Disadvantages

    National Research Council Canada - National Science Library

    Zilidis, Paschalis

    2004-01-01

    … The major disadvantage of this approach is the well-known impedance mismatch, in which some form of mapping is required to connect the objects in the frontend and the relational tuples in the backend. Java Data Objects (JDO…

  12. Objective, Way and Method of Faculty Management Based on Ergonomics

    Science.gov (United States)

    WANG, Hong-bin; Liu, Yu-hua

    2008-01-01

    The core problem that influences educational quality of talents in colleges and universities is the faculty management. Without advanced faculty, it is difficult to cultivate excellent talents. With regard to some problems in present faculty construction of colleges and universities, this paper puts forward the new objectives, ways and methods of…

  13. Load building versus conservation as demand-side management objectives

    International Nuclear Information System (INIS)

    Kexel, D.T.

    1994-01-01

    This paper examines the economics of load building versus conservation as demand-side management objectives. Economic criteria to be used in evaluating each type of program from the perspectives of all impacted parties are provided. The impact of DSM programs on electric rates is shown to be a key focal point of a thorough evaluation

  14. Management by Objectives: The Swedish Experience in Upper Secondary Schools

    Science.gov (United States)

    Lindberg, Erik; Wilson, Timothy L.

    2011-01-01

    Purpose: This paper seeks to explore how managing by objectives (MBO) has been adopted in Swedish schools and to reflect on some of the consequences in a longitudinal study. Results relate to whether introduction has increased student performance and whether it works as a tool for the principals to create more effective schools.…

  15. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integration of different geospatial data into the database management system and their visualization on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on t…

  16. UK experience of managing a radioactive materials transport event database

    International Nuclear Information System (INIS)

    Barton, N.J.; Barrett, J.A.

    1999-01-01

    A description is given of the transport event database RAMTED and the related annual accident and incident reports. This database covers accidents and incidents involving the transport of radioactive material in the UK from 1958 to the present day. The paper discusses the history and content of the database, the origin of event data contained in it, the criteria for inclusion and future developments. (author)

  17. National information network and database system of hazardous waste management in China

    Energy Technology Data Exchange (ETDEWEB)

    Ma Hongchang [National Environmental Protection Agency, Beijing (China)

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  18. Dynamic tables: an architecture for managing evolving, heterogeneous biomedical data in relational database management systems.

    Science.gov (United States)

    Corwin, John; Silberschatz, Avi; Miller, Perry L; Marenco, Luis

    2007-01-01

    Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models.
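
    The vertical (object-attribute-value) schema that the paper contrasts with conventional relational design can be sketched with SQLite; the table and attribute names are illustrative:

```python
import sqlite3

# Sketch of an entity-attribute-value (vertical) schema for sparse,
# evolving biomedical data. Names are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE eav (entity INTEGER, attribute TEXT, value TEXT)")
rows = [
    (1, "diagnosis", "hypertension"),
    (1, "systolic_bp", "150"),
    (2, "genotype", "APOE-e4"),   # patient 2 has entirely different attributes
]
db.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

# Schema evolution is just a new attribute string -- no ALTER TABLE needed.
db.execute("INSERT INTO eav VALUES (2, 'mri_volume_ml', '1180')")

# Pivot one entity back into a record at query time.
record = dict(db.execute(
    "SELECT attribute, value FROM eav WHERE entity = 2").fetchall())
print(record)
```

    The flexibility comes at the cost of the pivot at query time, which is exactly the overhead the paper's sparse, column-store engine is designed to remove.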

  19. Multi-Dimensional Bitmap Indices for Optimising Data Access within Object Oriented Databases at CERN

    CERN Document Server

    Stockinger, K

    2001-01-01

    Efficient query processing in high-dimensional search spaces is an important requirement for many analysis tools. In the literature on index data structures one can find a wide range of methods for optimising database access. In particular, bitmap indices have recently gained substantial popularity in data warehouse applications with large amounts of read mostly data. Bitmap indices are implemented in various commercial database products and are used for querying typical business applications. However, scientific data that is mostly characterised by non-discrete attribute values cannot be queried efficiently by the techniques currently supported. In this thesis we propose a novel access method based on bitmap indices that efficiently handles multi-dimensional queries against typical scientific data. The algorithm is called GenericRangeEval and is an extension of a bitmap index for discrete attribute values. By means of a cost model we study the performance of queries with various selectivities against uniform...
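
    The basic idea of bitmap indices over binned attribute values, which GenericRangeEval extends to non-discrete scientific data, can be sketched as follows (a deliberately simplified equality-encoded index, not the thesis's algorithm):

```python
# Minimal sketch of an equality-encoded bitmap index: each attribute bin
# gets one bitmap (here an int used as a bit vector), and a range query
# ORs the matching bins. Data and bin edges are invented for illustration.

values = [0.1, 2.5, 7.8, 2.9, 5.0, 9.6]                  # attribute of 6 rows
bins = [(0, 2.5), (2.5, 5.0), (5.0, 7.5), (7.5, 10.0)]   # [lo, hi) intervals

bitmaps = []
for lo, hi in bins:
    bm = 0
    for row, v in enumerate(values):
        if lo <= v < hi:
            bm |= 1 << row        # set the bit for this row
    bitmaps.append(bm)

def range_query(lo_bin: int, hi_bin: int) -> list:
    """Rows whose value falls in bins [lo_bin, hi_bin], via bitwise OR."""
    result = 0
    for b in range(lo_bin, hi_bin + 1):
        result |= bitmaps[b]
    return [row for row in range(len(values)) if result >> row & 1]

print(range_query(2, 3))  # rows with value >= 5.0
```

    Real range-encoded schemes additionally handle predicates whose endpoints fall inside a bin (candidate rows must be re-checked against the raw data), which is where the cost model in the thesis comes in.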

  20. Design of database management system for 60Co container inspection system

    International Nuclear Information System (INIS)

    Liu Jinhui; Wu Zhifang

    2007-01-01

    The functions of the database management system have been designed according to the features of the cobalt-60 container inspection system, and the related software has been constructed, including database querying and searching. The database operation program is built on Microsoft SQL Server and Visual C++ under Windows 2000. The software implements database querying, image and graph display, statistics, report forms and printing, interface design, etc. The software is powerful and flexible in operation and information querying, and it has been successfully used in the real database management system of the cobalt-60 container inspection system. (authors)

  1. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    Science.gov (United States)

    2015-09-15

    … (arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming … a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created …

  2. A Systematic Knowledge Management Approach Using Object-Oriented Theory in Customer Complaint Management

    Directory of Open Access Journals (Sweden)

    Wusheng Zhang

    2010-12-01

    Research into the effectiveness of customer complaint management has attracted researchers, yet there has been little discussion of customer complaint management in the context of a systematic knowledge management approach, particularly in the domain of the hotel industry. This paper aims to address such a gap through the application of object-oriented theory, for which the notation of the unified modelling language has been adopted for the representation of the concepts, objects, relationships and vocabularies in the domain. The paper used data from forty-seven hotel management staff and academics in hospitality management to investigate lessons learned and best practices in customer complaint management and knowledge management. By providing insights into the potential of a knowledge management approach using object-oriented theory, this study advances our understanding of how a knowledge management approach can systematically support the management of hotel customer complaints.

  3. One approach for Management by Objectives and Results in Scandinavia?

    DEFF Research Database (Denmark)

    Kristiansen, Mads Bøge

    2016-01-01

    Viewed from abroad, Denmark, Norway and Sweden look very similar. In the literature on public management reforms and performance management, these countries are frequently regarded as one, and the literature often refers to a specific Nordic or Scandinavian model. The aim of this paper is to empirically test the argument concerning the existence of one Nordic perspective on performance management. The paper presents a comparative study of Management by Objectives and Results (MBOR) in Prison and Probation Services, Food Safety, and Meteorology in Denmark, Norway and Sweden. The paper examines … the contexts in which MBOR is implemented. An important implication therefore is that it is unlikely that there is ‘one best way’ of managing or steering an agency, and MBOR will appear and function differently in different contexts.

  4. Well Field Management Using Multi-Objective Optimization

    DEFF Research Database (Denmark)

    Hansen, Annette Kirstine; Hendricks Franssen, H. J.; Bauer-Gottwein, Peter

    2013-01-01

    … with infiltration basins, injection wells and abstraction wells. The two management objectives are to minimize the amount of water needed for infiltration and to minimize the risk of getting contaminated water into the drinking water wells. The management is subject to a daily demand fulfilment constraint. Two different optimization methods are tested: constant scheduling, where decision variables are held constant during the time of optimization, and sequential scheduling, where the optimization is performed stepwise for daily time steps. The latter is developed to work in a real-time situation. Case study …

  5. Symposium overview: incorporating ecosystem objectives within fisheries management

    DEFF Research Database (Denmark)

    Gislason, Henrik; Sinclair, M.; Sainsbury, K.

    2000-01-01

    … into account ecosystem considerations. There was not, however, a consensus on what additional restrictions are required, or on what features of ecosystems need to be protected. A way forward is to add ecosystem objectives to the conservation component of fisheries management plans, as well as to the management … and a greater workload added to the process of provision of scientific advice through peer review. Of equal importance would be the challenges of establishing a governance framework to address multiple uses of marine resources. The spirit of the Symposium was that these coupled scientific and governance …

  6. Health technology management: a database analysis as support of technology managers in hospitals.

    Science.gov (United States)

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt itself to new improvements in medical equipment. Multidisciplinary approaches which consider the interaction of different technologies, their use and user skills are necessary in order to improve safety and quality. An easy and sustainable methodology is vital to Clinical Engineering (CE) services in healthcare organizations in order to define criteria regarding technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services, referring exclusively to the maintenance database of the CE department at the Careggi Hospital in Florence, Italy.

  7. The Erasmus insurance case and a related questionnaire for distributed database management systems

    NARCIS (Netherlands)

    S.C. van der Made-Potuijt

    1990-01-01

    textabstractThis is the third report concerning transaction management in the database environment. In the first report the role of the transaction manager in protecting the integrity of a database has been studied [van der Made-Potuijt 1989]. In the second report a model has been given for a

  8. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    Science.gov (United States)

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programing tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  9. Using Risk Assessment Methodologies to Meet Management Objectives

    Science.gov (United States)

    DeMott, D. L.

    2015-01-01

    Corporate and program objectives focus on desired performance and results. Management decisions that affect how to meet these objectives now involve a complex mix of technology, safety issues, operations, process considerations, employee considerations, regulatory requirements, financial concerns and legal issues. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate or eliminate costly mistakes or catastrophic failures. Using a risk assessment methodology is only a starting point; a risk assessment program provides management with important input in the decision-making process. A pro-active organization looks to the future to avoid problems, while a reactive organization can be blindsided by risks that could have been avoided. You get out what you put in: how useful your program is will be up to the individual organization.

  10. Environmental Management System Objectives & Targets Results Summary - FY 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Douglas W

    2016-02-01

    Sandia National Laboratories (SNL) Environmental Management System is the integrated approach for members of the workforce to identify and manage environmental risks. Each Fiscal Year (FY) SNL performs an analysis to identify environmental aspects, and the environmental programs associated with them are charged with the task of routinely monitoring and measuring the objectives and targets that are established to mitigate potential impacts of SNL's operations on the environment. An annual summary of the results achieved towards meeting established Sandia Corporation and SNL Site-specific objectives and targets provides a connection to, and rational for, annually revised environmental aspects. The purpose of this document is to summarize the results achieved and documented in FY 2015.

  11. Towards the management of the databases founded on descriptions ...

    African Journals Online (AJOL)

    The canonical model is defined in the concept language developed in our research … the notion of classes to produce descriptions which are also used in the reasoning process. … Key words: Description logics / Databases / Semantics.

  12. A multi-objective approach to solid waste management.

    Science.gov (United States)

    Galante, Giacomo; Aiello, Giuseppe; Enea, Mario; Panascia, Enrico

    2010-01-01

    The issue addressed in this paper consists in the localization and dimensioning of transfer stations, which constitute a necessary intermediate level in the logistic chain of the solid waste stream, from municipalities to the incinerator. Contextually, the determination of the number and type of vehicles involved is carried out in an integrated optimization approach. The model considers both initial investment and operative costs related to transportation and transfer stations. Two conflicting objectives are evaluated, the minimization of total cost and the minimization of environmental impact, measured by pollution. The design of the integrated waste management system is hence approached in a multi-objective optimization framework. To determine the best means of compromise, goal programming, weighted sum and fuzzy multi-objective techniques have been employed. The proposed analysis highlights how different attitudes of the decision maker towards the logic and structure of the problem result in the employment of different methodologies and the obtaining of different results. The novel aspect of the paper lies in the proposal of an effective decision support system for operative waste management, rather than a further contribution to the transportation problem. The model was applied to the waste management of optimal territorial ambit (OTA) of Palermo (Italy). 2010 Elsevier Ltd. All rights reserved.
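
    The weighted-sum technique named above can be sketched on a toy instance; the candidate configurations, cost and impact figures, and weights are invented for illustration:

```python
# Sketch of the weighted-sum method for a two-objective decision: pick the
# transfer-station configuration minimizing a weighted combination of
# (already normalized) total cost and environmental impact.

candidates = {            # option: (total cost, environmental impact)
    "2 small stations": (8.0, 3.0),
    "1 large station":  (6.0, 5.0),
    "no station":       (4.0, 9.0),
}

def weighted_sum(w_cost: float, w_env: float) -> str:
    """Minimize w_cost*cost + w_env*impact (weights should sum to 1)."""
    return min(candidates,
               key=lambda k: w_cost * candidates[k][0] + w_env * candidates[k][1])

print(weighted_sum(0.8, 0.2))  # a cost-driven decision maker
print(weighted_sum(0.2, 0.8))  # an environment-driven decision maker
```

    Varying the weights traces out different compromise solutions on the Pareto front, which is precisely why the paper stresses that the decision maker's attitude, expressed through goal programming, weighted sum or fuzzy techniques, changes the result.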

  13. A multi-objective approach to solid waste management

    International Nuclear Information System (INIS)

    Galante, Giacomo; Aiello, Giuseppe; Enea, Mario; Panascia, Enrico

    2010-01-01

    The issue addressed in this paper consists in the localization and dimensioning of transfer stations, which constitute a necessary intermediate level in the logistic chain of the solid waste stream, from municipalities to the incinerator. Contextually, the determination of the number and type of vehicles involved is carried out in an integrated optimization approach. The model considers both initial investment and operative costs related to transportation and transfer stations. Two conflicting objectives are evaluated, the minimization of total cost and the minimization of environmental impact, measured by pollution. The design of the integrated waste management system is hence approached in a multi-objective optimization framework. To determine the best means of compromise, goal programming, weighted sum and fuzzy multi-objective techniques have been employed. The proposed analysis highlights how different attitudes of the decision maker towards the logic and structure of the problem result in the employment of different methodologies and the obtaining of different results. The novel aspect of the paper lies in the proposal of an effective decision support system for operative waste management, rather than a further contribution to the transportation problem. The model was applied to the waste management of optimal territorial ambit (OTA) of Palermo (Italy).

  14. Concept and objectives of accident management in LWR type plants

    International Nuclear Information System (INIS)

    Herttrich, P.M.; Hicken, E.F.

    1990-01-01

    For the sake of putting the previous protection and prevention concept in its proper place, it is shown, first of all, on which basis the prevention against damages required according to the state of the art in science and technology was proved under the licensing practice applied so far. Secondly, the previous practice of dynamic upgrading of safety engineering and risk prevention is explained. The introduction of accident management measures is a consequent continuation of this practice. Concrete approaches and objectives of accident management are outlined; an overview of scientific and technical foundations for the development, assessment and introduction of accident management measures is given, and finally the most important organizational and procedural aspects are dealt with. (orig./DG) [de]

  15. SISSY: An example of a multi-threaded, networked, object-oriented database application

    International Nuclear Information System (INIS)

    Scipioni, B.; Liu, D.; Song, T.

    1993-05-01

    The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third-party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.

  16. Study on parallel and distributed management of RS data based on spatial database

    Science.gov (United States)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of earth-observing technology, RS image data storage, management and information publication have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, the background server can hardly handle the heavy processing of the great volume of RS data stored at different nodes in a distributed environment, which places a heavy burden on the background server. Second, there is no unified, standard and rational organization of multi-sensor RS data for storage and management, so much information is lost or omitted at storage time. Facing these two problems, the paper puts forward a framework for a parallel and distributed RS image data management and storage system, based on a parallel background server and a distributed data management system. To meet these two goals, the paper studies the following key techniques and draws some instructive conclusions. The paper puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands and periods is achieved. For data storage, RS data are not divided into binary large objects stored in a current relational database system; instead, they are reconstructed through the above solid index mechanism, and a logical image database for the RS image data files is constructed. For the system architecture, the paper sets up a framework based on a parallel server of several common computers, under which the background process is divided into two parts: the common Web process and the parallel process.
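
    The "Pyramid, Block, Layer, Epoch" solid index can be sketched as a composite tile key; the field layout and query interface below are assumptions for illustration, not the paper's actual encoding:

```python
from collections import namedtuple

# Hypothetical composite key locating a tile of a remote-sensing image by
# resolution level (pyramid), spatial block, spectral band (layer), and
# acquisition period (epoch). Values are invented for illustration.
TileKey = namedtuple("TileKey", "pyramid block layer epoch")

tiles = {
    TileKey(pyramid=2, block=(14, 7), layer="B4", epoch="2008-06"): b"...",
    TileKey(pyramid=2, block=(14, 8), layer="B4", epoch="2008-06"): b"...",
    TileKey(pyramid=0, block=(0, 0),  layer="B1", epoch="2009-01"): b"...",
}

def query(pyramid=None, layer=None, epoch=None):
    """Select tile keys matching any subset of the index dimensions."""
    return [k for k in tiles
            if (pyramid is None or k.pyramid == pyramid)
            and (layer is None or k.layer == layer)
            and (epoch is None or k.epoch == epoch)]

print(len(query(pyramid=2, layer="B4")))  # 2
```

    Because every dimension of the key is meaningful, tiles for one resolution, band or period can be selected without scanning opaque binary large objects, which is the organizational gain the paper claims for the solid index.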

  17. Design and Development of an Objective, Structured Management Examinations (OSMEs) on Management Skills among Pharmacy Students

    Science.gov (United States)

    Augustine, Jill

    2016-01-01

    The purpose of this study was to design, develop, and administer an Objective, Structured Management Exam (OSME) on management skills for pharmacy students. Pharmacy preceptors for the University of Arizona College of Pharmacy participated in focus groups that identified business, management, and human resource skills needed by pharmacy graduates.…

  18. Handling Emergency Management in [an] Object Oriented Modeling Environment

    Science.gov (United States)

    Tokgoz, Berna Eren; Cakir, Volkan; Gheorghe, Adrian V.

    2010-01-01

    It has been understood that protection of a nation from extreme disasters is a challenging task. Impacts of extreme disasters on a nation's critical infrastructures, economy and society could be devastating. A protection plan itself would not be sufficient when a disaster strikes. Hence, there is a need for a holistic approach to establish more resilient infrastructures to withstand extreme disasters. A resilient infrastructure can be defined as a system or facility that is able to withstand damage, but if affected, can be readily and cost-effectively restored. The key issue to establish resilient infrastructures is to incorporate existing protection plans with comprehensive preparedness actions to respond, recover and restore as quickly as possible, and to minimize extreme disaster impacts. Although national organizations will respond to a disaster, extreme disasters need to be handled mostly by local emergency management departments. Since emergency management departments have to deal with complex systems, they have to have a manageable plan and efficient organizational structures to coordinate all these systems. A strong organizational structure is the key in responding fast before and during disasters, and recovering quickly after disasters. In this study, the entire emergency management is viewed as an enterprise and modelled through enterprise management approach. Managing an enterprise or a large complex system is a very challenging task. It is critical for an enterprise to respond to challenges in a timely manner with quick decision making. This study addresses the problem of handling emergency management at regional level in an object oriented modelling environment developed by use of TopEase software. Emergency Operation Plan of the City of Hampton, Virginia, has been incorporated into TopEase for analysis. The methodology used in this study has been supported by a case study on critical infrastructure resiliency in Hampton Roads.

  19. Information flow in the DAMA project beyond database managers: information flow managers

    Science.gov (United States)

    Russell, Lucian; Wolfson, Ouri; Yu, Clement

    1996-12-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
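
The bill-of-materials explosion that propagates demand estimates down the supply chain can be sketched as a level-by-level expansion; the product structure and quantities below are invented for illustration:

```python
# Hypothetical bill-of-materials (BOM) explosion: retail demand for finished
# goods is propagated level by level to every supplier in the chain.
def explode_demand(demand, bom):
    """demand: {item: units}; bom: {item: {component: qty_per_unit}}.
    Returns the total derived demand for every item in the chain."""
    totals = dict(demand)
    frontier = dict(demand)
    while frontier:
        next_frontier = {}
        for item, units in frontier.items():
            for comp, qty in bom.get(item, {}).items():
                need = units * qty
                totals[comp] = totals.get(comp, 0) + need
                next_frontier[comp] = next_frontier.get(comp, 0) + need
        frontier = next_frontier
    return totals

bom = {"shirt": {"fabric_m2": 1.5, "buttons": 7},
       "fabric_m2": {"yarn_kg": 0.2}}
print(explode_demand({"shirt": 1000}, bom))
# 1000 shirts imply 1500 m2 of fabric, 7000 buttons and 300 kg of yarn.
```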

  20. A user's manual for the database management system of impact property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kong, W. S.; Jun, I.

    2003-06-01

    This manual describes the management and maintenance of the impact database system, which manages impact property test data. Building a database of the data produced by impact property tests increases the usefulness of the test results: the basic data can easily be retrieved from the database when a new experiment is being prepared, and better results can be produced by comparison with previous data. To develop the database, the application had to be analyzed and designed carefully; only then could the various requirements of customers be served with the best quality. The impact database system was developed as a web application using the JSP (Java Server Pages) tool

  1. Knowledge Management through a Fully Extensible, Schema Independent, XML Database

    National Research Council Canada - National Science Library

    Direen, H

    2001-01-01

    ... (databases in particular) is that the context must be predefined. In a field that is developing as fast as bioinformatics, it is as impossible to predefine all of the context as it is to predefine all of the data that is being...

  2. Knowledge Based Engineering for Spatial Database Management and Use

    Science.gov (United States)

    Peuquet, D. (Principal Investigator)

    1984-01-01

    The use of artificial intelligence techniques that are applicable to Geographic Information Systems (GIS) are examined. Questions involving the performance and modification to the database structure, the definition of spectra in quadtree structures and their use in search heuristics, extension of the knowledge base, and learning algorithm concepts are investigated.

  3. Application of cloud database in the management of clinical data of patients with skin diseases.

    Science.gov (United States)

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs for and applications of a cloud database in the daily practice of a dermatology department, a cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores on self-rating scales; the results were then input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database, and the disease status, quality of life, and prognosis were obtained by statistical calculation. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  4. Knowledge databases as instrument for a fast assessment in nuclear emergency management

    Energy Technology Data Exchange (ETDEWEB)

    Raskob, Wolfgang; Moehrle, Stella [Institute for Nuclear and Energy Technologies, Karlsruhe Institute of Technology (KIT), Hermann-von- Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany)

    2014-07-01

    The European project PREPARE (Innovative integrated tools and platforms for radiological emergency preparedness and post-accident response in Europe) aims to close gaps that have been identified in nuclear and radiological preparedness following the first evaluation of the Fukushima disaster. Among others, a work package was established to develop a so-called Analytical Platform exploring the scientific and operational means to improve information collection, information exchange and the evaluation of such types of disasters. As a methodological approach, knowledge databases and case-based reasoning (CBR) will be used. The application of knowledge gained from previous events, or the establishment of scenarios in advance to anticipate possible event developments, is common in many areas, but so far not in nuclear and radiological emergency management and preparedness. In PREPARE, knowledge databases and CBR are to be combined by establishing a database that contains historic events and scenarios, their propagation with time, and the emergency measures applied, and by using the CBR methodology to find solutions for events that are not part of the database. The objectives are to provide information about consequences and future developments after a nuclear or radiological event, and about emergency measures covering early, intermediate and late phase actions. CBR is a methodology for solving new problems by utilizing knowledge of previously experienced problem situations. In order to solve a current problem, similar problems are retrieved from a case base. Their solutions are taken and, if necessary, adapted to the current situation. The suggested solution is revised and, if confirmed, stored in the case base. Hence, a CBR system learns over time by storing new cases with their solutions. CBR has many advantages: solutions can be proposed quickly and do not have to be made from scratch, and solutions can be proposed in domains that are not understood completely
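
The retrieve/reuse/retain cycle of CBR described above can be sketched with a toy similarity measure over numeric event features; the feature names, values and proposed measures below are invented for illustration, not drawn from the PREPARE case base:

```python
# Toy case-based reasoning cycle: retain solved cases, retrieve the most
# similar stored case for a new query, and reuse its solution.
def similarity(a, b):
    """Inverse-distance similarity over the features shared by two cases."""
    keys = set(a) & set(b)
    dist = sum((a[k] - b[k]) ** 2 for k in keys) ** 0.5
    return 1.0 / (1.0 + dist)

class CaseBase:
    def __init__(self):
        self.cases = []  # list of (features, solution) pairs

    def retain(self, features, solution):
        # The system "learns" simply by storing each confirmed case.
        self.cases.append((features, solution))

    def retrieve(self, query):
        """Return the solution of the most similar stored case."""
        return max(self.cases, key=lambda c: similarity(c[0], query))[1]

cb = CaseBase()
cb.retain({"release_TBq": 500, "wind_mps": 3}, "evacuate 5 km sector")
cb.retain({"release_TBq": 5, "wind_mps": 2}, "shelter in place")
# A new event close to the first historic case reuses its measure.
print(cb.retrieve({"release_TBq": 450, "wind_mps": 4}))
```

In a real system the retrieved solution would still be adapted and revised by experts before being retained, as the abstract notes.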

  5. Data management and database framework for the MICE experiment

    Science.gov (United States)

    Martyniak, J.; Nebrensky, J. J.; Rajaram, D.; MICE Collaboration

    2017-10-01

    The international Muon Ionization Cooling Experiment (MICE) currently operating at the Rutherford Appleton Laboratory in the UK, is designed to demonstrate the principle of muon ionization cooling for application to a future Neutrino Factory or Muon Collider. We present the status of the framework for the movement and curation of both raw and reconstructed data. A raw data-mover has been designed to safely upload data files onto permanent tape storage as soon as they have been written out. The process has been automated, and checks have been built in to ensure the integrity of data at every stage of the transfer. The data processing framework has been recently redesigned in order to provide fast turnaround of reconstructed data for analysis. The automated reconstruction is performed on a dedicated machine in the MICE control room and any reprocessing is done at Tier-2 Grid sites. In conjunction with this redesign, a new reconstructed-data-mover has been designed and implemented. We also review the implementation of a robust database system that has been designed for MICE. The processing of data, whether raw or Monte Carlo, requires accurate knowledge of the experimental conditions. MICE has several complex elements ranging from beamline magnets to particle identification detectors to superconducting magnets. A Configuration Database, which contains information about the experimental conditions (magnet currents, absorber material, detector calibrations, etc.) at any given time has been developed to ensure accurate and reproducible simulation and reconstruction. A fully replicated, hot-standby database system has been implemented with a firewall-protected read-write master running in the control room, and a read-only slave running at a different location. The actual database is hidden from end users by a Web Service layer, which provides platform and programming language-independent access to the data.
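
The core idea of a Configuration Database, returning the experimental conditions in force at any given time, can be sketched as a validity-interval lookup; the schema and example values (absorber material, magnet current) are illustrative stand-ins, not MICE's actual API:

```python
# Time-keyed configuration lookup: each entry is valid from its start time
# until the next entry begins, so reconstruction of any run can query the
# conditions that applied at that moment.
import bisect
from datetime import datetime

class ConfigDB:
    def __init__(self):
        self._times = []    # sorted validity start times
        self._configs = []  # parallel list of configuration dicts

    def insert(self, valid_from, config):
        i = bisect.bisect(self._times, valid_from)
        self._times.insert(i, valid_from)
        self._configs.insert(i, config)

    def get(self, at):
        """Configuration in force at time 'at' (latest entry not after it)."""
        i = bisect.bisect_right(self._times, at) - 1
        if i < 0:
            raise KeyError("no configuration valid at that time")
        return self._configs[i]

db = ConfigDB()
db.insert(datetime(2017, 3, 1), {"absorber": "LiH", "magnet_A": 210})
db.insert(datetime(2017, 6, 1), {"absorber": "LH2", "magnet_A": 185})
print(db.get(datetime(2017, 5, 10))["absorber"])  # conditions during May run
```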

  6. A survey of the use of database management systems in accelerator projects

    OpenAIRE

    Poole, John; Strubin, Pierre M

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accele...

  7. A Conceptual Model for Delineating Land Management Units (LMUs Using Geographical Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Deniz Gerçek

    2017-06-01

    Full Text Available Land management and planning is crucial for present and future use of land and the sustainability of land resources. Physical, biological and cultural characteristics of land can be used to define Land Management Units (LMUs that aid in decision making for managing land and communicating information between different research and application domains. This study aims to describe the classification of ecologically relevant land units that are suitable for land management, planning and conservation purposes. Relying on the idea of strong correlation between landform and potential landcover, a conceptual model for creating Land Management Units (LMUs from topographic data and biophysical information is presented. The proposed method employs a multi-level object-based classification of Digital Terrain Models (DTMs to derive landform units. The sensitivity of landform units to changes in segmentation scale is examined, and the outcome of the landform classification is evaluated. Landform classes are then aggregated with landcover information to produce ecologically relevant landform/landcover assemblages. These conceptual units that constitute a framework of connected entities are finally enriched given available socio-economic information e.g., land use, ownership, protection status, etc. to generate LMUs. LMUs attached to a geographic database enable the retrieval of information at various levels to support decision making for land management at various scales. LMUs that are created present a basis for conservation and management in a biodiverse area in the Black Sea region of Turkey.
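
The aggregation step described above, combining landform units with landcover into assemblages and enriching them with socio-economic attributes, might be sketched as follows; the class names, areas and attributes are illustrative only:

```python
# Aggregate classified segments into landform/landcover assemblages that
# serve as Land Management Units (LMUs), accumulating area and carrying
# forward a protection-status attribute.
def build_lmus(segments):
    """segments: list of dicts with 'landform', 'landcover', 'area_ha'
    and an optional 'protection' flag. Returns assemblage -> attributes."""
    lmus = {}
    for seg in segments:
        assemblage = f"{seg['landform']}/{seg['landcover']}"
        unit = lmus.setdefault(assemblage, {"area_ha": 0.0, "protected": False})
        unit["area_ha"] += seg["area_ha"]
        unit["protected"] |= seg.get("protection", False)
    return lmus

segments = [
    {"landform": "ridge", "landcover": "forest", "area_ha": 12.0, "protection": True},
    {"landform": "ridge", "landcover": "forest", "area_ha": 8.5},
    {"landform": "valley", "landcover": "cropland", "area_ha": 30.0},
]
print(build_lmus(segments))
```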

  8. Application based on ArcObject inquiry and Google maps demonstration to real estate database

    Science.gov (United States)

    Hwang, JinTsong

    2007-06-01

    The real estate industry in Taiwan has flourished in recent years. Acquiring varied and abundant information on properties for sale is a common goal of consumers and brokerages alike, so it is important to gather all pertinent information before visiting a property. This not only benefits the real estate agent, who can solidify a buyer's interest by providing the seller's information in full, but also saves time and manpower should something be out of place. Most brokerage sites use the Internet as a publicity medium; however, their content is limited to the specific properties themselves, and the query functions mostly support search by condition only. This paper proposes a website query interface that, in addition to query by condition, offers zone queries by spatial analysis for non-GIS users, developed as a user-friendly interface with ArcObject in VB6. The query results are shown on a web page with embedded Google Maps and UrMap API functions. In addition, the results are presented in multimedia form, including a hyperlink to Google Earth showing the property's surroundings, virtual-reality scenes of the house, panoramas of the building interior, and so on. The website thus provides an additional spatial solution for querying, and presents abundant real estate information in both two-dimensional and three-dimensional views.


  9. An Investigation on the Correlation of Learner Styles and Learning Objects Characteristics in a Proposed Learning Objects Management Model (LOMM)

    Science.gov (United States)

    Wanapu, Supachanun; Fung, Chun Che; Kerdprasop, Nittaya; Chamnongsri, Nisachol; Niwattanakul, Suphakit

    2016-01-01

    The issues of accessibility, management, storage and organization of Learning Objects (LOs) in education systems are a high priority of the Thai Government. Incorporating personalized learning or learning styles in a learning object management system to improve the accessibility of LOs has been addressed continuously in the Thai education system.…

  10. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    Science.gov (United States)

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  11. Report on the first Twente Data Management Workshop on XML Databases and Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Mihajlovic, V.

    2004-01-01

    The Database Group of the University of Twente initiated a new series of workshops called Twente Data Management workshops (TDM), starting with one on XML Databases and Information Retrieval which took place on 21 June 2004 at the University of Twente. We have set ourselves two goals for the

  12. A plant resource and experiment management system based on the Golm Plant Database as a basic tool for omics research

    Directory of Open Access Journals (Sweden)

    Selbig Joachim

    2008-05-01

    Full Text Available Abstract Background For omics experiments, detailed characterisation of experimental material with respect to its genetic features, its cultivation history and its treatment history is a requirement for analyses by bioinformatics tools and for publication needs. Furthermore, the meta-analysis of several experiments in systems biology based approaches makes it necessary to store this information in a standardised manner, preferentially in relational databases. In the Golm Plant Database System, we devised a data management system based on a classical Laboratory Information Management System combined with web-based user interfaces for data entry and retrieval to collect this information in an academic environment. Results The database system contains modules representing the genetic features of the germplasm, the experimental conditions and the sampling details. In the germplasm module, genetically identical lines of biological material are generated by defined workflows, starting with the import workflow, followed by further workflows like genetic modification (transformation, vegetative or sexual reproduction. The latter workflows link lines and thus create pedigrees. For experiments, plant objects are generated from plant lines and united in so-called cultures, to which the cultivation conditions are linked. Materials and methods for each cultivation step are stored in a separate ACCESS database of the plant cultivation unit. For all cultures, and thus every plant object, each cultivation site and the culture's arrival time at a site are logged by a barcode-scanner based system. Thus, for each plant object, all site-related parameters, e.g. automatically logged climate data, are available. These life history data and genetic information for the plant objects are linked to analytical results by the sampling module, which links sample components to plant object identifiers. This workflow uses controlled vocabulary for organs and treatments. Unique
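
The pedigree-building workflows described above, in which import, transformation and reproduction workflows create new plant lines from parent lines, can be sketched as a parent-pointer registry; the line names and workflow labels below are illustrative:

```python
# Each derived line records its parent, so the chain of parent links
# reconstructs the pedigree back to the originally imported line.
class LineRegistry:
    def __init__(self):
        self.parent = {}   # line name -> parent line name (None for imports)

    def import_line(self, name):
        self.parent[name] = None
        return name

    def derive(self, parent, name, workflow):
        """Create a child line via a named workflow (e.g. 'transformation')."""
        assert parent in self.parent, "unknown parent line"
        self.parent[name] = parent
        return name

    def pedigree(self, name):
        """Trace a line back to its imported ancestor."""
        chain = [name]
        while self.parent[chain[-1]] is not None:
            chain.append(self.parent[chain[-1]])
        return chain

reg = LineRegistry()
reg.import_line("Col-0")
reg.derive("Col-0", "Col-0_T1", workflow="transformation")
reg.derive("Col-0_T1", "Col-0_T1_S2", workflow="sexual reproduction")
print(reg.pedigree("Col-0_T1_S2"))
```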

  13. Multi-Objective Optimization of Managed Aquifer Recharge.

    Science.gov (United States)

    Fatkhutdinov, Aybulat; Stefan, Catalin

    2018-04-27

    This study demonstrates the utilization of a multi-objective hybrid global/local optimization algorithm for solving managed aquifer recharge (MAR) design problems, in which the decision variables included spatial arrangement of water injection and abstraction wells and time-variant rates of pumping and injection. The objective of the optimization was to maximize the efficiency of the MAR scheme, which includes both quantitative and qualitative aspects. The case study used to demonstrate the capabilities of the proposed approach is based on a published report on designing a real MAR site with defined aquifer properties, chemical groundwater characteristics as well as quality and volumes of injected water. The demonstration problems include steady-state and transient scenarios. The steady-state scenario demonstrates optimization of spatial arrangement of multiple injection and recovery wells, whereas the transient scenario was developed with the purpose of finding optimal regimes of water injection and recovery at a single location. Both problems were defined as multi-objective problems. The scenarios were simulated by applying coupled numerical groundwater flow and solute transport models: MODFLOW-2005 and MT3D-USGS. The applied optimization method was a combination of global - the Non-Dominated Sorting Genetic Algorithm (NSGA-2), and local - the Nelder-Mead Downhill Simplex search algorithms. The analysis of the resulting Pareto optimal solutions led to the discovery of valuable patterns and dependencies between the decision variables, model properties and problem objectives. Additionally, the performance of the traditional global and the hybrid optimization schemes were compared. This article is protected by copyright. All rights reserved.
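
Pareto-front extraction, the core of the non-dominated sorting used by NSGA-II, can be sketched briefly; the objective pairs here (maximize recovered volume, minimize salinity, encoded as its negative so both objectives are maximized) are invented stand-ins for the MAR quantity/quality trade-off:

```python
# Extract the set of non-dominated (Pareto-optimal) designs from a list of
# candidate solutions, each scored on two objectives to be maximized.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# (recovered_m3, -salinity): hypothetical well-arrangement designs.
designs = [(900, -0.8), (1200, -1.5), (700, -0.4), (1100, -1.6), (600, -0.9)]
print(pareto_front(designs))
# (1100, -1.6) and (600, -0.9) are dominated and drop out of the front.
```

NSGA-II repeats this ranking over successive fronts and adds a crowding-distance measure to preserve diversity; this sketch shows only the dominance test itself.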

  14. Design and implementation of component reliability database management system for NPP

    International Nuclear Information System (INIS)

    Kim, S. H.; Jung, J. K.; Choi, S. Y.; Lee, Y. H.; Han, S. H.

    1999-01-01

    KAERI is constructing a component reliability database for Korean nuclear power plants. This paper describes the development of the data management tool that operates on the component reliability database. The tool runs in an intranet environment and is used to analyze failure modes and failure severities in order to compute component failure rates. Additional modules are now being developed to manage operation history and test history, together with algorithms for calculating component failure history and reliability

  15. QUASI-STELLAR OBJECT SELECTION ALGORITHM USING TIME VARIABILITY AND MACHINE LEARNING: SELECTION OF 1620 QUASI-STELLAR OBJECT CANDIDATES FROM MACHO LARGE MAGELLANIC CLOUD DATABASE

    International Nuclear Information System (INIS)

    Kim, Dae-Won; Protopapas, Pavlos; Alcock, Charles; Trichas, Markos; Byun, Yong-Ik; Khardon, Roni

    2011-01-01

    We present a new quasi-stellar object (QSO) selection algorithm using a Support Vector Machine, a supervised classification method, on a set of extracted time series features including period, amplitude, color, and autocorrelation value. We train a model that separates QSOs from variable stars, non-variable stars, and microlensing events using 58 known QSOs, 1629 variable stars, and 4288 non-variables in the MAssive Compact Halo Object (MACHO) database as a training set. To estimate the efficiency and the accuracy of the model, we perform a cross-validation test using the training set. The test shows that the model correctly identifies ∼80% of known QSOs with a 25% false-positive rate. The majority of the false positives are Be stars. We applied the trained model to the MACHO Large Magellanic Cloud (LMC) data set, which consists of 40 million light curves, and found 1620 QSO candidates. During the selection none of the 33,242 known MACHO variables were misclassified as QSO candidates. In order to estimate the true false-positive rate, we crossmatched the candidates with astronomical catalogs including the Spitzer Surveying the Agents of a Galaxy's Evolution LMC catalog and a few X-ray catalogs. The results further suggest that the majority of the candidates, more than 70%, are QSOs.
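
The selection pipeline above (extract time-series features, train a supervised classifier, estimate accuracy by cross-validation) can be sketched in miniature; a nearest-centroid classifier stands in for the Support Vector Machine, and the two features and the labels are synthetic:

```python
# Miniature supervised-classification pipeline with k-fold cross-validation.
# A nearest-centroid rule replaces the SVM purely for self-containment.
def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def train(X, y):
    classes = sorted(set(y))
    return {c: centroid([x for x, lab in zip(X, y) if lab == c])
            for c in classes}

def predict(model, x):
    def sq_dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(model, key=lambda c: sq_dist(model[c], x))

def cross_validate(X, y, k=2):
    """Accuracy over k interleaved folds: train on k-1, test on the rest."""
    correct = 0
    for f in range(k):
        train_idx = [i for i in range(len(X)) if i % k != f]
        test_idx = [i for i in range(len(X)) if i % k == f]
        model = train([X[i] for i in train_idx], [y[i] for i in train_idx])
        correct += sum(predict(model, X[i]) == y[i] for i in test_idx)
    return correct / len(X)

# Synthetic (amplitude, autocorrelation) features for QSOs vs variable stars.
X = [(0.9, 0.8), (1.0, 0.9), (0.8, 0.85), (0.2, 0.1), (0.3, 0.2), (0.1, 0.15)]
y = ["qso", "qso", "qso", "var", "var", "var"]
print(cross_validate(X, y))
```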

  16. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T W; Sutton, M

    2011-09-19

    , meaning that they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).

  17. Generic Natural Systems Evaluation - Thermodynamic Database Development and Data Management

    International Nuclear Information System (INIS)

    Wolery, T.W.; Sutton, M.

    2011-01-01

    they use a large body of thermodynamic data, generally from a supporting database file, to sort out the various important reactions from a wide spectrum of possibilities, given specified inputs. Usually codes of this kind are used to construct models of initial aqueous solutions that represent initial conditions for some process, although sometimes these calculations also represent a desired end point. Such a calculation might be used to determine the major chemical species of a dissolved component, the solubility of a mineral or mineral-like solid, or to quantify deviation from equilibrium in the form of saturation indices. Reactive transport codes such as TOUGHREACT and NUFT generally require the user to determine which chemical species and reactions are important, and to provide the requisite set of information including thermodynamic data in an input file. Usually this information is abstracted from the output of a geochemical modeling code and its supporting thermodynamic data file. The Yucca Mountain Project (YMP) developed two qualified thermodynamic databases to model geochemical processes, including ones involving repository components such as spent fuel. The first of the two (BSC, 2007a) was for systems containing dilute aqueous solutions only, the other (BSC, 2007b) for systems involving concentrated aqueous solutions and incorporating a model for such based on Pitzer's (1991) equations. A 25 C-only database with similarities to the latter was also developed for the Waste Isolation Pilot Plant (WIPP, cf. Xiong, 2005). The NAGRA/PSI database (Hummel et al., 2002) was developed to support repository studies in Europe. The YMP databases are often used in non-repository studies, including studies of geothermal systems (e.g., Wolery and Carroll, 2010) and CO2 sequestration (e.g., Aines et al., 2011).

  18. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    Science.gov (United States)

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  19. Development of the severe accident risk information database management system SARD

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies

  20. Development of the severe accident risk information database management system SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies.

  1. Discussions about acceptance of the free software for management and creation of referencial database for papers

    Directory of Open Access Journals (Sweden)

    Flavio Ribeiro Córdula

    2016-03-01

    Full Text Available Objective. This research aimed to determine, through the Technology Acceptance Model (TAM), the degree of acceptance of the developed software, which supports the construction and management of databases of scientific articles and thereby assists the dissemination and retrieval of scientific production stored in digital media. Method. The research is characterized as quantitative, since the TAM, which guided this study, is essentially quantitative. A questionnaire developed according to TAM guidelines was used as the data collection instrument. Results. The software, despite needing the fixes and improvements inherent to this type of tool, obtained a relevant degree of acceptance from the sample studied. Considerations. It should also be noted that although this research was directed at scholars in the field of information science, the idea that justified the creation of the software might contribute to the development of science in any field of knowledge, by optimizing the results that a search conducted in a specialized database can provide.

  2. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    Science.gov (United States)

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model based on personalized medicine, made possible by rapid progress in genome sequencing technology and the cross-application of bioinformatics and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis, and other core issues. A cancer clinical database is important for promoting the development of precision medicine, so close attention must be paid to its construction and management. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank, and a medical imaging database. To ensure good data quality, the design and management of the database follow a strict standard operating procedure (SOP) model. Data sharing is an important way to advance medical research in the era of medical big data, so the construction and management of clinical databases must also be strengthened and innovated.

  3. Computer application for database management and networking of service radio physics

    International Nuclear Information System (INIS)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-01-01

    Databases in quality control prove to be a powerful tool for recording, management, and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the center's computer network. A computer acting as the server provides the database to the treatment units, where quality control measurements and incidents are recorded daily. Common problems such as broken shortcuts after data migration, duplicate entries, and data loss caused by faulty network connections were addressed by centrally managing connections and database access, which eases maintenance and use for all service personnel.

  4. Role of Database Management Systems in Selected Engineering Institutions of Andhra Pradesh: An Analytical Survey

    Directory of Open Access Journals (Sweden)

    Kutty Kumar

    2016-06-01

    Full Text Available This paper analyzes the use of database management systems from the perspective of librarians working in engineering institutions in Andhra Pradesh. Ninety-eight librarians from one hundred thirty engineering institutions participated in the study. The paper reveals that training by computer suppliers and software packages is the main mode by which librarians acquire DBMS skills; three-fourths of the librarians are postgraduate degree holders. Most colleges use database applications for automation purposes and content value. Electrical problems and untrained staff appear to be the major constraints respondents face in managing library databases.

  5. Trade-offs between objectives for ecosystem management of fisheries

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Brander, Keith; Ravn-Jonsen, Lars

    2015-01-01

    The strategic objectives for fisheries, enshrined in international conventions, are to maintain or restore stocks to produce maximum sustainable yield (MSY) and to implement the ecosystem approach, which requires that interactions between species be taken into account and conservation constraints be respected. [...] approach to fisheries management is required. We apply a conceptual size- and trait-based model to clarify and resolve these issues, by determining the fishing pattern that maximizes the total yield of an entire fish community, in terms of catch weight or economic rent, under acceptable conservation constraints. Our results indicate that the eradication of large, predatory fish species results in a potential maximum catch at least twice as high as if conservation constraints are imposed. However, such a large catch could only be achieved at a cost of foregone rent; maximum rent extracts less than half...

  6. A database to manage flood risk in Catalonia

    Science.gov (United States)

    Echeverria, S.; Toldrà, R.; Verdaguer, I.

    2009-09-01

    We call priority action spots those local sites where heavy rain, increased river flow, sea storms and other flooding phenomena can cause human casualties or severe damage to property. Some examples are campsites, car parks, roads, chemical factories… In order to keep the risk at these spots to a minimum, both a prevention programme and an emergency response programme are required. The flood emergency plan of Catalonia (INUNCAT), prepared in 2005, already included a listing of priority action spots compiled by the Catalan Water Agency (ACA), which was elaborated taking into account past experience, hydraulic studies and information available from several knowledgeable sources. However, since land use evolves with time, this listing of priority action spots has become outdated and incomplete. A new database is being built. Not only does this new database update and expand the previous listing, but it also adds to each entry information regarding prevention measures and emergency response: which spots are the most hazardous, under which weather conditions problems arise, which ones should have their access closed as soon as these conditions are forecast or actually occur, which ones should be evacuated, who is in charge of the preventive actions or emergency response, and so on. Carrying out this programme has to be done with the help and collaboration of all the organizations involved, foremost the local authorities in the areas at risk. Achieving this goal requires a suitable geographical information system that can be easily used by all actors involved in the project. The best option has turned out to be the Spatial Data Infrastructure of Catalonia (IDEC), a platform for sharing spatial data on the Internet involving the Generalitat de Catalunya, Localret (a consortium of local authorities that promotes information technology) and other institutions.

  7. Ageing management database development for PWR NPP steam generator

    International Nuclear Information System (INIS)

    Liu Hongyun; Xu Liangjun; Xiong Changhuai; Wang Xianyuan

    2005-01-01

    The steam generator (SG) is one of the key safety-significant pieces of equipment of an NPP and is covered by the NPP aging management program. The Steam Generator Aging Management Database (SGAMDB) is being developed to provide the information needed for SG aging management. RINPO is developing the SGAMDB for domestic NPPs. The system contains information and data about SG design, manufacture, operation and maintenance, including NPP fundamental data, SG design data, SG aging mechanisms, SG operation data, SG in-service inspection (ISI) data, SG maintenance data and an SG evaluation interface. The system runs on the intranet of the Qinshan-1 NPP in B/S (browser/server) mode. It provides information inquiry and fundamental analysis for the NPP SG aging team and SG aging researchers. In addition, it provides the information and data needed for SG aging analysis and evaluation, such as all pressure test processes and tube flaws, and collects the analysis results. (authors)

  8. A relational database for personnel radiation exposure management

    International Nuclear Information System (INIS)

    David, W.; Miller, P.D.

    1993-01-01

    In-house utility personnel developed a relational database for a personnel radiation exposure management computer system over a 2 1/2 year period. The Personnel Radiation Exposure Management (PREM) System was designed to meet current Nuclear Regulatory Commission (NRC) requirements related to radiological access control, Radiation Work Permit (RWP) management, automated personnel dosimetry reporting, ALARA planning and repetitive job history dose archiving. The system has been operational for the past 18 months, a period which includes a full refueling outage at Clinton Power Station. The Radiation Protection Department designed PREM to establish a software platform for implementing future revisions to 10CFR20 in 1993. Worker acceptance of the system has been excellent. Regulatory officials have given the system high marks as a radiological tool because of its ability to track an entire job from start to finish.

  9. PACSY, a relational database management system for protein structure and chemical shift analysis.

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
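
The design pattern described in this record, tables linked to one another by key identification numbers and queried through an RDBMS server, can be sketched as follows. The table and column names are hypothetical (PACSY's actual schema is not given in the record), and SQLite stands in here for a MySQL or PostgreSQL server:

```python
import sqlite3

# SQLite stands in for an RDBMS server such as MySQL or PostgreSQL.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two hypothetical table types linked by a shared key identification number,
# mimicking the "tables linked to one another for coherence by key IDs" design.
cur.executescript("""
CREATE TABLE coordinates (key_id INTEGER, atom_name TEXT, x REAL, y REAL, z REAL);
CREATE TABLE chemical_shifts (key_id INTEGER, atom_name TEXT, shift_ppm REAL);
""")
cur.execute("INSERT INTO coordinates VALUES (1, 'CA', 10.1, 4.2, -3.3)")
cur.execute("INSERT INTO chemical_shifts VALUES (1, 'CA', 58.4)")

# A combined query across both sources, joined on the key identification number,
# returning coordinates and chemical shifts for the same atom in one row.
cur.execute("""
SELECT c.atom_name, c.x, c.y, c.z, s.shift_ppm
FROM coordinates AS c
JOIN chemical_shifts AS s
  ON s.key_id = c.key_id AND s.atom_name = c.atom_name
""")
rows = cur.fetchall()
print(rows)  # [('CA', 10.1, 4.2, -3.3, 58.4)]
```

The join on `key_id` is what lets users "search for combinations of information from different database sources" in a single query.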

  10. PACSY, a relational database management system for protein structure and chemical shift analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States); Yu, Wookyung [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Kim, Suhkmann [Pusan National University, Department of Chemistry and Chemistry Institute for Functional Materials (Korea, Republic of); Chang, Iksoo [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Lee, Weontae, E-mail: wlee@spin.yonsei.ac.kr [Yonsei University, Structural Biochemistry and Molecular Biophysics Laboratory, Department of Biochemistry (Korea, Republic of); Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States)

    2012-10-15

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  11. PACSY, a relational database management system for protein structure and chemical shift analysis

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636

  12. PACSY, a relational database management system for protein structure and chemical shift analysis

    International Nuclear Information System (INIS)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L.

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  13. Databases in the documentation management for big industrial projects

    International Nuclear Information System (INIS)

    Cauchet, A.; Chevillard, F.; Parisot, Y.; Tirefort, C.

    1990-05-01

    The documentation management of a big industrial project involves continuous updating of information, in the study and realization phases as well as in the operation phase. Organizing the technical documentation for big industrial projects requires complex information systems. The first part of this paper presents methods appropriate for analyzing documentation management procedures; the second part presents the tools whose combination provides a documentation system for the user. The case of the documentation centres for the La Hague reprocessing plant is described

  14. Valuing hydrological alteration in Multi-Objective reservoir management

    Science.gov (United States)

    Bizzi, S.; Pianosi, F.; Soncini-Sessa, R.

    2012-04-01

    Water management through dams and reservoirs is necessary worldwide to support key human activities, ranging from hydropower production to water allocation for agriculture and flood risk mitigation. Advances in multi-objective (MO) optimization techniques and ever-growing computing power make it possible to design reservoir operating policies that represent Pareto-optimal tradeoffs between the multiple interests analysed. While these advances are likely to enhance the performance of commonly targeted objectives (such as hydropower production or water supply), they risk strongly penalizing all the interests not directly (i.e. mathematically) optimized within the MO algorithm. Alteration of the hydrological regime, although it is a well-established cause of ecological degradation whose evaluation and rehabilitation are commonly required by recent legislation (such as the Water Framework Directive in Europe), is rarely embedded as an objective in MO planning of optimal releases from reservoirs. Moreover, even when it is explicitly considered, the criteria adopted for its evaluation are doubted and not commonly trusted, undermining the possibility of real implementation of environmentally friendly policies. The main challenges in defining and assessing hydrological alteration are: how to define a reference state (referencing); how to define criteria upon which to build mathematical indicators of alteration (measuring); and finally how to aggregate the indicators into a single evaluation index that can be embedded in an MO optimization problem (valuing).
This paper aims to address these issues by: i) discussing the benefits and constraints of different approaches to referencing, measuring and valuing hydrological alteration; ii) testing two alternative indices of hydrological alteration in the context of MO problems, one based on the established framework of Indicators of Hydrologic Alteration (IHA, Richter et al., 1996), and a novel one satisfying the
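
The referencing/measuring/valuing chain described in this record can be sketched as follows. This is a deliberately simplified illustration with hypothetical flow data and a naive aggregation rule, not the IHA framework or the paper's novel index:

```python
from statistics import mean

def alteration_index(reference, altered, stats=(min, mean, max)):
    """Aggregate relative deviations of flow statistics into a single index.

    reference, altered: daily flow series in the same units; reference
    statistics are assumed nonzero. Returns 0 for identical regimes;
    larger values mean greater hydrological alteration.
    """
    deviations = []
    for stat in stats:
        ref, alt = stat(reference), stat(altered)
        # Measuring: relative deviation of one indicator from its reference value.
        deviations.append(abs(alt - ref) / abs(ref))
    # Valuing: aggregate the indicators into one evaluation index.
    return mean(deviations)

reference_flow = [10, 12, 30, 55, 20, 11, 9]   # hypothetical natural regime (referencing)
regulated_flow = [18, 18, 18, 20, 18, 18, 18]  # hypothetical dam-regulated regime

print(round(alteration_index(reference_flow, regulated_flow), 3))
```

A scalar index like this is what can be embedded directly as one objective in an MO optimization problem alongside hydropower or water-supply objectives.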

  15. Objectivity

    CERN Document Server

    Daston, Lorraine

    2010-01-01

    Objectivity has a history, and it is full of surprises. In Objectivity, Lorraine Daston and Peter Galison chart the emergence of objectivity in the mid-nineteenth-century sciences--and show how the concept differs from its alternatives, truth-to-nature and trained judgment. This is a story of lofty epistemic ideals fused with workaday practices in the making of scientific images. From the eighteenth through the early twenty-first centuries, the images that reveal the deepest commitments of the empirical sciences--from anatomy to crystallography--are those featured in scientific atlases, the compendia that teach practitioners what is worth looking at and how to look at it. Galison and Daston use atlas images to uncover a hidden history of scientific objectivity and its rivals. Whether an atlas maker idealizes an image to capture the essentials in the name of truth-to-nature or refuses to erase even the most incidental detail in the name of objectivity or highlights patterns in the name of trained judgment is a...

  16. Developing a database management system to support birth defects surveillance in Florida.

    Science.gov (United States)

    Salemi, Jason L; Hauser, Kimberlea W; Tanner, Jean Paul; Sampat, Diana; Correia, Jane A; Watkins, Sharon M; Kirby, Russell S

    2010-01-01

    The value of any public health surveillance program is derived from the ways in which data are managed and used to improve the public's health. Although birth defects surveillance programs vary in their case volume, budgets, staff, and objectives, the capacity to operate efficiently and maximize resources remains critical to long-term survival. The development of a fully-integrated relational database management system (DBMS) can enrich a surveillance program's data and improve efficiency. To build upon the Florida Birth Defects Registry--a statewide registry relying solely on linkage of administrative datasets and unconfirmed diagnosis codes--the Florida Department of Health provided funding to the University of South Florida to develop and pilot an enhanced surveillance system in targeted areas with a more comprehensive approach to case identification and diagnosis confirmation. To manage operational and administrative complexities, a DBMS was developed, capable of managing transmission of project data from multiple sources, tracking abstractor time during record reviews, offering tools for defect coding and case classification, and providing reports to DBMS users. Since its inception, the DBMS has been used as part of our surveillance projects to guide the receipt of over 200 case lists and review of 12,924 fetuses and infants (with associated maternal records) suspected of having selected birth defects in over 90 birthing and transfer facilities in Florida. The DBMS has provided both anticipated and unexpected benefits. Automation of the processes for managing incoming case lists has reduced clerical workload considerably, while improving accuracy of working lists for field abstraction. Data quality has improved through more effective use of internal edits and comparisons with values for other data elements, while simultaneously increasing abstractor efficiency in completion of case abstraction. We anticipate continual enhancement to the DBMS in the future

  17. Data management and database structure at the ARS Culture Collection

    Science.gov (United States)

    The organization and management of collection data for the 96,000 strains held in the ARS Culture Collection has been an ongoing process. Originally, the records for the four separate collections were maintained by individual curators in notebooks and/or card files and subsequently on the National C...

  18. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    Science.gov (United States)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability, afford easy data storage and sharing, and provide for a more complete risk assessment that combines different analyses while avoiding any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The key requirement is that VERDI serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure, and the possibilities it offers for data organization, are shown here through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.

  19. Database Quality and Access Issues Relevant to Research Using Anesthesia Information Management System Data.

    Science.gov (United States)

    Epstein, Richard H; Dexter, Franklin

    2018-07-01

    For this special article, we reviewed the computer code used to extract the data and the text of all 47 studies published between January 2006 and August 2017 using anesthesia information management system (AIMS) data from Thomas Jefferson University Hospital (TJUH). Data from this institution were used in the largest number (P = .0007) of papers describing the use of AIMS published in this time frame. The AIMS was replaced in April 2017, making this a finite sample. The objective of the current article was to identify factors that made TJUH successful in publishing anesthesia informatics studies. We examined the structured query language used for each study to determine the extent to which databases outside of the AIMS were used. We examined data quality from the perspectives of completeness, correctness, concordance, plausibility, and currency. Our results were that most studies could not have been completed without external database sources (36/47, 76.6%; P = .0003 compared with 50%). The operating room management system was linked to the AIMS and was used significantly more frequently (26/36, 72%) than other external sources. Access to these external data sources was provided, allowing exploration of data quality. The TJUH AIMS used high-resolution timestamps (to the nearest 3 milliseconds) and created audit tables to track changes to clinical documentation. Automatic data were recorded at 1-minute intervals and were not editable; data cleaning occurred during analysis. Few paired events with an expected order were out of sequence. Although most data elements were of high quality, there were notable exceptions, such as frequent missing values for estimated blood loss, height, and weight. Some values were duplicated with different units, and others were stored in varying locations. Our conclusions are that linking the TJUH AIMS to the operating room management system was a critical step in enabling publication of multiple studies using AIMS data.
Access to this and
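
The quality dimensions this record discusses (completeness, plausibility, and the ordering of paired events) lend themselves to automated checks during analysis. A minimal sketch, with hypothetical field names and illustrative thresholds rather than the actual TJUH schema:

```python
from datetime import datetime

def check_record(rec):
    """Flag common quality problems in a hypothetical AIMS case record."""
    problems = []
    # Completeness: fields the article notes were frequently missing.
    for field in ("estimated_blood_loss_ml", "height_cm", "weight_kg"):
        if rec.get(field) is None:
            problems.append(f"missing {field}")
    # Plausibility: crude range check (illustrative threshold only).
    h = rec.get("height_cm")
    if h is not None and not (30 <= h <= 250):
        problems.append("implausible height_cm")
    # Concordance: paired events with an expected temporal order.
    start, end = rec.get("anesthesia_start"), rec.get("anesthesia_end")
    if start and end and datetime.fromisoformat(end) < datetime.fromisoformat(start):
        problems.append("paired events out of sequence")
    return problems

rec = {
    "estimated_blood_loss_ml": None,   # frequently missing in practice
    "height_cm": 172,
    "weight_kg": 80,
    "anesthesia_start": "2017-03-01T08:05:00",
    "anesthesia_end": "2017-03-01T07:55:00",  # ends before it starts
}
print(check_record(rec))
```

Running checks like these over an extract is one way data cleaning can "occur during analysis" rather than at capture time, as the article describes.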

  20. Noise data management using commercially available data-base software

    International Nuclear Information System (INIS)

    Damiano, B.; Thie, J.A.

    1988-01-01

    A database has been created using commercially available software to manage the data collected by an automated noise data acquisition system operated by Oak Ridge National Laboratory at the Fast Flux Test Facility (FFTF). The database was created to store, organize, and retrieve selected features of the nuclear and process signal noise data, because the large volume of data collected by the automated system makes manual data handling and interpretation based on visual examination of noise signatures impractical. Compared with manual data handling, the database allows the automatically collected data to be used more fully and effectively. The FFTF noise database uses the Oracle relational database management system implemented on a desktop personal computer

  1. Use of an INGRES database to implement the beam parameter management at GANIL

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, P.; Lecorche, E.; Lermine, P.; Maugeais, C.; Leboucher, Ch.; Moscatello, M.H.; Pain, P.

    1995-12-31

    Since the beginning of operation under the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been used more and more widely. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned and has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. The database implementation is then shown, including how the physical aspects of Ganil tuning have been integrated into such an environment. (author). 2 refs.

  2. Use of an INGRES database to implement the beam parameter management at GANIL

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, P; Lecorche, E; Lermine, P; Maugeais, C; Leboucher, Ch; Moscatello, M H; Pain, P

    1996-12-31

    Since the beginning of operation under the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been used more and more widely. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned and has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. The database implementation is then shown, including how the physical aspects of Ganil tuning have been integrated into such an environment. (author). 2 refs.

  3. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program, written in the Clipper language for personal computers, was developed to perform obstetric calculations using data from ultrasonography. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, including biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Database Management System was tested for its performance and proved very useful in patient management, with convenient data filing, easy retrieval of previous reports, prompt and accurate estimation of fetal growth and skeletal anomalies, and production of equations and growth curves for pregnant women
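
The kind of gestational-age prediction from measurements such as biparietal diameter that the abstract describes can be sketched as a table lookup with linear interpolation. The reference table below is purely illustrative (not clinical data), and the program's actual equations are not given in the record:

```python
# Hypothetical reference table: biparietal diameter (mm) -> gestational age (weeks).
# Values are illustrative placeholders, NOT clinical reference data.
BPD_TO_GA = [(30, 15.0), (50, 20.5), (70, 27.5), (90, 36.0)]

def gestational_age_from_bpd(bpd_mm):
    """Linearly interpolate gestational age (weeks) from BPD (mm)."""
    pts = sorted(BPD_TO_GA)
    if not pts[0][0] <= bpd_mm <= pts[-1][0]:
        raise ValueError("BPD outside table range")
    # Find the bracketing pair of table points and interpolate between them.
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= bpd_mm <= x1:
            return y0 + (y1 - y0) * (bpd_mm - x0) / (x1 - x0)

print(gestational_age_from_bpd(60))  # midway between 50 mm and 70 mm -> 24.0
```

A table-driven design like this also makes it straightforward to regenerate growth curves when the reference data change, without touching the interpolation code.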

  4. Use of an INGRES database to implement the beam parameter management at GANIL

    International Nuclear Information System (INIS)

    Gillette, P.; Lecorche, E.; Lermine, P.; Maugeais, C.; Leboucher, Ch.; Moscatello, M.H.; Pain, P.

    1995-01-01

    Since the beginning of the operation driven by the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been more and more widely used. The most significant application relying on the RDBMS is the new beam parameter management which has been entirely redesigned. It has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including the way how the physical aspects of the Ganil tuning have been integrated in such an environment. (author)

  5. Nuclear plant operations, maintenance, and configuration management using three-dimensional computer graphics and databases

    International Nuclear Information System (INIS)

    Tutos, N.C.; Reinschmidt, K.F.

    1987-01-01

    Stone and Webster Engineering Corporation has developed the Plant Digital Model concept as a new approach to Configuration Management of nuclear power plants. Plant Digital Model development is a step-by-step process, based on existing manual procedures and computer applications, and is fully controllable by plant managers and engineers. The Plant Digital Model is based on IBM computer graphics and relational database management systems, and can therefore be easily integrated with existing plant databases and corporate management information systems

  6. Operational Research Techniques Used for Addressing Biodiversity Objectives into Forest Management: An Overview

    Directory of Open Access Journals (Sweden)

    Marta Ezquerro

    2016-10-01

    Full Text Available The integration of biodiversity into forest management has traditionally been a challenge for many researchers and practitioners. In this paper, we provide a survey of forest management papers that use different Operations Research (OR) methods to integrate biodiversity objectives into their planning models. One hundred and seventy-nine references appearing in the ISI Web of Science database over the last 30 years have been categorized and evaluated according to different attributes such as model components, forest management elements, and biodiversity issues. The results show that many OR methods have been applied to deal with this challenging objective: 18 OR techniques, divided into four large groups and each employed in four or more articles, have been identified. However, the number of such papers apparently increased only until 2008. Finally, two clear trends in this set of papers should be highlighted: first, the incorporation of spatial analysis tools into these operational research models and, second, the setting up of hybrid models, which combine different techniques to solve this type of problem.

  7. Ornamental rocks prospection in Uruguay. A new database territorial management

    International Nuclear Information System (INIS)

    Carmignani, L.; Gattiglio, S.; Masquelin, H.; Gomez Rifas, C.; Medina, E.; Da Silva, J.; Pirelli, H.

    1998-01-01

    This paper presents the main results of the latest ornamental rock inventory in Uruguay and the environmental implications of its exploitation. The project was carried out jointly by the Uruguayan government (Ministry of Industry, Energy and Mining) and the European Economic Community (EEC). A two-fold target was pursued. The first is of administrative order, in the sense that the results of such a census allow the management of ornamental rocks to be reviewed in a more efficient and realistic manner. The second, of geo-economical order, permits the re-evaluation of traditional ornamental rock resources (marbles and granites) from a marketing standpoint, as well as the valuation of a new generation of ornamental materials. (author)

  8. Planning the future of JPL's management and administrative support systems around an integrated database

    Science.gov (United States)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving inadequate to support effective management of tasks and administration of the Laboratory, and new approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, revolving around the development of an integrated management and administrative database, are discussed.

  9. Object Classification based Context Management for Identity Management in Internet of Things

    DEFF Research Database (Denmark)

    Mahalle, Parikshit N.; Prasad, Neeli R.; Prasad, Ramjee

    2013-01-01

    As computing technology becomes more tightly coupled into the dynamic and mobile world of the Internet of Things (IoT), security mechanisms must be more stringent, flexible, and less intrusive. The scalability issue in IoT makes identity management (IdM) of ubiquitous objects more challenging, and there is a need for a context-aware access control solution for IdM. Confronting the uncertainty of different types of objects in IoT is not easy. This paper presents a logical framework for object classification in context-aware IoT, as richer contextual information creates an impact on access control. This paper...

  10. Comparison of scientific and administrative database management systems

    Science.gov (United States)

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to be different for scientific and administrative databases are identified, and some of the corresponding generic requirements for database management systems (DBMS) are discussed. The requirements discussed are especially stringent for either the scientific or the administrative databases. For some, no commercial DBMS is fully satisfactory, and the database designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative databases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  11. Features of the choice of object of management and object of research in socially-educational systems

    Directory of Open Access Journals (Sweden)

    O G Fedorov

    2009-12-01

    Full Text Available This paper analyzes the features of modeling and research of socio-educational systems, considers principles for choosing objects of management and objects of research, and discusses how to determine the importance factors of subsystems.

  12. Modelling a critical infrastructure-driven spatial database for proactive disaster management: A developing country context

    Directory of Open Access Journals (Sweden)

    David O. Baloye

    2016-04-01

    Full Text Available The understanding and institutionalisation of the seamless link between urban critical infrastructure and disaster management has greatly helped the developed world to establish effective disaster management processes. However, this link is conspicuously missing in developing countries, where disaster management has been more reactive than proactive. The consequence of this is typified in poor response time and uncoordinated ways in which disasters and emergency situations are handled. As is the case with many Nigerian cities, the challenges of urban development in the city of Abeokuta have limited the effectiveness of disaster and emergency first responders and managers. Using geospatial techniques, the study attempted to design and deploy a spatial database running a web-based information system to track the characteristics and distribution of critical infrastructure for effective use during disasters and emergencies, with the purpose of proactively improving disaster and emergency management processes in Abeokuta. Keywords: Disaster Management; Emergency; Critical Infrastructure; Geospatial Database; Developing Countries; Nigeria

  13. Maintaining the Database for Information Object Analysis, Intent, Dissemination and Enhancement (IOAIDE) and the US Army Research Laboratory Campus Sensor Network (ARL CSN)

    Science.gov (United States)

    2017-01-01

    The database is designed and maintained with Microsoft SQL Server Management Studio; working with it requires basic knowledge of database operations as well as of Microsoft SQL Server Management Studio (2014 or 2016). The basic requirements for the IOAIDE/ARL CSN database development and ... SQL Server (2014 or 2016) installed. All images in this report were generated using Windows 10. The IOAIDE/ARL CSN database could reside on the

  14. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  15. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    Science.gov (United States)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  16. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    Science.gov (United States)

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
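    The text-based hand-off described in this record, where the host language generates SQL as plain text, sends it to the RDBMS, and receives ASCII results back, can be sketched in a few lines. The sketch below is an illustration only, using Python's standard-library sqlite3 as a stand-in for the SYBASE server and MUMPS client; the function name, table, and pipe-delimited output format are assumptions, not part of the original interface.

```python
import sqlite3

def run_sql_text(conn, sql_text):
    """Execute an SQL statement received as plain text and return the
    results as ASCII lines, mimicking the text-based hand-off between
    the host language and the external RDBMS."""
    cur = conn.execute(sql_text)
    if cur.description is None:  # non-SELECT statements produce no rows
        return []
    return ["|".join(str(v) for v in row) for row in cur.fetchall()]

# Stand-in database (sqlite3 here; the paper's interface targeted SYBASE)
conn = sqlite3.connect(":memory:")
run_sql_text(conn, "CREATE TABLE patients (id INTEGER, name TEXT)")
run_sql_text(conn, "INSERT INTO patients VALUES (1, 'Smith')")
lines = run_sql_text(conn, "SELECT id, name FROM patients")
print(lines)  # ['1|Smith']
```

    Because both the statements and the results cross the boundary as plain text, any language able to read and write strings can sit on either side of such an interface, which is the portability argument the paper makes for MUMPS.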

  17. Planning and managing future space facility projects. [management by objectives and group dynamics

    Science.gov (United States)

    Sieber, J. E.; Wilhelm, J. A.; Tanner, T. A.; Helmreich, R. L.; Burgenbauch, S. F.

    1979-01-01

    To learn how ground-based personnel of a space project plan and organize their work, and how such planning and organizing relate to work outcomes, a longitudinal study of the management and execution of the Space Lab Mission Development Test 3 (SMD 3) was performed at NASA Ames Research Center. A view of the problems likely to arise in organizations and some methods of coping with these problems are presented, as are the conclusions and recommendations that pertain strictly to SMD 3 management. Emphasis is placed on the broader context of future space facility projects and additional problems that may be anticipated. A model of management that may be used to facilitate problem solving and communication, management by objectives (MBO), is presented. Some problems of communication and emotion management that MBO does not address directly are considered. Models for promoting mature, constructive and satisfying emotional relationships among group members are discussed.

  18. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    Science.gov (United States)

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
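    The relational-versus-document contrast the study benchmarks can be illustrated with a toy example. This is not a reproduction of the paper's MySQL/MongoDB/PostgreSQL comparison; it is a minimal sketch using Python's stdlib sqlite3 for the relational side and a plain dictionary for the document side, with invented dbSNP-style records.

```python
import sqlite3, json

# Hypothetical dbSNP-style annotation records
variants = [
    {"rsid": "rs123", "chrom": "1", "pos": 1000, "alleles": ["A", "G"]},
    {"rsid": "rs456", "chrom": "2", "pos": 2000, "alleles": ["C", "T"]},
]

# Relational storage: fixed schema, queried with SQL
rel = sqlite3.connect(":memory:")
rel.execute("CREATE TABLE snp (rsid TEXT PRIMARY KEY, chrom TEXT,"
            " pos INTEGER, alleles TEXT)")
rel.executemany(
    "INSERT INTO snp VALUES (?, ?, ?, ?)",
    [(v["rsid"], v["chrom"], v["pos"], json.dumps(v["alleles"]))
     for v in variants])
row = rel.execute("SELECT pos FROM snp WHERE rsid = ?", ("rs123",)).fetchone()

# Document-style storage: whole records keyed by identifier, no fixed schema
doc = {v["rsid"]: v for v in variants}
pos = doc["rs123"]["pos"]

print(row[0], pos)  # both retrievals yield 1000
```

    The document store retrieves the whole record in one keyed lookup and tolerates per-record schema variation, while the relational store enforces a schema and must serialize nested fields (here, the allele list); the trade-off between these two access patterns is what the paper measures at scale.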

  19. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and visualizing them on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  20. A control model for object virtualization in supply chain management

    NARCIS (Netherlands)

    Verdouw, C.N.; Beulens, A.J.M.; Reijers, H.A.; van der Vorst, J.G.A.J.

    2015-01-01

    Due to the emergence of the Internet of Things, supply chain control can increasingly be based on virtual objects instead of on the direct observation of physical objects. Object virtualization allows the decoupling of control activities from the handling and observing of physical products and

  1. Forest management under uncertainty for multiple bird population objectives

    Science.gov (United States)

    Moore, C.T.; Plummer, W.T.; Conroy, M.J.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in a set of alternative models. The models generate testable predictions about the response of populations to management, and monitoring data provide the basis for assessing these predictions and informing future management decisions. To illustrate these principles, we examine forest management at the Piedmont National Wildlife Refuge, where management attention is focused on the recovery of the Red-cockaded Woodpecker (Picoides borealis) population. However, managers are also sensitive to the habitat needs of many non-target organisms, including Wood Thrushes (Hylocichla mustelina) and other forest interior Neotropical migratory birds. By simulating several management policies on a set of alternative forest and bird models, we found a decision policy that maximized a composite response by woodpeckers and Wood Thrushes despite our complete uncertainty regarding system behavior. Furthermore, we used monitoring data to update our measure of belief in each alternative model following one cycle of forest management. This reduction of uncertainty translates into a reallocation of model influence on the choice of optimal decision action at the next decision opportunity.

  2. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Science.gov (United States)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of the massive amounts of data generated by the MUSER, this paper proposes a novel data management technique called the negative database (ND) and uses it to implement a data management system for the MUSER. Based on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving ND data, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
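    The core idea of a negative database, storing the complement of the observed data over an enumerable key space and deriving the observed set on demand, can be shown with a toy sketch. This is a simplified illustration only; the key names and the tiny key space are invented, and the MUSER system's actual key-value implementation is far more involved.

```python
# Toy negative database: for a small, enumerable space of record keys,
# the "negative" store keeps only the keys that were NOT observed and
# derives the observed set as the complement.
universe = {f"frame-{i:04d}" for i in range(10)}   # hypothetical record keys
observed = {"frame-0001", "frame-0003", "frame-0007"}

negative_store = universe - observed  # this is all that gets stored

def derive_observed(neg):
    """Recover the observed records as the complement of the negative store."""
    return universe - neg

recovered = derive_observed(negative_store)
print(sorted(recovered))
```

    The storage saving appears when most of the key space is populated: the fewer the missing records, the smaller the negative store, at the cost of the derivation step when absent records must be enumerated.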

  3. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    Science.gov (United States)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (Southern Germany) is highly prone to landslides. This was apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy and long-lasting rainfalls. The specific climatic situation caused numerous damages with serious impact on settlements and infrastructure. Knowledge of the spatial distribution of landslides, their processes and characteristics is important to evaluate the potential risk that can arise from mass movements in those areas. Within the framework of two projects, about 400 landslides were mapped and detailed data sets were compiled during the years 2011 to 2014 at the Franconian Alb. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (The Bavarian Environment Agency) within the project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements, and should provide the potential for secure data storage and data management, as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL, which provides the possibility to store spatial and geographic objects and to connect to several GIS applications, like GRASS GIS, SAGA GIS, QGIS and GDAL, a geospatial library (Obe et al. 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured by using GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in the R, Python or Perl programming languages. It is possible to work directly with the (spatial) data entirety of the database in R. The inventory of the database includes (amongst others

  4. The IAEA's Net Enabled Waste Management Database: Overview and current status

    International Nuclear Information System (INIS)

    Csullog, G.W.; Bell, M.J.; Pozdniakov, I.; Petison, G.; Kostitsin, V.

    2002-01-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) contains information on national radioactive waste management programmes and organizations, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. The NEWMDB, which was launched on the Internet on 6 July 2001, is the successor to the IAEA's Waste Management Database (WMDB), which was in use during the 1990's. The NEWMDB's first data collection cycle took place from July 2001 to March 2002. This paper provides an overview of the NEWMDB, it describes the results of the first data collection cycle, and it discusses the way forward for additional data collection cycles. Three companion papers describe (1) the role of the NEWMDB as an international source of information about radioactive waste management, (2) issues related to the variety of waste classification schemes used by IAEA Member States, and (3) the NEWMDB in the context of an indicator of sustainable development for radioactive waste management. (author)

  5. Are Managed Futures Indices Telling Truth? Biases in CTA Databases and Proposals of Potential Enhancements

    Directory of Open Access Journals (Sweden)

    Adam Zaremba

    2011-07-01

    Full Text Available Managed futures are an alternative asset class which has recently become considerably popular in the investment industry. However, due to its characteristics, access to managed futures historical performance statistics is relatively confined. All available information originates from commercial and academic databases, reporting to which is entirely voluntary. This situation results in a series of biases which distort the managed futures performance in the eyes of investors. The paper consists of two parts. First, the author reviews and describes various biases that influence the reliability of the managed futures indices and databases. The second section encompasses the author's proposals of potential enhancements, which aim to reduce the impact of the biases in order to derive a benchmark that could better reflect the characteristics of managed futures investment from the point of view of a potential investor.

  6. Context and Content Aware Routing of Managed Information Objects

    Science.gov (United States)

    2014-05-01

    Siena, however, does not support incremental updates (i.e., subscription posting and deletion), so updates must be done in batch mode. Although the present implementation of PUBSUB does not support the string datatype, its architecture is sufficiently versatile to accommodate this datatype with the inclusion of additional data structures as described in Section 3. Section 3.1 describes how PUBSUB organizes its database of

  7. Development of subsurface drainage database system for use in environmental management issues

    International Nuclear Information System (INIS)

    Azhar, A.H.; Rafiq, M.; Alam, M.M.

    2007-01-01

    A simple, user-friendly, menu-driven system for database management pertinent to the Impact of Subsurface Drainage Systems on Land and Water Conditions (ISLaW) has been developed for use in environment-management issues of the drainage areas. This database has been developed by integrating four software packages, viz., Microsoft Excel, MS Word, Acrobat and MS Access. The information, in the form of tables and figures, with respect to various drainage projects has been presented in MS Word files. The major data-sets of various subsurface drainage projects included in the ISLaW database are: i) technical aspects, ii) groundwater and soil-salinity aspects, iii) socio-technical aspects, iv) agro-economic aspects, and v) operation and maintenance aspects. The various ISLaW files can be accessed just by clicking the menu buttons of the database system. This database not only gives feedback on the functioning of different subsurface drainage projects with respect to the above-mentioned aspects, but also serves as a resource document for future studies on other drainage projects. The developed database system is useful for planners, designers and farmers' organisations for improved operation of existing drainage projects as well as development of future ones. (author)

  8. Management by Objectives: Authentic Assessment in a Public Relations Practicum.

    Science.gov (United States)

    Fall, Lisa T.

    Incorporation of management principles in the classroom can motivate students to successfully complete project work. The Communication Arts Department at Georgia Southern University developed a Public Relations Event Management course in which the students were responsible for planning a campus-wide special event to raise funds for two clients.…

  9. A survey of the use of database management systems in accelerator projects

    CERN Document Server

    Poole, John

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accelerator projects and what they are being used for. Initially introduced to handle equipment builders' data, commercial DBMS are now being used in almost all areas of accelerators from on-line control to personnel data. A variety of commercial systems are being used in conjunction with a diverse selection of application software for data maintenance/manipulation and controls. This paper reviews the database activities known to IADBG.

  10. The development of technical database of advanced spent fuel management process

    Energy Technology Data Exchange (ETDEWEB)

    Ro, Seung Gy; Byeon, Kee Hoh; Song, Dae Yong; Park, Seong Won; Shin, Young Jun

    1999-03-01

    The purpose of this study is to develop a technical database system to provide useful information to researchers who study the back end of the nuclear fuel cycle. A technical database of the advanced spent fuel management process was developed as a prototype system in 1997. In 1998, this database system was improved into a multi-user system and appended with a special database composed of thermochemical formation data and reaction data. In this report, the detailed specification of the system design is described and the operating methods are illustrated as a user's manual. This report is also a useful reference for expanding the current system or interfacing it with other systems. (Author). 10 refs., 18 tabs., 46 figs.

  11. The development of technical database of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, Seung Gy; Byeon, Kee Hoh; Song, Dae Yong; Park, Seong Won; Shin, Young Jun

    1999-03-01

    The purpose of this study is to develop a technical database system to provide useful information to researchers who study the back end of the nuclear fuel cycle. A technical database of the advanced spent fuel management process was developed as a prototype system in 1997. In 1998, this database system was improved into a multi-user system and appended with a special database composed of thermochemical formation data and reaction data. In this report, the detailed specification of the system design is described and the operating methods are illustrated as a user's manual. This report is also a useful reference for expanding the current system or interfacing it with other systems. (Author). 10 refs., 18 tabs., 46 figs

  12. DOG-SPOT database for comprehensive management of dog genetic research data

    Directory of Open Access Journals (Sweden)

    Sutter Nathan B

    2010-12-01

    Full Text Available Abstract Research laboratories studying the genetics of companion animals have no database tools specifically designed to aid in the management of the many kinds of data that are generated, stored and analyzed. We have developed a relational database, "DOG-SPOT," to provide such a tool. Implemented in MS-Access, the database is easy to extend or customize to suit a lab's particular needs. With DOG-SPOT a lab can manage data relating to dogs, breeds, samples, biomaterials, phenotypes, owners, communications, amplicons, sequences, markers, genotypes and personnel. Such an integrated data structure helps ensure high quality data entry and makes it easy to track physical stocks of biomaterials and oligonucleotides.
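    An integrated relational structure of the kind DOG-SPOT provides can be sketched with a small subset of its entities. The table and column names below are hypothetical (the actual schema is implemented in MS-Access and is not reproduced in the abstract); the sketch uses Python's stdlib sqlite3 merely to show how foreign keys tie dogs, breeds and samples together so that one query can trace a biomaterial back to a breed.

```python
import sqlite3

# Minimal, hypothetical subset of a DOG-SPOT-style schema
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE breed  (breed_id  INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE dog    (dog_id    INTEGER PRIMARY KEY, name TEXT,
                     breed_id  INTEGER REFERENCES breed(breed_id));
CREATE TABLE sample (sample_id INTEGER PRIMARY KEY,
                     dog_id    INTEGER REFERENCES dog(dog_id),
                     tissue    TEXT);
""")
db.execute("INSERT INTO breed VALUES (1, 'Border Collie')")
db.execute("INSERT INTO dog VALUES (1, 'Rex', 1)")
db.execute("INSERT INTO sample VALUES (1, 1, 'blood')")

# The integrated structure lets one query trace a sample back to a breed
row = db.execute("""
    SELECT breed.name FROM sample
    JOIN dog   ON sample.dog_id  = dog.dog_id
    JOIN breed ON dog.breed_id   = breed.breed_id
    WHERE sample.sample_id = 1""").fetchone()
print(row[0])  # Border Collie
```

    Keeping such relationships in one schema, rather than in separate spreadsheets, is what enables the data-quality checks and stock tracking the abstract describes.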

  13. A database system for the management of severe accident risk information, SARD

    International Nuclear Information System (INIS)

    Ahn, K. I.; Kim, D. H.

    2003-01-01

    The purpose of this paper is to introduce main features and functions of a PC Windows-based database management system, SARD, which has been developed at Korea Atomic Energy Research Institute for automatic management and search of the severe accident risk information. Main functions of the present database system are implemented by three closely related, but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of the plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from the representative severe accident codes (e.g., base case and sensitivity analysis results, and summary for key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used for the support of the Level 2 PSA of similar plants and for the development of plant-specific severe accident management strategies

  14. Microsoft Enterprise Consortium: A Resource for Teaching Data Warehouse, Business Intelligence and Database Management Systems

    Science.gov (United States)

    Kreie, Jennifer; Hashemi, Shohreh

    2012-01-01

    Data is a vital resource for businesses; therefore, it is important for businesses to manage and use their data effectively. Because of this, businesses value college graduates with an understanding of and hands-on experience working with databases, data warehouses and data analysis theories and tools. Faculty in many business disciplines try to…

  15. Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.

    Science.gov (United States)

    Rice, James

    1988-01-01

    Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…

  16. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    Science.gov (United States)

    2002-01-01

    to the OODBMS approach. The ORDBMS approach produced such research prototypes as Postgres [155] and Starburst [67], and commercial products such as... Kemnitz. The POSTGRES Next-Generation Database Management System. Communications of the ACM, 34(10):78-92, 1991. [156] Michael Stonebraker and Dorothy

  17. Supporting Telecom Business Processes by means of Workflow Management and Federated Databases

    NARCIS (Netherlands)

    Nijenhuis, Wim; Jonker, Willem; Grefen, P.W.P.J.

    This report addresses the issues related to the use of workflow management systems and federated databases to support business processes that operate on large and heterogeneous collections of autonomous information systems. We discuss how they can enhance the overall IT-architecture. Starting from

  18. A database system for the management of severe accident risk information, SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, K. I.; Kim, D. H. [KAERI, Taejon (Korea, Republic of)

    2003-10-01

    The purpose of this paper is to introduce main features and functions of a PC Windows-based database management system, SARD, which has been developed at Korea Atomic Energy Research Institute for automatic management and search of the severe accident risk information. Main functions of the present database system are implemented by three closely related, but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of the plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from the representative severe accident codes (e.g., base case and sensitivity analysis results, and summary for key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used for the support of the Level 2 PSA of similar plants and for the development of plant-specific severe accident management strategies.

  19. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    Science.gov (United States)

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
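As a rough illustration of the set-based indexing idea, the sketch below builds a tiny Image Content Group index in XML and mirrors it as plain HTML links so a crawler could follow it. The element and attribute names are assumptions; NOVA's actual XML schema is not given here.

```python
import xml.etree.ElementTree as ET

# Hypothetical ICG index file: one group cataloguing two volumetric datasets.
icg_xml = """<icg name="thoracic-ct">
  <member id="vol-001" modality="CT" region="thorax"/>
  <member id="vol-007" modality="CT" region="thorax"/>
</icg>"""

root = ET.fromstring(icg_xml)
members = [m.get("id") for m in root.findall("member")]

# Mirror the group as static HTML so internet search engines can index
# and link directly into the collection.
html = "<ul>" + "".join(
    f'<li><a href="{m}.html">{m}</a></li>' for m in members) + "</ul>"
```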

  20. Combining objective and subjective techniques for assessing quality of management

    International Nuclear Information System (INIS)

    Arueti, S.; Okrent, D.

    1987-01-01

The basic assumption is that utility and plant corporate management have a significant role in plant safety which may be quantifiable. From this point of view we try to identify symptoms and paths through which management effectiveness affects plant safety, partly in terms of measurable parameters. Some of the available data are analyzed in light of the proposed parameters, in order to examine possible correlations. Preliminary proposals are made of methods for including management performance as a variable in future PRA studies. This paper focuses primarily on measures of management quality relating to what are frequently called performance indicators, such as SALP ratings, number of scrams, ESF actuations, safety system failures and challenges, forced outages, availability, enforcement actions, and licensee event reports (LERs). (orig./HP)

  1. Knowledge management: An abstraction of knowledge base and database management systems

    Science.gov (United States)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert systems tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit. Much work goes into supporting work to make the tool integrate effectively. A Knowledge Management Design System (KNOMAD), is described which is a collection of tools built in layers. The layered architecture provides two major benefits; the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  2. Load management in electrical networks. Objectives, methods, prospects

    International Nuclear Information System (INIS)

    Gabioud, D.

    2008-01-01

This illustrated article takes up the problems related to load variation in electricity networks. How should peak load be handled? Different solutions in energy demand management are discussed: methods based on price, and methods based on load reduction by electric utilities. Information systems are presented which give consumers the data they need to participate in local load management.

  3. Applying CIPP Model for Learning-Object Management

    Science.gov (United States)

    Morgado, Erla M. Morales; Peñalvo, Francisco J. García; Martín, Carlos Muñoz; Gonzalez, Miguel Ángel Conde

Although the knowledge management process needs evaluation in order to determine its suitable functionality, there is no clear definition of the stages at which LOs need to be evaluated or of the specific metrics to continuously promote their quality. This paper presents a proposal for LO evaluation during their management in e-learning systems. To achieve this, we suggest specific steps for LO design, implementation and evaluation within the four stages proposed by the CIPP model (Context, Input, Process, Product).

  4. Development of SRS.php, a Simple Object Access Protocol-based library for data acquisition from integrated biological databases.

    Science.gov (United States)

    Barbosa-Silva, A; Pafilis, E; Ortega, J M; Schneider, R

    2007-12-11

    Data integration has become an important task for biological database providers. The current model for data exchange among different sources simplifies the manner that distinct information is accessed by users. The evolution of data representation from HTML to XML enabled programs, instead of humans, to interact with biological databases. We present here SRS.php, a PHP library that can interact with the data integration Sequence Retrieval System (SRS). The library has been written using SOAP definitions, and permits the programmatic communication through webservices with the SRS. The interactions are possible by invoking the methods described in WSDL by exchanging XML messages. The current functions available in the library have been built to access specific data stored in any of the 90 different databases (such as UNIPROT, KEGG and GO) using the same query syntax format. The inclusion of the described functions in the source of scripts written in PHP enables them as webservice clients to the SRS server. The functions permit one to query the whole content of any SRS database, to list specific records in these databases, to get specific fields from the records, and to link any record among any pair of linked databases. The case study presented exemplifies the library usage to retrieve information regarding registries of a Plant Defense Mechanisms database. The Plant Defense Mechanisms database is currently being developed, and the proposal of SRS.php library usage is to enable the data acquisition for the further warehousing tasks related to its setup and maintenance.
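SRS.php itself is a PHP library, but the shape of the XML messages a webservice client exchanges with an SRS server can be illustrated in any language. The Python sketch below only constructs a plausible SOAP envelope for an SRS-style query; the operation name `getEntries` and its parameters are invented for illustration and are not taken from the actual SRS WSDL.

```python
import xml.etree.ElementTree as ET

# Build a minimal SOAP 1.1 envelope carrying a database query.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
call = ET.SubElement(body, "getEntries")          # assumed operation name
ET.SubElement(call, "database").text = "UNIPROT"  # same query syntax per database
ET.SubElement(call, "query").text = "ID:P12345"

message = ET.tostring(envelope, encoding="unicode")
```

A client would POST `message` to the SRS endpoint and parse the XML response the same way; the point is that programs, not humans, drive the interaction.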

  5. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    Science.gov (United States)

    Owens, John

    2009-01-01

Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, and interactive HTML pages, or are exported to facilities dedicated to sophisticated sequence-analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
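A minimal sketch of the approach, using Python's built-in sqlite3 in place of a desktop relational-database manager: sequence records in one table, free-text annotation in another, unified by a join. The table layout, clone identifier, and sequence fragment are invented.

```python
import sqlite3

# Two "project" tables unified by the relational manager.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE clone (clone_id TEXT PRIMARY KEY, vh_sequence TEXT);
CREATE TABLE annotation (clone_id TEXT, note TEXT);
""")
db.execute("INSERT INTO clone VALUES ('mAb-17', 'EVQLVESGGGLVQ')")
db.execute("INSERT INTO annotation VALUES ('mAb-17', 'high-affinity binder')")

# Join sequence data with its accompanying annotation text for a report.
row = db.execute("""
    SELECT c.clone_id, c.vh_sequence, a.note
    FROM clone c JOIN annotation a ON a.clone_id = c.clone_id
""").fetchone()
```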

  6. The ‘Polycronic’ Effects of Management by Objectives

    DEFF Research Database (Denmark)

    Thygesen, Niels Thyge

    2012-01-01

    of the organization relative to its environment (polycontextuality) and in particular how these effects emerge due to different timebindings within organizations (organized temporality). As such the hypothesis is expanded in three ways: first of all, the hypothesis is expanded as polycontextuality is comprehended...... with other media of communication too, than the one of the computer communication. Third of all, the implications of identity problems of modern organizations are often associated with the impossibility of management or with a need for more complex ways of managing. The article is an attempt to specify...

  7. Towards Efficient Energy Management: Defining HEMS and Smart Grid Objectives

    DEFF Research Database (Denmark)

    Rossello Busquet, Ana; Soler, José

    2011-01-01

in home environments, researchers have been designing Home Energy Management Systems (HEMS). Efficiently managing and distributing electricity in the grid will also help to reduce the increase of energy consumption in the future. The power grid is evolving into the Smart Grid, which is being developed...... to distribute and produce electricity more efficiently. This paper presents the high level goals and requirements of HEMS and the Smart Grid. Additionally, it provides an overview on how Information and Communication Technologies (ICT) are involved in the Smart Grid and how they help to achieve the emerging...... functionalities that the Smart Grid can provide....

  8. Informed multi-objective decision-making in environmental management using Pareto optimality

    Science.gov (United States)

    Maureen C. Kennedy; E. David Ford; Peter Singleton; Mark Finney; James K. Agee

    2008-01-01

    Effective decisionmaking in environmental management requires the consideration of multiple objectives that may conflict. Common optimization methods use weights on the multiple objectives to aggregate them into a single value, neglecting valuable insight into the relationships among the objectives in the management problem.
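The non-dominated (Pareto) filtering that the abstract contrasts with weighted aggregation can be written in a few lines. A sketch for a two-objective minimization problem; the objective vectors are invented examples, not data from the study.

```python
# Keep only non-dominated solutions instead of collapsing objectives
# into a single weighted value.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

# e.g. invented (habitat loss, fire risk) pairs: two conflicting objectives
solutions = [(1, 9), (3, 3), (2, 7), (4, 4), (5, 1)]
front = pareto_front(solutions)
```

Presenting the whole front, rather than a single weighted optimum, preserves the trade-off information among objectives that the abstract argues is valuable.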

  9. Database Foundation For The Configuration Management Of The CERN Accelerator Controls Systems

    CERN Document Server

    Zaharieva, Z; Peryt, M

    2011-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years in order to become nowadays the basis for the Configuration Management of the Controls System for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the Controls System. The configuration items are quite heterogeneous, depicting different areas of the Controls System – ranging from 3000 Front-End Computers, 75 000 software devices allowing remote control of the accelerators, to valid states of the Accelerators Timing System. The article will describe the different areas of the CCDB, their interdependencies and the challenges to establish the data model for such a diverse configuration management database, serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering of change management processes as well as providing status accounting and aud...

  10. GSIMF: a web service based software and database management system for the next generation grids

    International Nuclear Information System (INIS)

    Wang, N; Ananthan, B; Gieraltowski, G; May, E; Vaniachine, A

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provide a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids
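One core sub-problem GSIMF addresses, installing interdependent packages in a valid order, reduces to a topological sort of the dependency graph. A hedged local sketch (the real framework is a set of Grid services, not a script, and the package names and dependencies below are invented):

```python
from graphlib import TopologicalSorter

# Each package maps to the set of packages it depends on (invented example).
deps = {
    "analysis-2.1": {"geant4-10.6", "conditions-db-1.4"},
    "geant4-10.6": set(),
    "conditions-db-1.4": {"sqlite-3.31"},
    "sqlite-3.31": set(),
}

# A valid installation order: every dependency before its dependents.
install_order = list(TopologicalSorter(deps).static_order())
```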

  11. Reactor pressure vessel embrittlement management through EPRI-Developed material property databases

    International Nuclear Information System (INIS)

    Rosinski, S.T.; Server, W.L.; Griesbach, T.J.

    1997-01-01

    Uncertainties and variability in U.S. reactor pressure vessel (RPV) material properties have caused the U.S. Nuclear Regulatory Commission (NRC) to request information from all nuclear utilities in order to assess the impact of these data scatter and uncertainties on compliance with existing regulatory criteria. Resolving the vessel material uncertainty issues requires compiling all available data into a single integrated database to develop a better understanding of irradiated material property behavior. EPRI has developed two comprehensive databases for utility implementation to compile and evaluate available material property and surveillance data. RPVDATA is a comprehensive reactor vessel materials database and data management program that combines data from many different sources into one common database. Searches of the data can be easily performed to identify plants with similar materials, sort through measured test results, compare the ''best-estimates'' for reported chemistries with licensing basis values, quantify variability in measured weld qualification and test data, identify relevant surveillance results for characterizing embrittlement trends, and resolve uncertainties in vessel material properties. PREP4 has been developed to assist utilities in evaluating existing unirradiated and irradiated data for plant surveillance materials; PREP4 evaluations can be used to assess the accuracy of new trend curve predictions. In addition, searches of the data can be easily performed to identify available Charpy shift and upper shelf data, review surveillance material chemistry and fabrication information, review general capsule irradiation information, and identify applicable source reference information. In support of utility evaluations to consider thermal annealing as a viable embrittlement management option, EPRI is also developing a database to evaluate material response to thermal annealing. Efforts are underway to develop an irradiation

  12. Multi-objective optimization approach for air traffic flow management

    Directory of Open Access Journals (Sweden)

    Fadil Rabie

    2017-01-01

The decision-making stage was then performed with the aid of data clustering techniques to reduce the size of the Pareto-optimal set and obtain a smaller representation of the multi-objective design space, thereby making it easier for the decision-maker to find satisfactory and meaningful trade-offs, and to select a preferred final design solution.
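As a stand-in for the clustering step described above, the sketch below thins an invented Pareto set down to a few representative trade-offs using a simple farthest-point rule. It only illustrates the idea of giving the decision-maker a smaller design space; the paper's actual clustering technique may differ.

```python
import math

def farthest_point_representatives(points, k):
    """Greedily pick k points that spread out over the Pareto set."""
    reps = [points[0]]
    while len(reps) < k:
        # Add the point farthest from its nearest already-chosen representative.
        nxt = max(points, key=lambda p: min(math.dist(p, r) for r in reps))
        reps.append(nxt)
    return reps

# Invented Pareto-optimal trade-offs, e.g. (delay cost, capacity violation).
pareto_set = [(0, 10), (1, 8), (2, 6.5), (4, 4), (7, 2), (10, 1)]
reps = farthest_point_representatives(pareto_set, 3)
```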

  13. Infrastructure asset management : The valuation of complex objects

    NARCIS (Netherlands)

    Verlaan, J.G.; De Ridder, H.A.J.

    2010-01-01

    For two years now, Rijkswaterstaat, an agency of the Dutch Ministry of Transport, Public Works and Water Management, is required to have a financial administration, which is based on accrual accounting principles. This involves the use of a balance sheet and the equivalent of a profit and loss

  14. Management of radiological related equipments. Creating the equipment management database and analysis of the repair and maintenance records

    International Nuclear Information System (INIS)

    Eguchi, Megumu; Taguchi, Keiichi; Oota, Takashi; Kajiwara, Hiroki; Ono, Kiyotune; Hagio, Kiyofumi; Uesugi, Ekizo; Kajishima, Tetuo; Ueda, Kenji

    2002-01-01

In 1997, we established a committee for equipment maintenance and management in our department. We designed a database to classify and register all the radiological related equipment using Microsoft Access. Management of the condition and cost of each piece of equipment has become easier by keeping the database as the equipment management ledger and by filing the history of repairs and maintenance for each modality. We then tallied the number of repairs, repair costs, and downtimes from four years of repair and maintenance records, and we re-examined the causal analysis of failures and the contents of regular maintenance for the CT and MRI equipment, which had shown the higher numbers of repairs. Consequently, we identified improvements to the data registration method and a more economical use of repair costs. (author)
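The tallying described, counts, costs, and downtimes per modality, is a plain grouped aggregation. A sketch using sqlite in place of the Access ledger; the modalities, costs, and downtimes are invented figures.

```python
import sqlite3

# Repair-history ledger with one row per repair event (invented data).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE repair (
    modality TEXT, year INTEGER, cost INTEGER, downtime_h REAL)""")
db.executemany("INSERT INTO repair VALUES (?, ?, ?, ?)", [
    ("CT",    1998, 1200,  6.0),
    ("CT",    1999,  800,  4.5),
    ("MRI",   1998, 3000, 12.0),
    ("X-ray", 1999,  150,  1.0),
])

# Number of repairs, total cost, and total downtime per modality,
# highest total cost first.
summary = db.execute("""
    SELECT modality, COUNT(*), SUM(cost), SUM(downtime_h)
    FROM repair GROUP BY modality ORDER BY SUM(cost) DESC
""").fetchall()
```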

  15. Preparation of Database for Land use Management in North East of Cairo

    International Nuclear Information System (INIS)

    El-Ghawaby, A.M.

    2012-01-01

Environmental management in urban areas is difficult due to the amount and variety of data needed for decision making. This amount of data cannot be exploited without adequate database systems and modern methodologies. A geo-database for the East Cairo City Area (ECCA) was built to be used in the process of urban land-use suitability assessment, to achieve better performance compared with the usual methods. This geo-database required the availability of detailed, accurate, up-to-date and geographically referenced data on the terrain's physical characteristics and the environmental hazards that may occur. A smart environmental suitability model for ECCA was developed and implemented using ERDAS IMAGINE 9.2. This model is capable of suggesting the most appropriate urban land-use, based on the existing spatial and non-spatial potentials and constraints.

  16. Development of intelligent database program for PSI/ISI data management of nuclear power plant

    International Nuclear Information System (INIS)

    Um, Byong Guk; Park, Un Su; Park, Ik Keun; Park, Yun Won; Kang, Suk Chul

    1998-01-01

An intelligent database program, fully compatible with Windows 95, has been developed for the construction of a total support system and the effective management of Pre-/In-Service Inspection data. Using the database program, analysis and multi-dimensional evaluation of the defects detected during PSI/ISI in the pipes and pressure vessels of nuclear power plants can be performed. It can also be used to review repeatedly inspected NDE data and the contents of their treatment, and to provide fundamental data for evaluations related to Fracture Mechanics Analysis (FMA). Furthermore, the PSI/ISI database of loads and material properties can be utilized to secure a higher degree of safety, integrity, reliability, and life prediction of components and systems in nuclear power plants.

  17. ALARA database value in future outage work planning and dose management

    International Nuclear Information System (INIS)

    Miller, D.W.; Green, W.H.

    1995-01-01

    ALARA database encompassing job-specific duration and man-rem plant specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends emerging based on analysis of three refueling outages at Clinton Power Station. Conclusions reached based on hard data available from a relational database dose-tracking system is a valuable tool for planning of future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more outage comparative data becomes available. Trends over a three outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of outage person-rem projection. The value of the database in projecting 1 and 5 year station person-rem estimates is discussed
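The kind of problem-area query such a relational dose-tracking system supports can be sketched as follows. The radiation work permit numbers, job names, and person-rem figures are invented, and the 20% overrun threshold is an arbitrary example, not a value from the paper.

```python
import sqlite3

# Per-RWP projected vs. actual person-rem (invented outage data).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE rwp (
    rwp_id TEXT, job TEXT, projected_rem REAL, actual_rem REAL)""")
db.executemany("INSERT INTO rwp VALUES (?, ?, ?, ?)", [
    ("RWP-101", "steam generator eddy current", 4.0, 5.6),
    ("RWP-102", "valve maintenance",            1.2, 0.9),
    ("RWP-103", "scaffolding",                  2.5, 2.4),
])

# Flag jobs whose actual dose exceeded the projection by more than 20%,
# i.e. candidate problem areas for the next outage's planning.
problem_jobs = [r[0] for r in db.execute(
    "SELECT rwp_id FROM rwp WHERE actual_rem > 1.2 * projected_rem")]
```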

  18. ALARA database value in future outage work planning and dose management

    Energy Technology Data Exchange (ETDEWEB)

    Miller, D.W.; Green, W.H. [Clinton Power Station Illinois Power Co., IL (United States)

    1995-03-01

ALARA database encompassing job-specific duration and man-rem plant specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends emerging based on analysis of three refueling outages at Clinton Power Station. Conclusions reached based on hard data available from a relational database dose-tracking system is a valuable tool for planning of future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more outage comparative data becomes available. Trends over a three outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of outage person-rem projection. The value of the database in projecting 1 and 5 year station person-rem estimates is discussed.

  19. The MANAGE database: nutrient load and site characteristic updates and runoff concentration data.

    Science.gov (United States)

    Harmel, Daren; Qian, Song; Reckhow, Ken; Casebolt, Pamela

    2008-01-01

    The "Measured Annual Nutrient loads from AGricultural Environments" (MANAGE) database was developed to be a readily accessible, easily queried database of site characteristic and field-scale nutrient export data. The original version of MANAGE, which drew heavily from an early 1980s compilation of nutrient export data, created an electronic database with nutrient load data and corresponding site characteristics from 40 studies on agricultural (cultivated and pasture/range) land uses. In the current update, N and P load data from 15 additional studies of agricultural runoff were included along with N and P concentration data for all 55 studies. The database now contains 1677 watershed years of data for various agricultural land uses (703 for pasture/rangeland; 333 for corn; 291 for various crop rotations; 177 for wheat/oats; and 4-33 yr for barley, citrus, vegetables, sorghum, soybeans, cotton, fallow, and peanuts). Across all land uses, annual runoff loads averaged 14.2 kg ha(-1) for total N and 2.2 kg ha(-1) for total P. On average, these losses represented 10 to 25% of applied fertilizer N and 4 to 9% of applied fertilizer P. Although such statistics produce interesting generalities across a wide range of land use, management, and climatic conditions, regional crop-specific analyses should be conducted to guide regulatory and programmatic decisions. With this update, MANAGE contains data from a vast majority of published peer-reviewed N and P export studies on homogeneous agricultural land uses in the USA under natural rainfall-runoff conditions and thus provides necessary data for modeling and decision-making related to agricultural runoff. The current version can be downloaded at http://www.ars.usda.gov/spa/manage-nutrient.

  20. Management Approaches to Accomplish Contemporary Livestock Production-Conservation Objectives in Shortgrass Steppe

    Science.gov (United States)

    Traditional rangeland management in the shortgrass steppe has emphasized livestock production with moderate stocking rates, but alternative approaches will be needed to meet production objectives under increasing demands for conservation-oriented management. We investigated the utility of very inten...

  1. Using Behavior Objects to Manage Complexity in Virtual Worlds

    OpenAIRE

    Černý, Martin; Plch, Tomáš; Marko, Matěj; Gemrot, Jakub; Ondráček, Petr; Brom, Cyril

    2015-01-01

    The quality of high-level AI of non-player characters (NPCs) in commercial open-world games (OWGs) has been increasing during the past years. However, due to constraints specific to the game industry, this increase has been slow and it has been driven by larger budgets rather than adoption of new complex AI techniques. Most of the contemporary AI is still expressed as hard-coded scripts. The complexity and manageability of the script codebase is one of the key limiting factors for further AI ...

  2. Object-Relational Management of Multiply Represented Geographic Entities

    DEFF Research Database (Denmark)

    Friis-Christensen, Anders; Jensen, Christian Søndergaard

    2003-01-01

    Multiple representation occurs when information about the same geographic entity is represented electronically more than once. This occurs frequently in practice, and it invariably results in the occurrence of inconsistencies among the different representations. We propose to resolve this situation...... by introducing a multiple representation management system (MRMS), the schema of which includes rules that specify how to identify representations of the same entity, rules that specify consistency requirements, and rules used to restore consistency when necessary. In this paper, we demonstrate by means...

  3. A Spatio-Temporal Building Exposure Database and Information Life-Cycle Management Solution

    Directory of Open Access Journals (Sweden)

    Marc Wieland

    2017-04-01

    Full Text Available With an ever-increasing volume and complexity of data collected from a variety of sources, the efficient management of geospatial information becomes a key topic in disaster risk management. For example, the representation of assets exposed to natural disasters is subjected to changes throughout the different phases of risk management reaching from pre-disaster mitigation to the response after an event and the long-term recovery of affected assets. Spatio-temporal changes need to be integrated into a sound conceptual and technological framework able to deal with data coming from different sources, at varying scales, and changing in space and time. Especially managing the information life-cycle, the integration of heterogeneous information and the distributed versioning and release of geospatial information are important topics that need to become essential parts of modern exposure modelling solutions. The main purpose of this study is to provide a conceptual and technological framework to tackle the requirements implied by disaster risk management for describing exposed assets in space and time. An information life-cycle management solution is proposed, based on a relational spatio-temporal database model coupled with Git and GeoGig repositories for distributed versioning. Two application scenarios focusing on the modelling of residential building stocks are presented to show the capabilities of the implemented solution. A prototype database model is shared on GitHub along with the necessary scenario data.
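The valid-time idea at the heart of such a spatio-temporal exposure model can be sketched with a simple versioned table: each building version carries a validity interval, so the stock can be queried "as of" any phase of the risk-management cycle. The schema, building identifiers, and dates are invented, and the solution's Git/GeoGig versioning layer is omitted here.

```python
import sqlite3

# One row per building version, with a [valid_from, valid_to) interval.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE building_version (
    bld_id TEXT, occupancy TEXT, valid_from TEXT, valid_to TEXT)""")
db.executemany("INSERT INTO building_version VALUES (?, ?, ?, ?)", [
    ("B1", "residential", "2000-01-01", "2015-06-01"),  # pre-event state
    ("B1", "residential-retrofitted", "2015-06-01", "9999-12-31"),  # recovery
    ("B2", "commercial", "2005-01-01", "9999-12-31"),
])

def stock_as_of(date):
    """Building stock valid at the given ISO date."""
    return db.execute(
        "SELECT bld_id, occupancy FROM building_version "
        "WHERE valid_from <= ? AND ? < valid_to ORDER BY bld_id",
        (date, date)).fetchall()

before = stock_as_of("2010-01-01")
after = stock_as_of("2016-01-01")
```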

  4. Preliminary study for unified management of CANDU safety codes and construction of database system

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

It is necessary to develop a Graphical User Interface (GUI) for the unified management of the CANDU safety codes and to construct a database system for their validation, for which a preliminary study is done in the first stage of the present work. The input and output structures and data flow of CATHENA and PRESCON2 are investigated, and the interaction of variables between CATHENA and PRESCON2 is identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their calculation results with those from an HP workstation or from the FSAR (Final Safety Analysis Report). A preliminary study of the GUI for the safety codes in the unified management system is done, and a sample of the GUI programming is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments performed inside and outside the country are collected and classified following the structure of the database system, of which two types are considered for the final web-based database system. The preliminary GUI programming for the database system is demonstrated, and will be updated in future work.

  5. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    Science.gov (United States)

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  6. Effective Management for National or Local Policy Objectives?

    DEFF Research Database (Denmark)

    Winter, Søren; Skou, Mette; Beer, Frederikke

    This research considers the role of local policies and management in affecting street-level bureaucrats’ actions in implementing national policy mandates. The focus on sanctioning behavior by social workers provides a strong test of these effects, given that the behaviors are both visible and have...... workers with a better fit with the goals of the organization increases workers’ compliance with local policy goals, but only when these diverge from national ones! Increasing staff capacity and information provision have simpler effects in fostering more compliance with the national policy mandate among...... workers. Managers’ addressing adverse selection problems seems more effective than coping with moral hazard. The combination of local politicians’ influence on the formation of local policy goals and managers’ influence in getting workers to comply with those indicates a very important role for policy...

  7. Database system for management of health physics and industrial hygiene records

    International Nuclear Information System (INIS)

Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring, J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection

  8. Modified Delphi study to determine optimal data elements for inclusion in an emergency management database system

    Directory of Open Access Journals (Sweden)

    A. Jabar

    2012-03-01

Conclusion: The use of a modified Expert Delphi study achieved consensus on aspects of hospital institutional capacity that can be translated into practical recommendations for implementation by the local emergency management database system. Additionally, areas of non-consensus have been identified where further work is required. The purpose of this study is to contribute to the development of this new system.

  9. Demonstration of SLUMIS: a clinical database and management information system for a multi organ transplant program.

    OpenAIRE

    Kurtz, M.; Bennett, T.; Garvin, P.; Manuel, F.; Williams, M.; Langreder, S.

    1991-01-01

    Because of the rapid evolution of the heart, heart/lung, liver, kidney and kidney/pancreas transplant programs at our institution, and because of a lack of an existing comprehensive database, we were required to develop a computerized management information system capable of supporting both clinical and research requirements of a multifaceted transplant program. SLUMIS (ST. LOUIS UNIVERSITY MULTI-ORGAN INFORMATION SYSTEM) was developed for the following reasons: 1) to comply with the reportin...

  10. Experience of MAPS in monitoring of personnel movement with on-line database management system

    International Nuclear Information System (INIS)

    Rajendran, T.S.; Anand, S.D.

    1992-01-01

    As a part of physical protection system, access control system has been installed in Madras Atomic Power Station(MAPS) to monitor and regulate the movement of persons within MAPS. The present system in its original form was meant only for security monitoring. A PC based database management system was added to this to computerize the availability of work force for actual work. (author). 2 annexures

  11. Development of database management system for monitoring of radiation workers for actinides

    International Nuclear Information System (INIS)

    Kalyane, G.N.; Mishra, L.; Nadar, M.Y.; Singh, I.S.; Rao, D.D.

    2012-01-01

    Annually, around 500 radiation workers are monitored for estimation of lung activities and internal dose due to Pu/Am and U from various divisions of Bhabha Atomic Research Centre (Trombay) and from the PREFRE and A3F facilities (Tarapur) in the lung counting laboratory located at Bhabha Atomic Research Centre hospital, under routine and special monitoring programmes. A 20 cm diameter phoswich and an array of HPGe detectors were used for this purpose. In case of positive contamination, workers are followed up and monitored using both detection systems in different geometries. Managing this large volume of data is difficult, and an easily retrievable database system containing all the relevant data of the monitored radiation workers was therefore developed. Materials and methods: The database management system comprises three main modules integrated together: 1) an Apache server installed on a Windows (XP) platform (Apache version 2.2.17); 2) the MySQL database management system (MySQL version 5.5.8); 3) the PHP (PHP: Hypertext Preprocessor) programming language (PHP version 5.3.5). All three modules work together seamlessly as a single software program. Front-end user interaction is through a user-friendly and interactive local web page for which an internet connection is not required. This front page has hyperlinks to many other pages, which provide different utilities for the user. The user has to log in using a username and password. Results and Conclusions: The database management system is used for entering, updating and managing the lung monitoring data of radiation workers. The program has the following utilities: bio-data entry of new subjects, editing of bio-data of old subjects (only one subject at a time), entry of counting data from that day's lung monitoring, retrieval of old records based on a number of parameters and filters such as date of counting, employee number, division, counts fulfilling a given criterion, etc., and calculation of MEQ CWT (Muscle Equivalent Chest Wall Thickness), energy
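The record above describes a lung-monitoring database built on an Apache/MySQL/PHP stack with filtered retrieval of worker records. As a language-neutral illustration of the kind of retrieval involved, the following Python sketch uses an in-memory SQLite database; all table and column names here are invented for illustration, not the actual BARC schema.

```python
import sqlite3

# Hypothetical sketch of the record keeping described above (table and
# column names are assumptions, not the actual system's schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE workers (emp_no TEXT PRIMARY KEY, name TEXT, division TEXT);
CREATE TABLE lung_counts (
    emp_no TEXT REFERENCES workers(emp_no),
    count_date TEXT,           -- ISO date of the monitoring session
    net_counts REAL
);
""")
conn.execute("INSERT INTO workers VALUES ('E001', 'Worker A', 'PREFRE')")
conn.execute("INSERT INTO workers VALUES ('E002', 'Worker B', 'A3F')")
conn.execute("INSERT INTO lung_counts VALUES ('E001', '2012-01-15', 120.5)")
conn.execute("INSERT INTO lung_counts VALUES ('E002', '2012-02-20', 80.0)")

def counts_for_division(conn, division):
    """Retrieve monitoring records filtered by division, one of the
    filters mentioned in the abstract (date, employee number, ...)."""
    return conn.execute(
        """SELECT w.emp_no, c.count_date, c.net_counts
           FROM workers w JOIN lung_counts c ON w.emp_no = c.emp_no
           WHERE w.division = ?""", (division,)).fetchall()

records = counts_for_division(conn, "PREFRE")
```

The parameterized query stands in for the record-retrieval pages the abstract describes; the real system layers the same idea behind a PHP web front end.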

  12. SNPpy--database management for SNP data from genome wide association studies.

    Directory of Open Access Journals (Sweden)

    Faheem Mitha

    Full Text Available BACKGROUND: We describe SNPpy, a hybrid script database system using the Python SQLAlchemy library coupled with the PostgreSQL database to manage genotype data from Genome-Wide Association Studies (GWAS). This system makes it possible to merge study data with HapMap data and to merge across studies for meta-analyses, including data filtering based on the values of phenotype and Single-Nucleotide Polymorphism (SNP) data. SNPpy and its dependencies are open source software. RESULTS: The current version of SNPpy offers utility functions to import genotype and annotation data from two commercial platforms. We use these to import data from two GWAS studies and the HapMap Project. We then export these individual datasets to standard data format files that can be imported into statistical software for downstream analyses. CONCLUSIONS: By leveraging the power of relational databases, SNPpy offers integrated management and manipulation of genotype and phenotype data from GWAS studies. The analysis of these studies requires merging across GWAS datasets as well as patient and marker selection. To this end, SNPpy enables the user to filter the data and output the results as standardized GWAS file formats. It performs low-level and flexible data validation, including validation of patient data. SNPpy is a practical and extensible solution for investigators who seek to deploy central management of their GWAS data.
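SNPpy itself couples SQLAlchemy with PostgreSQL; the sketch below illustrates only the general pattern of relational filtering and export that the abstract describes, using the standard-library sqlite3 module as a stand-in. The table layout, column names, and output format are assumptions for illustration, not SNPpy's actual schema.

```python
import csv
import io
import sqlite3

# Toy patient/genotype tables; real GWAS data would be imported from
# platform-specific files rather than inserted inline like this.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patients (pid TEXT PRIMARY KEY, phenotype INTEGER);
CREATE TABLE genotypes (pid TEXT, snp TEXT, allele1 TEXT, allele2 TEXT);
""")
conn.executemany("INSERT INTO patients VALUES (?, ?)",
                 [("P1", 1), ("P2", 0), ("P3", 1)])
conn.executemany("INSERT INTO genotypes VALUES (?, ?, ?, ?)",
                 [("P1", "rs123", "A", "G"),
                  ("P2", "rs123", "A", "A"),
                  ("P3", "rs123", "G", "G")])

def export_cases(conn):
    """Filter genotypes down to affected patients (phenotype = 1) and
    write them out as delimited text, mimicking export to a standard
    file format for downstream statistical analysis."""
    rows = conn.execute(
        """SELECT g.pid, g.snp, g.allele1, g.allele2
           FROM genotypes g JOIN patients p ON g.pid = p.pid
           WHERE p.phenotype = 1 ORDER BY g.pid""").fetchall()
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()

out = export_cases(conn)
```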

  13. Metabolonote: A wiki-based database for managing hierarchical metadata of metabolome analyses

    Directory of Open Access Journals (Sweden)

    Takeshi eAra

    2015-04-01

    Full Text Available Metabolomics—technology for comprehensive detection of small molecules in an organism—lags behind the other omics in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called TogoMD, with an ID system that is required for unique access to each level of the tree-structured metadata, such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data, but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata records for analyzed data obtained from 35 biological species are currently published. Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.
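The abstract's central idea is tree-structured metadata addressed by hierarchical IDs (study, sample, method, data analysis). The Python sketch below illustrates that idea in miniature; the dot-separated ID syntax and field names are assumptions for illustration and do not reproduce the actual TogoMD format.

```python
# Hypothetical tree-structured metadata keyed by hierarchical IDs, in
# the spirit of the TogoMD levels named above (study purpose, sample,
# analytical method, data analysis).
metadata = {
    "SE1":          {"level": "study",    "purpose": "metabolite survey"},
    "SE1.S1":       {"level": "sample",   "species": "Arabidopsis thaliana"},
    "SE1.S1.M1":    {"level": "method",   "instrument": "LC-MS"},
    "SE1.S1.M1.D1": {"level": "analysis", "software": "in-house pipeline"},
}

def ancestors(node_id):
    """Walk up the ID hierarchy, so a data-analysis record can be
    traced back through its method and sample to the study metadata."""
    parts = node_id.split(".")
    return [".".join(parts[:i]) for i in range(len(parts), 0, -1)]

chain = ancestors("SE1.S1.M1.D1")
```

Because each level has its own unique ID, readers and other systems can link to exactly the granularity of metadata they need, which is the "hub of related data resources" role the abstract describes.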

  14. [Community health in primary health care teams: a management objective].

    Science.gov (United States)

    Nebot Adell, Carme; Pasarin Rua, Maribel; Canela Soler, Jaume; Sala Alvarez, Clara; Escosa Farga, Alex

    2016-12-01

    To describe the process of development of community health in a territory where the Primary Health Care board decided to include it in its roadmap as a strategic line. Evaluative research using qualitative techniques, including a SWOT analysis on community health. Two-step study. Primary care teams (PCT) of the Catalan Health Institute in Barcelona city. The 24 PCTs belonging to the Muntanya-Dreta Primary Care Service in Barcelona city, with 904 professionals serving 557,430 inhabitants. Application of qualitative methodology using SWOT analysis in two steps. Step 1: setting up a core group consisting of local PCT professionals; collecting the community projects across the territory; SWOT analysis. Step 2: from the needs identified in the previous phase, a plan was developed, including a set of training activities in community health: basic, advanced, and a workshop to exchange experiences among the PCTs. A total of 80 team professionals received specific training in the 4 workshops held, one of them at an advanced level. Two workshops were held to exchange experiences with 165 representatives from the local teams, with 22 PCTs presenting their practices. In 2013, 6 out of 24 PCTs had had a community diagnosis performed. Community health has achieved a good level of development in some areas, but this is not the general situation in the health care system. Its progression depends on the management support available, the local community dynamics, and the scope of Primary Health Care. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  15. The Net Enabled Waste Management Database in the context of an indicator of sustainable development for radioactive waste management

    International Nuclear Information System (INIS)

    Csullog, G.W.; Selling, H.; Holmes, R.; Benitez, J.C.

    2002-01-01

    The IAEA was selected by the UN to be the lead agency for the development and implementation of indicators of sustainable development for radioactive waste management (ISD-RW). Starting in late 1999, the UN initiated a program to consolidate a large number of indicators into a smaller set and advised the IAEA that a single ISD-RW was needed. In September 2001, a single indicator was developed by the IAEA and subsequently revised in February 2002. In parallel with its work on the ISD-RW, the IAEA developed and implemented the Net Enabled Waste Management Database (NEWMDB). The NEWMDB is an international database to collect, compile and disseminate information about nationally-based radioactive waste management programmes and waste inventories. The first data collection cycle with the NEWMDB (July 2001 to March 2002) demonstrated that much of the information needed to calculate the ISD-RW could be collected by the IAEA for its international database. However, the first data collection cycle indicated that capacity building, in the area of identifying waste classification schemes used in countries, is required. (author)

  16. Estimating US federal wildland fire managers' preferences toward competing strategic suppression objectives

    Science.gov (United States)

    David E. Calkin; Tyron Venn; Matthew Wibbenmeyer; Matthew P. Thompson

    2012-01-01

    Wildfire management involves significant complexity and uncertainty, requiring simultaneous consideration of multiple, non-commensurate objectives. This paper investigates the tradeoffs fire managers are willing to make among these objectives using a choice experiment methodology that provides three key advancements relative to previous stated-preference studies...

  17. Advanced technology for the reuse of learning objects in a course-management system

    NARCIS (Netherlands)

    Strijker, A.; Collis, Betty

    2005-01-01

    The creation, labelling, use, and re-use of learning objects is an important area of development involving learning technology. In the higher education context, instructors typically use a course management system (CMS) to organize and manage their own learning objects. The needs and practices of

  18. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 5

    International Nuclear Information System (INIS)

    2003-05-01

    The document consists of two parts: Overview and Country Waste Profile Reports for Reporting Year 2000. The first section contains overview reports that provide assessments of the achievements and shortcomings of the Net Enabled Waste Management Database (NEWMDB) during the first two data collection cycles (July 2001 to March 2002 and July 2002 to February 2003). The second part of the report includes a summary and compilation of waste management data submitted by Agency Member States in both the first and second data collection cycles

  19. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 9, May 2008

    International Nuclear Information System (INIS)

    2008-05-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2007

  20. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 8, August 2007

    International Nuclear Information System (INIS)

    2007-08-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2006

  1. Multi-objective congestion management by modified augmented ε-constraint method

    International Nuclear Information System (INIS)

    Esmaili, Masoud; Shayanfar, Heidar Ali; Amjady, Nima

    2011-01-01

    Congestion management is a vital part of power system operations in recent deregulated electricity markets. However, after relieving congestion, power systems may be operated with a reduced voltage or transient stability margin because of hitting security limits or increasing the contribution of risky participants. Therefore, power system stability margins should be considered within the congestion management framework. The multi-objective congestion management provides not only more security but also more flexibility than single-objective methods. In this paper, a multi-objective congestion management framework is presented while simultaneously optimizing the competing objective functions of congestion management cost, voltage security, and dynamic security. The proposed multi-objective framework, called modified augmented ε-constraint method, is based on the augmented ε-constraint technique hybridized by the weighting method. The proposed framework generates candidate solutions for the multi-objective problem including only efficient Pareto surface enhancing the competitiveness and economic effectiveness of the power market. Besides, the relative importance of the objective functions is explicitly modeled in the proposed framework. Results of testing the proposed multi-objective congestion management method on the New-England test system are presented and compared with those of the previous single objective and multi-objective techniques in detail. These comparisons confirm the efficiency of the developed method. (author)
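The core mechanism behind the paper's framework is the ε-constraint idea: optimize one objective while bounding the others by ε, then sweep ε to trace the Pareto surface (the augmented variant adds a small slack term to avoid weakly efficient points). The toy Python sketch below illustrates the plain ε-constraint step on invented (cost, insecurity) candidate points; it is not the paper's actual formulation.

```python
def eps_constraint(points, eps):
    """Among candidate operating points (f1, f2), minimize f1 subject
    to f2 <= eps; returns None when the bound is infeasible."""
    feasible = [p for p in points if p[1] <= eps]
    return min(feasible, key=lambda p: p[0]) if feasible else None

# Invented (congestion cost, security index) pairs for a toy problem;
# (3.0, 3.0) is dominated by (2.0, 2.0) and should never be selected.
points = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]

# Sweeping the bound eps over the f2 range traces the Pareto front.
front = [eps_constraint(points, e) for e in (1.0, 2.0, 4.0)]
```

Note that the dominated point is filtered out automatically by the sweep, which is the sense in which ε-constraint methods generate only efficient solutions.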

  2. Integration of the ATLAS tag database with data management and analysis components

    International Nuclear Information System (INIS)

    Cranshaw, J; Malon, D; Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C

    2008-01-01

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted
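The idea of an event-level tag database making first-level cuts can be sketched schematically: query event metadata in a relational table, then collect only the files (grouped into DDM datasets) that contain passing events. The tag variables and file names below are invented for illustration, not the ATLAS schema.

```python
import sqlite3

# Toy event-level metadata table; each row points at the file that
# holds the event, mirroring how the Tag Database points to events
# in files while DDM manages files grouped into datasets.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tags (
    run INTEGER, event INTEGER, n_muons INTEGER, missing_et REAL,
    file TEXT
)""")
conn.executemany("INSERT INTO tags VALUES (?, ?, ?, ?, ?)", [
    (1, 101, 2, 55.0, "data_f1.root"),
    (1, 102, 0, 10.0, "data_f1.root"),
    (2, 201, 1, 80.0, "data_f2.root"),
    (2, 202, 0,  5.0, "data_f3.root"),
])

def select_input_files(conn, min_muons, min_met):
    """First-level cut on the tag table: return only files containing
    at least one passing event, shrinking the sample handed to the
    distributed-analysis jobs."""
    rows = conn.execute(
        """SELECT DISTINCT file FROM tags
           WHERE n_muons >= ? AND missing_et >= ? ORDER BY file""",
        (min_muons, min_met)).fetchall()
    return [r[0] for r in rows]

files = select_input_files(conn, 1, 50.0)
```

In this sketch the cut reduces three input files to two, which is the sample-size reduction the abstract credits with shortening analysis time.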

  3. Development of a database system for the management of non-treated radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso, E-mail: ajp@cdtn.br, E-mail: cbf@cdtn.br, E-mail: vc@cdtn.br, E-mail: pos@cdtn.br, E-mail: seless@cdtn.br, E-mail: hauczmj@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. Information about the waste is registered, and the data must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of the solid and liquid radioactive wastes stored in CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and generate reports to facilitate the retrieval of the waste history and localization and of the contents of the waste packages. The database will also be useful for grouping wastes with similar characteristics to identify the best type of treatment. The routine problems that may occur due to changes of operators will thus be avoided. (author)

  4. Integration of the ATLAS tag database with data management and analysis components

    Energy Technology Data Exchange (ETDEWEB)

    Cranshaw, J; Malon, D [Argonne National Laboratory, Argonne, IL 60439 (United States); Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C [Department of Physics and Astronomy, University of Glasgow, Glasgow, G12 8QQ, Scotland (United Kingdom)], E-mail: c.nicholson@physics.gla.ac.uk

    2008-07-15

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted.

  5. Development of a database system for the management of non-treated radioactive waste

    International Nuclear Information System (INIS)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso

    2017-01-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. Information about the waste is registered, and the data must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of the solid and liquid radioactive wastes stored in CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure the control of the waste records and inventory. In addition, it will be possible to carry out queries and generate reports to facilitate the retrieval of the waste history and localization and of the contents of the waste packages. The database will also be useful for grouping wastes with similar characteristics to identify the best type of treatment. The routine problems that may occur due to changes of operators will thus be avoided. (author)

  6. Public perceptions of planning objectives for regional level management of wild reindeer in Norway

    OpenAIRE

    Kaltenborn, Bjørn Petter; Hongslo, Eirin; Gundersen, Vegard; Andersen, Oddgeir

    2015-01-01

    We examined community perceptions of preferred objectives for wild reindeer management in Southern Norway as the former population-based model is being replaced with an area-based, multi-level regional management model spanning large mountain regions. Communally oriented objectives are favoured over economic benefits to landowners. Environmental attitudes discriminate on many of the issues and can be useful factors in sorting out levels of support for proposed management actions and compromis...

  7. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  8. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of the teaching file in the radiology department, the authors set up a database management system for the teaching file using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window Vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8 mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) running under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in bitmap format (8-bit, 540 X 390 ∼ 545 X 414, 256 gray scale) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply at the reading viewbox. The image quality on the computer monitor was lower than that of the original film on the viewbox, but the characteristics of each lesion could generally be differentiated. Easy retrieval of data was possible for the purposes of a teaching file system. Without high-cost appliances, we could build the image database system for the teaching file using a personal computer by a relatively inexpensive method

  9. 33 CFR 96.230 - What objectives must a safety management system meet?

    Science.gov (United States)

    2010-07-01

    ... management system meet? 96.230 Section 96.230 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE SAFE OPERATION OF VESSELS AND SAFETY MANAGEMENT SYSTEMS Company and Vessel Safety Management Systems § 96.230 What objectives must a safety...

  10. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    Science.gov (United States)

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise a broad awareness of issues, identify knowledge gaps and opportunities, and to promote collaboration. Here we describe a novel approach to the application of internet and spatial analysis tools that provide an overview of publically available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publically available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. Copyright © 2015. 

  11. REALIZING BUSINESS PROCESS MANAGEMENT BY HELP OF A PROCESS MAPPING DATABASE TOOL

    CERN Document Server

    Vergili, Ceren

    2016-01-01

    In a typical business sector, processes are the building blocks of achievement, and a considerable share of these are business processes. This means that business sectors need a management discipline. Business Process Management (BPM) is a discipline that combines modelling, automation, execution, control, measurement, and optimization of processes by considering enterprise goals, spanning systems, employees, customers, and partners. CERN's EN – HE – HM section wishes to apply the BPM discipline appropriately to improve the technical, administrative and managerial actions necessary to supply appropriate CERN industrial transport, handling and lifting equipment and to maintain it. For this reason, a Process Mapping Database Tool was created to develop a common understanding of how the section members can visualize their processes, agree on quality standards, and decide how to improve. It provides management support by establishing Process Charts...

  12. Integrated Storage and Management of Vector and Raster Data Based on Oracle Database

    Directory of Open Access Journals (Sweden)

    WU Zheng

    2017-05-01

    Full Text Available At present, there are many problems in the storage and management of multi-source heterogeneous spatial data, such as difficult data transfer, the lack of unified storage, and low efficiency. By combining relational database and spatial data engine technology, an approach for the integrated storage and management of vector and raster data on the basis of Oracle is proposed in this paper. This approach first establishes an integrated storage model for vector and raster data and optimizes the retrieval mechanism, then designs a framework for seamless data transfer, and finally realizes the unified storage and efficient management of multi-source heterogeneous data. A comparison of experimental results with the leading comparable software ArcSDE shows that the proposed approach has higher data transfer performance and better query retrieval efficiency.

  13. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    Science.gov (United States)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owner. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients including ARM and RSM (Remote Sensing Mast) update their related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
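The frame-tree idea the record describes can be sketched compactly: each frame stores its transform relative to its parent, and the transform between any two frames is composed by walking to the shared root. The Python sketch below simplifies transforms to 2-D translations; the frame names and offsets are invented for illustration and are not the MSL flight software.

```python
# Each entry maps a frame name to (parent, offset-in-parent). The root
# ("site") has no parent. Translation-only transforms keep the sketch
# short; the real FM handles full coordinate transforms.
frames = {
    "site":   (None, (0.0, 0.0)),
    "rover":  ("site", (3.0, 1.0)),    # rover pose in the site frame
    "rsm":    ("rover", (0.5, 0.0)),   # mast head relative to the rover
    "target": ("site", (10.0, 4.0)),
}

def to_root(name):
    """Accumulate the translation from a frame up to the tree root."""
    x, y = 0.0, 0.0
    while name is not None:
        parent, (dx, dy) = frames[name]
        x, y = x + dx, y + dy
        name = parent
    return x, y

def transform(src, dst):
    """Offset of dst expressed in src, composed via the root frame;
    this is the 'query transforms between any two frames' service."""
    sx, sy = to_root(src)
    dx, dy = to_root(dst)
    return (dx - sx, dy - sy)

offset = transform("rsm", "target")  # e.g. for pointing a mast camera
```

Because clients query the centralized tree instead of hand-computing coordinate entries, a command like camera pointing needs only two frame names, which is the operational simplification the abstract highlights.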

  14. Radioactive waste management profiles. Compilation from the Waste Management Database. No. 3

    International Nuclear Information System (INIS)

    2000-07-01

    In 1989, the International Atomic Energy Agency began development of the Waste Management Data Base (WMDB) primarily to establish a mechanism for the collection, archival and dissemination of information about radioactive waste management in Member States. The current report is a summary and compilation of waste management data collected from Member States from February 1998 to December 1999 in response to the Agency's 1997/98 WMDB Questionnaire. Member States were asked to report waste accumulations up to the end of 1996 and to predict waste accumulations up to the end of 2014

  15. Ultra-Structure database design methodology for managing systems biology data and analyses

    Directory of Open Access Journals (Sweden)

    Hemminger Bradley M

    2009-08-01

    Full Text Available Abstract Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research.
Conclusion We find
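The rules-as-data idea central to Ultra-Structure can be sketched in miniature: behavior lives in table rows ("ruleforms"), and one small generic engine interprets them, so end users change behavior by editing rows rather than code. The rule rows and actions below are invented for illustration; a real Ultra-Structure system stores ruleforms in relational tables.

```python
# Illustrative sketch: domain behavior is data, not code. The engine below
# never hard-codes domain logic; it only looks up what the rule rows say.
rules = [
    # (entity_kind, attribute, action) -- a toy "ruleform"
    ("spectrum", "raw_file",   "archive"),
    ("spectrum", "peptide_id", "map_to_genome"),
    ("genome",   "annotation", "index"),
]

def process(entity_kind, attribute):
    # Generic engine: scan the ruleform for a matching row and return its
    # action. Adding capability means inserting a row, not editing code.
    for kind, attr, action in rules:
        if kind == entity_kind and attr == attribute:
            return action
    return "ignore"

print(process("spectrum", "peptide_id"))  # map_to_genome
print(process("proteome", "mass"))        # ignore (no rule yet; add a row)
```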

  16. Database foundation for the configuration management of the CERN accelerator controls systems

    International Nuclear Information System (INIS)

    Zaharieva, Z.; Martin Marquez, M.; Peryt, M.

    2012-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years in order to become nowadays the basis for the Configuration Management of the Control System for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the Control System. The configuration items are quite heterogeneous, depicting different areas of the Control System - ranging from 3000 Front-End Computers, 75000 software devices allowing remote control of the accelerators, to valid states of the Accelerators Timing System. The article will describe the different areas of the CCDB, their inter-dependencies and the challenges to establish the data model for such a diverse configuration management database, serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering of change management processes as well as providing status accounting and audits. This required the development and implementation of a combination of tailored processes and tools. The Controls System is a data-driven one - the data stored in the CCDB is extracted and propagated to the controls hardware in order to configure it remotely. Therefore a special attention is placed on data security and data integrity as an incorrectly configured item can have a direct impact on the operation of the accelerators. (authors)

  17. Data management in the TJ-II multi-layer database

    International Nuclear Information System (INIS)

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.; Fabregas, J.A.; Herrera, R.

    2000-01-01

    The handling of TJ-II experimental data is performed by means of several software modules. These modules provide the resources for data capture, data storage and management, data access as well as general-purpose data visualisation. Here we describe the module related to data storage and management. We begin by introducing the categories in which data can be classified. Then, we describe the TJ-II data flow through the several file systems involved, before discussing the architecture of the TJ-II database. We review the concept of the 'discharge file' and identify the drawbacks that would result from a direct application of this idea to the TJ-II data. In order to overcome these drawbacks, we propose alternatives based on our concepts of signal family, user work-group and data priority. Finally, we present a model for signal storage. This model is in accordance with the database architecture and provides a proper framework for managing the TJ-II experimental data. In the model, the information is organised in layers and is distributed according to the generality of the information, from the common fields of all signals (first layer), passing through the specific records of signal families (second layer) and reaching the particular information of individual signals (third layer)
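The three-layer lookup described for TJ-II signal storage can be sketched as follows: attributes common to all signals live in the first layer, family-wide attributes in the second, and per-signal attributes in the third, with a lookup falling through from most specific to most general. This is an illustrative miniature only; the field names, families, and values are invented.

```python
# Layer 1: fields common to all signals.
layer1_common = {"byte_order": "big-endian", "compression": "delta"}

# Layer 2: records specific to each signal family.
layer2_family = {
    "magnetics": {"units": "T", "sampling_rate_hz": 10000},
    "bolometry": {"units": "W", "sampling_rate_hz": 1000},
}

# Layer 3: information particular to individual signals (may override layer 2).
layer3_signal = {
    "MID01": {"family": "magnetics", "sampling_rate_hz": 20000},
}

def attribute(signal, key):
    # Fall through the layers: per-signal, then family, then common.
    sig = layer3_signal[signal]
    if key in sig:
        return sig[key]
    fam = layer2_family[sig["family"]]
    if key in fam:
        return fam[key]
    return layer1_common[key]

print(attribute("MID01", "sampling_rate_hz"))  # 20000 (per-signal override)
print(attribute("MID01", "units"))             # T (from the family layer)
print(attribute("MID01", "byte_order"))        # big-endian (common layer)
```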

  18. Database security - how can developers and DBAs do it together and what can other Service Managers learn from it

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk gives an overview of security threats affecting databases, preventive measures that we are taking at CERN and best practices in the industry. The presentation will describe how generic the threats are and how can other service managers profit from the database experience to protect other systems.

  19. A Survey of Health Management User Objectives Related to Diagnostic and Prognostic Metrics

    Science.gov (United States)

    Wheeler, Kevin R.; Kurtoglu, Tolga; Poll, Scott D.

    2010-01-01

    One of the most prominent technical challenges to effective deployment of health management systems is the vast difference in user objectives with respect to engineering development. In this paper, a detailed survey on the objectives of different users of health management systems is presented. These user objectives are then mapped to the metrics typically encountered in the development and testing of two main systems health management functions: diagnosis and prognosis. Using this mapping, the gaps between user goals and the metrics associated with diagnostics and prognostics are identified and presented with a collection of lessons learned from previous studies that include both industrial and military aerospace applications.

  20. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    Science.gov (United States)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices on how to achieve compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures as well as guidelines for the use of auditing features.

  1. Software configuration management plan for the TWRS controlled baseline database system [TCBD

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and Financial Control Integration and Reporting. The interfaces between these organizations follow the normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCMP. LHMC, TWRS TCBD users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the users' requests for system requirements (new/existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO.
Once this occurs, the TCBD will be completed and

  2. Automation of a Beckman liquid scintillation counter for data capture and data-base management

    International Nuclear Information System (INIS)

    Neil, W.; Irwin, T.J.; Yang, J.J.

    1988-01-01

    A software package for the automation of a Beckman LS9000 liquid scintillation counter is presented. The package provides effective on-line data capture (with a Perkin Elmer 3230 32-bit minicomputer), data-base management, audit trail and archiving facilities. Key features of the package are rapid and flexible data entry, background subtraction, half-life correction, ability to queue several sample sets pending scintillation counting, and formatted report generation. A brief discussion is given on the development of customized data processing programs. (author)

  3. Applying Stochastic Metaheuristics to the Problem of Data Management in a Multi-Tenant Database Cluster

    Directory of Open Access Journals (Sweden)

    E. A. Boytsov

    2014-01-01

    Full Text Available A multi-tenant database cluster is a concept of a data-storage subsystem for cloud applications with the multi-tenant architecture. The cluster is a set of relational database servers with a single entry point, combined into one unit by a cluster controller. The system is intended for applications developed according to the Software as a Service (SaaS) paradigm and places tenants on database servers so as to provide their isolation, data backup and the most effective usage of the available computational power. One of the most important problems in such a system is the effective distribution of data across servers, which affects the load on individual cluster nodes and fault tolerance. This paper considers a data-management approach based on a load-balancing quality measure function, which is used during the initial placement of new tenants and during placement optimization steps. Standard metaheuristic optimization schemes such as simulated annealing and tabu search are used to find a better tenant placement.
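The combination of a load-balancing quality measure with simulated annealing can be sketched roughly like this. It is an illustrative toy, not the paper's implementation: the tenant loads, the variance-based cost function, and the annealing schedule are all invented for the example.

```python
import math
import random

# Invented example data: per-tenant load and two database servers.
tenant_load = {"t1": 5, "t2": 3, "t3": 8, "t4": 2, "t5": 6}
servers = ["s1", "s2"]

def cost(placement):
    # Load-balancing quality measure: variance of per-server total load
    # (lower is better-balanced).
    loads = {s: 0 for s in servers}
    for t, s in placement.items():
        loads[s] += tenant_load[t]
    mean = sum(loads.values()) / len(servers)
    return sum((l - mean) ** 2 for l in loads.values())

def anneal(placement, steps=2000, temp=10.0, cooling=0.995):
    # Standard simulated annealing: move one tenant at a time, always accept
    # improvements, accept regressions with probability exp(-delta/temp).
    random.seed(0)
    current = dict(placement)
    for _ in range(steps):
        cand = dict(current)
        cand[random.choice(list(cand))] = random.choice(servers)
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = cand
        temp *= cooling
    return current

start = {t: "s1" for t in tenant_load}  # worst case: everything on one server
best = anneal(start)
print(cost(start), "->", cost(best))
```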

  4. An outline of compilation and processing of metadata in agricultural database management system WebAgris

    Directory of Open Access Journals (Sweden)

    Tomaž Bartol

    2008-01-01

    Full Text Available The paper tackles the international information system for agriculture, Agris, and the local processing of metadata with the database management software WebAgris. Operations are coordinated by the central repository at the FAO in Rome. Based on international standards and a unified methodology, national and regional centers collect and process local publications and then send the records to the central unit, which makes the data globally accessible on the web. The earlier DOS-run application was based on the package Agrin CDS/ISIS. The current package WebAgris runs on web servers. Database construction tools and instructions are accessible on the FAO web pages. Data are entered through unified input masks. International consistency is achieved through authority control of certain elements, such as author or corporate affiliation. Central authority control is made available for subject headings, such as descriptors and subject categories. Subject indexing is based on the controlled multilingual thesaurus Agrovoc, also freely available on the Internet. This glossary has become an important tool in the area of international agricultural ontology. The data are exported to the central unit in XML format. The global database is currently accessible to everyone. This international cooperative information network combines elements of a document repository, electronic publishing, open archiving and full-text open access. Links with Google Scholar provide a good possibility for international promotion of publishing.

  5. Maintenance services of nuclear power plant using 3D as-built database management system

    International Nuclear Information System (INIS)

    Okumura, Kazutaka; Nakashima, Kazuhito; Mori, Norimasa; Azuma, Takashi

    2017-01-01

    The Three dimensional As-built DAtabase Management System (NUSEC-ADAMS) is a system whose goal is to provide economical, speedy and accurate maintenance services for nuclear power plants by using 3D point group data. The system makes it possible to understand the plant situation remotely, without field measurements. 3D point group data are collected before and after plant equipment installations and stored in the database after being converted to data viewable on the web. The data can therefore be shared on a company's internal network and linked with system diagrams, equipment specifications, and additional information (e.g. maintenance records) by registering key information between the 3D point group data and the equipment data. This reduces the workload of pre-job field surveys and improves work efficiency. In case of a problem at a plant, if the 3D as-built data can be viewed on the network, it is possible to obtain accurate information and understand the cause remotely at the beginning of the problem. Collecting 3D point group data and updating the database continuously keeps as-built information up to date, which improves the accuracy of off-site studies and allows the plant situation to be grasped in a timely manner. As a result, the workload of nuclear power plant maintenance services can be reduced and their quality improved. (author)

  6. Managing expectations: assessment of chemistry databases generated by automated extraction of chemical structures from patents.

    Science.gov (United States)

    Senger, Stefan; Bartek, Luca; Papadatos, George; Gaulton, Anna

    2015-12-01

    First public disclosure of new chemical entities often takes place in patents, which makes them an important source of information. However, with an ever increasing number of patent applications, manual processing and curation on such a large scale becomes even more challenging. An alternative approach better suited for this large corpus of documents is the automated extraction of chemical structures. A number of patent chemistry databases generated by using the latter approach are now available but little is known that can help to manage expectations when using them. This study aims to address this by comparing two such freely available sources, SureChEMBL and IBM SIIP (IBM Strategic Intellectual Property Insight Platform), with manually curated commercial databases. When looking at the percentage of chemical structures successfully extracted from a set of patents, using SciFinder as our reference, 59 and 51 % were also found in our comparison in SureChEMBL and IBM SIIP, respectively. When performing this comparison with compounds as starting point, i.e. establishing if for a list of compounds the databases provide the links between chemical structures and patents they appear in, we obtained similar results. SureChEMBL and IBM SIIP found 62 and 59 %, respectively, of the compound-patent pairs obtained from Reaxys. In our comparison of automatically generated vs. manually curated patent chemistry databases, the former successfully provided approximately 60 % of links between chemical structure and patents. It needs to be stressed that only a very limited number of patents and compound-patent pairs were used for our comparison. Nevertheless, our results will hopefully help to manage expectations of users of patent chemistry databases of this type and provide a useful framework for more studies like ours as well as guide future developments of the workflows used for the automated extraction of chemical structures from patents. The challenges we have encountered

  7. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  8. Adaptive multi-objective Optimization scheme for cognitive radio resource management

    KAUST Repository

    Alqerm, Ismail; Shihada, Basem

    2014-01-01

    configuration by exploiting optimization and machine learning techniques. In this paper, we propose an Adaptive Multi-objective Optimization Scheme (AMOS) for cognitive radio resource management to improve spectrum operation and network performance

  9. Utility of collecting metadata to manage a large scale conditions database in ATLAS

    International Nuclear Information System (INIS)

    Gallas, E J; Albrand, S; Borodin, M; Formica, A

    2014-01-01

    The ATLAS Conditions Database, based on the LCG Conditions Database infrastructure, contains a wide variety of information needed in online data taking and offline analysis. The total volume of ATLAS conditions data is in the multi-Terabyte range. Internally, the active data is divided into 65 separate schemas (each with hundreds of underlying tables) according to overall data taking type, detector subsystem, and whether the data is used offline or strictly online. While each schema has a common infrastructure, each schema's data is entirely independent of other schemas, except at the highest level, where sets of conditions from each subsystem are tagged globally for ATLAS event data reconstruction and reprocessing. The partitioned nature of the conditions infrastructure works well for most purposes, but metadata about each schema is problematic to collect in global tools from such a system because it is only accessible via LCG tools schema by schema. This makes it difficult to get an overview of all schemas, collect interesting and useful descriptive and structural metadata for the overall system, and connect it with other ATLAS systems. This type of global information is needed for time critical data preparation tasks for data processing and has become more critical as the system has grown in size and diversity. Therefore, a new system has been developed to collect metadata for the management of the ATLAS Conditions Database. The structure and implementation of this metadata repository will be described. In addition, we will report its usage since its inception during LHC Run 1, how it has been exploited in the process of conditions data evolution during LS1 (the current LHC long shutdown) in preparation for Run 2, and long term plans to incorporate more of its information into future ATLAS Conditions Database tools and the overall ATLAS information infrastructure.

  10. Migration check tool: automatic plan verification following treatment management systems upgrade and database migration.

    Science.gov (United States)

    Hadley, Scott W; White, Dale; Chen, Xiaoping; Moran, Jean M; Keranen, Wayne M

    2013-11-04

    Software upgrades of the treatment management system (TMS) sometimes require that all data be migrated from one version of the database to another. It is necessary to verify that the data are correctly migrated to assure patient safety. It is impossible to verify by hand the thousands of parameters that go into each patient's radiation therapy treatment plan. Repeating pretreatment QA is costly, time-consuming, and may be inadequate in detecting errors that are introduced during the migration. In this work we investigate the use of an automatic Plan Comparison Tool to verify that plan data have been correctly migrated to a new version of a TMS database from an older version. We developed software to query and compare treatment plans between different versions of the TMS. The same plan in the two TMS systems are translated into an XML schema. A plan comparison module takes the two XML schemas as input and reports any differences in parameters between the two versions of the same plan by applying a schema mapping. A console application is used to query the database to obtain a list of active or in-preparation plans to be tested. It then runs in batch mode to compare all the plans, and a report of success or failure of the comparison is saved for review. This software tool was used as part of software upgrade and database migration from Varian's Aria 8.9 to Aria 11 TMS. Parameters were compared for 358 treatment plans in 89 minutes. This direct comparison of all plan parameters in the migrated TMS against the previous TMS surpasses current QA methods that relied on repeating pretreatment QA measurements or labor-intensive and fallible hand comparisons.
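The core of the migration check, rendering the same plan from both database versions as XML and reporting any differing parameters, can be sketched roughly like this. It is a simplified illustration: the plan fields shown are invented examples rather than the actual TMS schema, and the real tool additionally applies a schema mapping between database versions.

```python
import xml.etree.ElementTree as ET

def leaves(elem, path=""):
    # Flatten an XML tree into (path, text) pairs for its leaf elements.
    p = f"{path}/{elem.tag}"
    kids = list(elem)
    if not kids:
        yield p, (elem.text or "").strip()
    for child in kids:
        yield from leaves(child, p)

def compare(xml_a, xml_b):
    # Report every leaf parameter whose value differs between the two
    # renderings of the same plan (or is present in only one of them).
    a = dict(leaves(ET.fromstring(xml_a)))
    b = dict(leaves(ET.fromstring(xml_b)))
    diffs = []
    for key in sorted(set(a) | set(b)):
        if a.get(key) != b.get(key):
            diffs.append((key, a.get(key), b.get(key)))
    return diffs

# Invented example: one beam parameter changed during migration.
old = "<plan><beam><mu>120.5</mu><gantry>180</gantry></beam></plan>"
new = "<plan><beam><mu>120.5</mu><gantry>181</gantry></beam></plan>"
print(compare(old, new))  # [('/plan/beam/gantry', '180', '181')]
```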

  11. Information flow in the DAMA Project beyond database managers: Information flow managers

    Energy Technology Data Exchange (ETDEWEB)

    Russell, L. [Argonne National Lab., IL (United States); Wolfson, O.; Yu, C. [Illinois Univ., Chicago, IL (United States)

    1996-03-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, the sharing of point-of-sale information, is being considered in the Demand Activated Manufacturing Project of the American Textile Partnership. A scenario is examined in which 100,000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26,000 suppliers through the use of bill-of-materials explosions at four levels of detail. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced to keep estimates of demand as current as possible.

  12. An Object-Oriented View of Backend Databases in a Mobile Environment for Navy and Marine Corps Applications

    Science.gov (United States)

    2006-09-01

    Each of these layers will be described in more detail, including the relevant technologies (Java, PDA, Hibernate, and PostgreSQL) used to implement them. [The remainder of this excerpt is residue from a layer diagram: an application logic layer in which a Hibernate object-relational mapper connects a Java servlet front end to the PostgreSQL RDBMS.]

  13. Radiation risks management during implementation of activities at the 'Ukryttia' object

    International Nuclear Information System (INIS)

    Batij, V.G.; Rubezhanskij, Yu.I.; Rud'ko, V.M.; Stoyanov, A.I.

    2002-01-01

    A methodology for the assessment and analysis of radiological risks arising in the course of activities aimed at 'Ukryttia' Object operation and conversion was developed, and a general scheme of risk management was prepared. Priority measures to be implemented in order to create an effective system of radiological risk management at the 'Ukryttia' Object were proposed.

  14. BIM Guidelines Inform Facilities Management Databases: A Case Study over Time

    Directory of Open Access Journals (Sweden)

    Karen Kensek

    2015-08-01

    Full Text Available A building information model (BIM) contains data that can be accessed and exported for other uses during the lifetime of the building, especially for facilities management (FM) and operations. Working under the guidance of well-designed BIM guidelines to ensure completeness and compatibility with FM software, architects and contractors can deliver an information-rich data model that is valuable to the client. Large owners such as universities often provide these detailed guidelines and deliverable requirements to their building teams. Investigation of the University of Southern California (USC) Facilities Management Service's (FMS) website showed a detailed plan including standards, file names, parameter lists, and other requirements for BIM data, specifically designated for facilities management use, as deliverables on new construction projects. Three critical details were also unearthed in the reading of these documents: Revit was the default BIM software; COBie was adapted to help meet facilities management goals; and EcoDomus provided a display of the collected data viewed through Navisworks. Published accounts about the Cinema Arts Complex developed with and under these guidelines reported positive results. Further examination of new projects underway reveals the rapidly changing relational database landscape evident in the new USC “Project Record Revit Requirement Execution Plan (PRxP)”.

  15. A Relational Database Model for Managing Accelerator Control System Software at Jefferson Lab

    International Nuclear Information System (INIS)

    Sally Schaffner; Theodore Larrieu

    2001-01-01

    The operations software group at the Thomas Jefferson National Accelerator Facility faces a number of challenges common to facilities which manage a large body of software developed in-house. Developers include members of the software group, operators, hardware engineers and accelerator physicists. One management problem has been ensuring that all software has an identified owner who is still working at the lab. In some cases, locating source code for ''orphaned'' software has also proven to be difficult. Other challenges include ensuring that working versions of all operational software are available, testing changes to operational software without impacting operations, upgrading infrastructure software (OS, compilers, interpreters, commercial packages, share/freeware, etc.), ensuring that appropriate documentation is available and up to date, underutilization of code reuse, input/output file management, and determining what other software will break if a software package is upgraded. This paper will describe a relational database model which has been developed to track this type of information and make it available to managers and developers. The model also provides a foundation for developing productivity-enhancing tools for automated building, versioning, and installation of software. This work was supported by the U.S. DOE contract No. DE-AC05-84ER40150
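A toy version of the kind of ownership-tracking relational model described above might look like the sketch below, where packages carry a named owner and ''orphaned'' software (whose owner has left) falls out of a single query. The schema, tables, and data are invented for illustration, and SQLite stands in for whatever RDBMS the lab used.

```python
import sqlite3

# Minimal illustrative schema: people, the packages they own, and versions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE person  (id INTEGER PRIMARY KEY, name TEXT, active INTEGER);
    CREATE TABLE package (id INTEGER PRIMARY KEY, name TEXT,
                          owner_id INTEGER REFERENCES person(id));
    CREATE TABLE version (package_id INTEGER REFERENCES package(id),
                          tag TEXT, operational INTEGER);
""")
db.executemany("INSERT INTO person VALUES (?,?,?)",
               [(1, "alice", 1), (2, "bob", 0)])          # bob has left the lab
db.executemany("INSERT INTO package VALUES (?,?,?)",
               [(1, "ioc_tools", 1), (2, "legacy_gui", 2)])
db.executemany("INSERT INTO version VALUES (?,?,?)",
               [(1, "2.1", 1), (2, "0.9", 1)])

# Find operational software whose owner is no longer active.
orphans = db.execute("""
    SELECT package.name FROM package
    JOIN person ON person.id = package.owner_id
    WHERE person.active = 0
""").fetchall()
print(orphans)  # [('legacy_gui',)]
```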

  16. Defining ecological and economical hydropower operations: a framework for managing dam releases to meet multiple conflicting objectives

    Science.gov (United States)

    Irwin, Elise R.

    2014-01-01

    Hydroelectric dams are a flexible source of power, provide flood control, and contribute to the economic growth of local communities through real estate and recreation. Yet the impoundment of rivers can alter and fragment miles of critical riverine habitat that must also serve competing needs such as downstream consumptive water use, fish and wildlife population viability, and other forms of recreation. Multiple conflicting interests can compromise progressive management, especially with recognized uncertainties about whether management actions will fulfill the objectives of policy makers, resource managers and/or facility owners. Decision-analytic tools were used in a stakeholder-driven process to develop and implement a template for evaluating and predicting the effects of water resource management in multiple-use systems, in the context provided by R.L. Harris Dam on the Tallapoosa River, Alabama, USA. The approach provided a transparent and structured framework for decision-making and incorporated both existing and new data to meet multiple management objectives. Success of the template has been evaluated by the stakeholder governing body in an adaptive resource management framework since 2005 and is ongoing. Consequences of managing discharge at the dam were evaluated annually relative to stakeholder satisfaction, allowing adjustment of both management scenarios and objectives. This template can be applied in attempts to resolve the conflict inherent in many dam-regulated systems where management decisions affect the diverse values of stakeholders.

  17. Application of multiple objective models to water resources planning and management

    International Nuclear Information System (INIS)

    North, R.M.

    1993-01-01

    Over the past 30 years, we have seen the birth and growth of multiple objective analysis from an idea without tools to one with useful applications. Models have been developed and applications have been researched to address the multiple purposes and objectives inherent in the development and management of water resources. A practical approach to multiple objective modelling incorporates macroeconomic-based policies and expectations in order to optimize the results from both engineering (structural) and management (non-structural) alternatives, while taking into account the economic and environmental trade-offs. (author). 27 refs, 4 figs, 3 tabs

  18. [Management by objectives: an experience by transfusion and immunology service in Rabat].

    Science.gov (United States)

    Essakalli, M; Atouf, O; Ouadghiri, S; Bouayad, A; Drissi, A; Sbain, K; Sakri, L; Benseffaj, N; Brick, C

    2013-09-01

    The management by objectives method has become widely used in health management. In this context, the blood transfusion and haemovigilance service was chosen for a pilot study by the head department of the Ibn Sina Hospital in Rabat. The study was conducted from 2009 to 2011 in four steps. The first consisted in preparing human resources (information and training), identifying the strengths and weaknesses of the service, and identifying and classifying the service's users. The second step was the elaboration of the terms of the contract, which helped to determine two main strategic objectives: to strengthen the activities of the service and to move towards the "status of reference". Each strategic objective was then broken down into operational objectives, then into actions and the means required for the implementation of each action. The third step was the implementation of each action (service, head department) so as to comply with the terms of the contract and meet the deadlines. The last step, based on assessment committees, was the evaluation process. This evaluation was performed using monitoring indicators and showed that management by objectives enabled the service to reach the "clinical governance level", to optimize its human and financial resources and to reach the level of "national laboratory of reference in histocompatibility". The scope of this paper is to describe the four steps of this pilot study and to explain the usefulness of the management by objectives method in health management. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  19. IMPROVEMENT OF THE F-PERCEPTORY APPROACH THROUGH MANAGEMENT OF FUZZY COMPLEX GEOGRAPHIC OBJECTS

    Directory of Open Access Journals (Sweden)

    B. Khalfi

    2015-08-01

    Full Text Available In the real world, data is imperfect in various ways, exhibiting imprecision, vagueness, uncertainty, ambiguity and inconsistency. For geographic data, the fuzzy aspect is mainly manifested in the time, space and function of objects and is due to a lack of precision. Researchers in the domain therefore emphasize the importance of modeling data structures in GIS, but also their lack of adaptation to fuzzy data. The F-Perceptory approach manages the modeling of imperfect geographic information with UML. This management is essential to remain faithful to reality and to better guide the user in decision-making. However, this approach does not manage fuzzy complex geographic objects, i.e. multiple objects whose components have similar or different geographic shapes. In this paper, we therefore propose to improve the F-Perceptory approach by handling the modeling of fuzzy complex geographic objects. In a second step, we propose its transformation to UML modeling.

  20. THE DECISION-MAKING SUBSYSTEM OF THE MANAGEMENT BY OBJECTIVES WITH FRAMEWORK PROCEDURE

    Directory of Open Access Journals (Sweden)

    Munteanu Stolojanu Victoria-Ileana

    2011-12-01

    Full Text Available Management by objectives with framework procedure is a management system based on strict targets set for the executors, who participate directly in establishing them, and on the close correlation of rewards and sanctions with the achievement of the predetermined objectives. The main goal of the article is to present objectives as management elements and as the nervous system of managerial actions. The presence of functional deviationism in organizations and the need to reconsider strategic planning lead, under the economic and financial crisis, to the need to elaborate a tool which should help the manager find an optimum combination between limited resources and the achievement of objectives at minimum cost.

  1. New perspectives in toxicological information management, and the role of ISSTOX databases in assessing chemical mutagenicity and carcinogenicity.

    Science.gov (United States)

    Benigni, Romualdo; Battistelli, Chiara Laura; Bossa, Cecilia; Tcheremenskaia, Olga; Crettaz, Pierre

    2013-07-01

    Currently, the public has access to a variety of databases containing mutagenicity and carcinogenicity data. These resources are crucial for the toxicologists and regulators involved in the risk assessment of chemicals, which necessitates access to all the relevant literature, and the capability to search across toxicity databases using both biological and chemical criteria. Towards the larger goal of screening chemicals for a wide range of toxicity end points of potential interest, publicly available resources across a large spectrum of biological and chemical data space must be effectively harnessed with current and evolving information technologies (i.e. systematised, integrated and mined), if long-term screening and prediction objectives are to be achieved. A key to rapid progress in the field of chemical toxicity databases is that of combining information technology with the chemical structure as identifier of the molecules. This permits an enormous range of operations (e.g. retrieving chemicals or chemical classes, describing the content of databases, finding similar chemicals, crossing biological and chemical interrogations, etc.) that other more classical databases cannot allow. This article describes the progress in the technology of toxicity databases, including the concepts of Chemical Relational Database and Toxicological Standardized Controlled Vocabularies (Ontology). Then it describes the ISSTOX cluster of toxicological databases at the Istituto Superiore di Sanità. It consists of freely available databases characterised by the use of modern information technologies and by curation of the quality of the biological data. Finally, this article provides examples of analyses and results made possible by ISSTOX.

  2. Developing genomic knowledge bases and databases to support clinical management: current perspectives.

    Science.gov (United States)

    Huser, Vojtech; Sincan, Murat; Cimino, James J

    2014-01-01

    Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward.

  3. PlantDB – a versatile database for managing plant research

    Directory of Open Access Journals (Sweden)

    Gruissem Wilhelm

    2008-01-01

    Full Text Available Abstract Background Research in plant science laboratories often involves usage of many different species, cultivars, ecotypes, mutants, alleles or transgenic lines. This creates a great challenge to keep track of the identity of experimental plants and stored samples or seeds. Results Here, we describe PlantDB – a Microsoft® Office Access database – with a user-friendly front-end for managing information relevant for experimental plants. PlantDB can hold information about plants of different species, cultivars or genetic composition. Introduction of a concise identifier system allows easy generation of pedigree trees. In addition, all information about any experimental plant – from growth conditions and dates over extracted samples such as RNA to files containing images of the plants – can be linked unequivocally. Conclusion We have been using PlantDB for several years in our laboratory and found that it greatly facilitates access to relevant information.

  4. GaMeTix – new software for management of MCQ databases

    Directory of Open Access Journals (Sweden)

    Dimitrolos Krajčí

    2015-12-01

    Full Text Available We have developed new software named GaMeTix for the management of large collections of examination questions written in a variety of MCQ (Multiple Choice Question) formats. This application provides a wide range of functionality, including collecting and editing sets of questions, generating electronic versions of examination tests, printing examination paper sheets and exporting sets of questions as plain text documents for hard-copy archiving or transfer to specific electronic testing applications. The content of the database is searchable according to several criteria, using sets of filters that characterize each question. Collections of MC questions can be divided or merged according to the results of the filtering function. Examination questions can be complemented with pictures or diagrams in .jpg format. GaMeTix is a portable, freeware application that runs on MS Windows operating systems.

  5. Covariant Evolutionary Event Analysis for Base Interaction Prediction Using a Relational Database Management System for RNA.

    Science.gov (United States)

    Xu, Weijia; Ozer, Stuart; Gutell, Robin R

    2009-01-01

    With an increasingly large number of properly aligned sequences, comparative sequence analysis can accurately identify not only common structures formed by standard base pairing but also new types of structural elements and constraints. However, traditional methods are too computationally expensive to perform well on large-scale alignments and are less effective with sequences from diverse phylogenetic classifications. We propose a new approach that utilizes coevolution rates among pairs of nucleotide positions, using the phylogenetic and evolutionary relationships of the organisms of the aligned sequences. With a novel data schema to manage relevant information within a relational database, our method, implemented with Microsoft SQL Server 2005, showed 90% sensitivity in identifying base pair interactions among 16S ribosomal RNA sequences from Bacteria, at a scale 40 times larger, and with 50% better sensitivity, than a previous study. The results also indicated covariation signals for a few sets of cross-strand base stacking pairs in secondary structure helices, and other subtle constraints in the RNA structure.
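
The covariation analysis described above rests on scoring how strongly two alignment columns vary together. As a minimal illustration (not the paper's actual SQL implementation, which also weights positions by phylogenetic relationships), a mutual-information score over paired column symbols can be sketched in Python:

```python
from collections import Counter
from math import log2

def mutual_information(col_i, col_j):
    """Mutual information between two alignment columns,
    a basic covariation score for candidate base pairs."""
    n = len(col_i)
    pi = Counter(col_i)
    pj = Counter(col_j)
    pij = Counter(zip(col_i, col_j))
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
    return mi

# Columns that covary perfectly (e.g. compensating A-U vs G-C changes)
# score higher than columns that vary independently.
paired_i = list("AAGGAAGG")
paired_j = list("UUCCUUCC")
print(round(mutual_information(paired_i, paired_j), 3))  # 1.0
```

In the actual system such a score is computed inside the database over every pair of columns, which is what makes the relational schema and indexing strategy matter at scale.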

  6. Performance Evaluation of an Object Management Policy Approach for P2P Networks

    Directory of Open Access Journals (Sweden)

    Dario Vieira

    2012-01-01

    Full Text Available The increasing popularity of network-based multimedia applications poses many challenges for content providers to supply efficient and scalable services. Peer-to-peer (P2P) systems have been shown to be a promising approach to provide large-scale video services over the Internet since, by nature, these systems show high scalability and robustness. In this paper, we propose and analyze an object management policy approach for video web caches in a P2P context, taking advantage of object metadata, for example, video popularity, and of object encoding techniques, for example, scalable video coding (SVC). We carry out trace-driven simulations so as to evaluate the performance of our approach and compare it against traditional object management policy approaches. In addition, we also study the impact of churn on our approach and on other object management policies that implement different caching strategies. A YouTube video collection recording the logs of over 1.6 million videos was used in our experimental studies. The experimental results showed that our proposed approach can improve the performance of the cache substantially. Moreover, we found that neither simply enlarging peers' storage capacity nor a zero-replication strategy is an effective action for improving the performance of an object management policy.
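
The policy described leans on object metadata such as video popularity. A much-simplified sketch of a popularity-driven cache follows; the class and field names are invented for illustration, and the paper's actual policy also exploits SVC layer structure and byte-based capacity rather than an object count:

```python
class PopularityCache:
    """Simplified object-management policy: when the cache is full,
    evict the object with the lowest recorded popularity (e.g. a view
    count taken from video metadata)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}        # object id -> content
        self.popularity = {}   # object id -> popularity score

    def insert(self, obj_id, content, popularity):
        if obj_id not in self.store and len(self.store) >= self.capacity:
            # Evict the least popular resident object.
            victim = min(self.store, key=lambda k: self.popularity[k])
            del self.store[victim]
            del self.popularity[victim]
        self.store[obj_id] = content
        self.popularity[obj_id] = popularity

    def get(self, obj_id):
        return self.store.get(obj_id)

cache = PopularityCache(capacity=2)
cache.insert("v1", "chunk-1", popularity=900)
cache.insert("v2", "chunk-2", popularity=15)
cache.insert("v3", "chunk-3", popularity=400)   # evicts v2, the least popular
print(sorted(cache.store))  # ['v1', 'v3']
```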

  7. SeedStor: A Germplasm Information Management System and Public Database.

    Science.gov (United States)

    Horler, R S P; Turner, A S; Fretter, P; Ambrose, M

    2018-01-01

    SeedStor (https://www.seedstor.ac.uk) acts as the publicly available database for the seed collections held by the Germplasm Resources Unit (GRU) based at the John Innes Centre, Norwich, UK. The GRU is a national capability supported by the Biotechnology and Biological Sciences Research Council (BBSRC). The GRU curates germplasm collections of a range of temperate cereal, legume and Brassica crops and their associated wild relatives, as well as precise genetic stocks, near-isogenic lines and mapping populations. With >35,000 accessions, the GRU forms part of the UK's plant conservation contribution to the Multilateral System (MLS) of the International Treaty for Plant Genetic Resources for Food and Agriculture (ITPGRFA) for wheat, barley, oat and pea. SeedStor is a fully searchable system that allows our various collections to be browsed species by species through to complicated multipart phenotype criteria-driven queries. The results from these searches can be downloaded for later analysis or used to order germplasm via our shopping cart. The user community for SeedStor is the plant science research community, plant breeders, specialist growers, hobby farmers and amateur gardeners, and educationalists. Furthermore, SeedStor is much more than a database; it has been developed to act internally as a Germplasm Information Management System that allows team members to track and process germplasm requests, determine regeneration priorities, handle cost recovery and Material Transfer Agreement paperwork, manage the Seed Store holdings and easily report on a wide range of the aforementioned tasks. © The Author(s) 2017. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  8. 'Isotopo' a database application for facile analysis and management of mass isotopomer data.

    Science.gov (United States)

    Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eylert, Eva; Eisenreich, Wolfgang; Dandekar, Thomas

    2014-01-01

    The composition of stable-isotope labelled isotopologues/isotopomers in metabolic products can be measured by mass spectrometry and supports the analysis of pathways and fluxes. As a prerequisite, the original mass spectra have to be processed, managed and stored to rapidly calculate, analyse and compare isotopomer enrichments to study, for instance, bacterial metabolism in infection. For such applications, we provide here the database application 'Isotopo'. This software package includes (i) a database to store and process isotopomer data, (ii) a parser to upload and translate different data formats for such data and (iii) an improved application to process and convert signal intensities from mass spectra of (13)C-labelled metabolites such as tert-butyldimethylsilyl derivatives of amino acids. Relative mass intensities and isotopomer distributions are calculated applying a partial least squares method with iterative refinement for high-precision data. The data output includes formats such as graphs for overall enrichments in amino acids. The package is user-friendly for easy and robust data management of multiple experiments. The 'Isotopo' software is available at the following web link (section Download): http://spp1316.uni-wuerzburg.de/bioinformatics/isotopo/. The package contains three additional files: a software executable setup (installer), one data set file (discussed in this article) and one Excel file (which can be used to convert data from Excel to '.iso' format). The 'Isotopo' software is compatible only with the Microsoft Windows operating system. © The Author(s) 2014. Published by Oxford University Press.

  9. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    Science.gov (United States)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program's or project's risk management process. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's or project's size and budget. By providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  10. Multi-objective game-theory models for conflict analysis in reservoir watershed management.

    Science.gov (United States)

    Lee, Chih-Sheng

    2012-05-01

    This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assisting decision making. Game theory is used as an alternative tool for analyzing the strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). A geographic information system is used to concisely illustrate and calculate the areas of various land use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and which can be interpreted easily by decision makers. For comparison, the decision-making process using a conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Configuration management issues and objectives for a real-time research flight test support facility

    Science.gov (United States)

    Yergensen, Stephen; Rhea, Donald C.

    1988-01-01

    Presented are some of the critical issues and objectives pertaining to configuration management for the NASA Western Aeronautical Test Range (WATR) of Ames Research Center. The primary mission of the WATR is to provide a capability for the conduct of aeronautical research flight tests through real-time processing and display, tracking, and communications systems. In providing this capability, the WATR must maintain and enforce a configuration management plan which is independent of, but complementary to, the configuration management systems of the various research flight test projects. A primary WATR objective is the continued development of a generic research flight test project support capability, wherein the reliability of WATR support provided to all project users is a constant priority. Therefore, the processing of configuration change requests for specific research flight test project requirements must be evaluated within a perspective that maintains this primary objective.

  12. Design and management of database using microsoft access program: application in neurointerventional unit

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Seon Moon; Jeong, Gyeong Un; Kim, Tae Il; Cha, Ji Hyeon; Pyun, Hae Wook; Woo, Ryu Chang; Kim, Ho Sung; Suh, Dae Chul [University of Ulsan, College of Medicine, Seoul (Korea, Republic of)

    2005-10-15

    Complex clinical information in a cerebral angiointervention unit requires effective management and statistical analysis for the classification of diagnoses and interventions, including follow-up data from interventional treatment. We present an application of the Microsoft Access program for the management of patient data in a cerebral angiointervention unit, suggesting practical methods for recording and analyzing the patient data. Since January 2002, patient information from cerebral angiointervention has been managed by a database covering over 4000 patients. We designed a program which incorporates six items: Table, Query, Form, Report, Page and Macro. Patient data, follow-up data and information regarding diagnosis and intervention were established in the Form section, related by serial number, with each connected to an independent Table. Problems in running the program were corrected by establishing Entity Relationship (ER) diagrams of the Tables to define the relationships between them. Convenient Queries, Forms and Reports were created to display the expected information from selected Tables. The relational program which incorporated the six items conveniently provided the number of cases per year, the incidence of disease, lesion sites, and case analysis based on interventional treatment. We were able to follow the patients after the interventional procedures by creating queries and reports. Lists of diseases and patient files were easily identified each time by the Macro function. In addition, the product names, sizes and characteristics of the materials used were indexed and easily available. The Microsoft Access program is effective in the management of patient data in a cerebral angiointervention unit. Accumulation of large amounts of complex data handled by multiple users may require client/server solutions such as Microsoft SQL Server.
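
The described Access design, with patient, intervention and follow-up records linked by serial number, can be approximated in any relational engine. A hypothetical sketch using Python's built-in sqlite3 (all table and column names are invented for illustration), including a query in the spirit of "number of cases per year":

```python
import sqlite3

# Hypothetical relational schema mirroring the described design:
# patient and intervention records linked by serial number.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient (
    serial_no   INTEGER PRIMARY KEY,
    name        TEXT,
    diagnosis   TEXT
);
CREATE TABLE intervention (
    id             INTEGER PRIMARY KEY,
    serial_no      INTEGER REFERENCES patient(serial_no),
    year           INTEGER,
    procedure_name TEXT
);
""")
con.executemany("INSERT INTO patient VALUES (?, ?, ?)",
                [(1, "Kim", "aneurysm"), (2, "Lee", "AVM")])
con.executemany("INSERT INTO intervention VALUES (?, ?, ?, ?)",
                [(10, 1, 2002, "coiling"),
                 (11, 2, 2002, "embolization"),
                 (12, 1, 2003, "follow-up angiography")])

# Case counts per year, analogous to the Report described above.
for year, n in con.execute(
        "SELECT year, COUNT(*) FROM intervention GROUP BY year ORDER BY year"):
    print(year, n)
```

This also illustrates the abstract's closing point: once multiple users need concurrent access, the same schema ports naturally to a client/server engine such as SQL Server.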

  13. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    International Nuclear Information System (INIS)

    Shao, Weber; Kupelian, Patrick A; Wang, Jason; Low, Daniel A; Ruan, Dan

    2014-01-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
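
Storing contours as database geometries lets the engine compute secondary quantities such as enclosed area (in PostGIS, via its built-in geometry functions). As a plain-Python illustration of one such secondary calculation, the shoelace formula gives the area of a closed contour slice:

```python
def contour_area(points):
    """Area enclosed by a closed planar contour given as (x, y)
    vertices, via the shoelace formula; analogous to what a spatial
    database computes with a built-in area function."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the contour
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A hypothetical 10 mm x 10 mm square contour slice:
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(contour_area(square))  # 100.0
```

The point of the paradigm is that such computations run inside the database over indexed geometries, so the application never has to retrieve and parse the raw contour records.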

  14. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    Science.gov (United States)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
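
The selection process described combines hard constraints with desirability parameters stored in picture records. A toy, purely illustrative sketch of such rule-based filtering and ranking follows (the actual expert system was written in Prolog, and these parameter names are invented):

```python
# Hypothetical candidate picture records with desirability parameters,
# as might be built automatically in the picture database.
candidates = [
    {"id": "OPNAV-1", "nav_value": 0.9, "sun_angle_ok": True,  "conflicts": 0},
    {"id": "OPNAV-2", "nav_value": 0.7, "sun_angle_ok": False, "conflicts": 0},
    {"id": "OPNAV-3", "nav_value": 0.6, "sun_angle_ok": True,  "conflicts": 2},
]

def feasible(pic):
    # Hard constraints, in the spirit of the expert system's rules:
    # acceptable lighting and no scheduling conflicts.
    return pic["sun_angle_ok"] and pic["conflicts"] == 0

def score(pic):
    # Priority: navigational value of the picture.
    return pic["nav_value"]

selected = max(filter(feasible, candidates), key=score)
print(selected["id"])  # OPNAV-1
```

A Prolog formulation expresses the same rules declaratively; the win of either approach is that planners adjust rules and priorities rather than hand-selecting pictures.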

  15. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    Science.gov (United States)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.

  16. The role of a multiagent organization in the management of nuclear plants operational objectives

    International Nuclear Information System (INIS)

    Papin, B.; Arnaud, G.; Fiche, Ch.; Dumas, M.

    1998-01-01

    The aim of the ESCRIME project, now reaching its conclusion, was to optimize the collaboration between plant operators and computerized systems for the management of plant safety and availability. The prerequisite for putting such a collaboration into effective application was to clarify the operational strategies in order to enable a dialog between men and systems based on a ''shared representation'' of the plant operation. Thus a new vision of this operation, focusing on the completion of the plant's fundamental objectives, has been defined. This so-called means-ends strategy enables the simultaneous management of these objectives and provides explicit rules (priority...) to solve potential conflicts (physical interaction, sharing of resources...) apt to occur. The direct implementation of such strategies by the operators, using a dedicated man/machine interface, was the object of the second phase of the ESCRIME project, already presented in LOEN in 1996. The third phase of the program, more prospective, aimed at finding an efficient way of sharing the management of plant objectives between the operator(s) and ''intelligent controllers'', preserving the role of the man in the decision loop and enabling him to maintain his awareness of the evolution of the plant. The multiagent organization has been identified as a promising way of sharing executive and decision tasks between human and computerized agents. This organization, based on explicit ''social'' rules managing the negotiation between autonomous agents, appeared well adapted to the management of conflicting objectives, as in the means-ends strategy. The main problem to solve was, in fact, to check whether humans could be brought under this social organization in such a way that they could communicate and negotiate efficiently with the computerized agents.
After a brief recall of the principles of the means-ends strategy for plant objectives management, this paper will detail the work made on

  17. Implementation of a framework for multi-species, multi-objective adaptive management in Delaware Bay

    Science.gov (United States)

    McGowan, Conor P.; Smith, David R.; Nichols, James D.; Lyons, James E.; Sweka, John A.; Kalasz, Kevin; Niles, Lawrence J.; Wong, Richard; Brust, Jeffrey; Davis, Michelle C.; Spear, Braddock

    2015-01-01

    Decision analytic approaches have been widely recommended as well suited to solving disputed and ecologically complex natural resource management problems with multiple objectives and high uncertainty. However, the difference between theory and practice is substantial, as there are very few actual resource management programs that represent formal applications of decision analysis. We applied the process of structured decision making to Atlantic horseshoe crab harvest decisions in the Delaware Bay region to develop a multispecies adaptive management (AM) plan, which is currently being implemented. Horseshoe crab harvest has been a controversial management issue since the late 1990s. A largely unregulated horseshoe crab harvest caused a decline in crab spawning abundance. That decline coincided with a major decline in migratory shorebird populations that consume horseshoe crab eggs on the sandy beaches of Delaware Bay during spring migration. Our approach incorporated multiple stakeholders, including fishery and shorebird conservation advocates, to account for diverse management objectives and varied opinions on ecosystem function. Through consensus building, we devised an objective statement and quantitative objective function to evaluate alternative crab harvest policies. We developed a set of competing ecological models accounting for the leading hypotheses on the interaction between shorebirds and horseshoe crabs. The models were initially weighted based on stakeholder confidence in these hypotheses, but weights will be adjusted based on monitoring and Bayesian model weight updating. These models were used together to predict the effects of management actions on the crab and shorebird populations. Finally, we used a dynamic optimization routine to identify the state dependent optimal harvest policy for horseshoe crabs, given the possible actions, the stated objectives and our competing hypotheses about system function. The AM plan was reviewed, accepted and
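
The model-weight updating mentioned above follows Bayes' rule: each competing model's weight is multiplied by the likelihood of the monitoring observation under that model, and the weights are then renormalized. A minimal sketch (the likelihood values here are purely illustrative, not from the plan):

```python
def update_model_weights(priors, likelihoods):
    """Bayesian model-weight update used in adaptive management:
    the posterior weight of each model is proportional to its prior
    weight times the likelihood of the monitoring observation."""
    posterior = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(posterior)
    return [w / total for w in posterior]

# Two competing hypotheses on the shorebird-crab interaction, initially
# weighted equally by stakeholder confidence; the (illustrative)
# observation fits the first model four times better.
weights = update_model_weights([0.5, 0.5], [0.8, 0.2])
print([round(w, 2) for w in weights])  # [0.8, 0.2]
```

Repeating this update as monitoring data accumulate is what shifts influence toward the hypotheses that predict the system best, which in turn changes the optimal harvest policy.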

  18. A case study of resources management planning with multiple objectives and projects

    Science.gov (United States)

    Peterson, David L.; Silsbee, David G.; Schmoldt, Daniel L.

    1994-09-01

    Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decision-making tools. We have previously described a multiple objective planning process for inventory and monitoring programs (Schmoldt and others 1994). To test the applicability of that process for the more general needs of resources management planning, we conducted an exercise at Olympic National Park (NP) in Washington State, USA. Eight projects were selected as typical of those considered in RMPs, and five members of the Olympic NP staff used the analytic hierarchy process (AHP) to prioritize the eight projects with respect to their implicit management objectives. By altering management priorities for the park, three scenarios were generated. All three contained some similarities in rankings for the eight projects, as well as some differences. Mathematical allocations of money and people differed among these scenarios and differed substantially from what the actual 1990 Olympic NP RMP contains. Combining subjective priority measures with budget dollars and personnel time into an objective function creates a subjective economic metric for comparing different RMPs. By applying this planning procedure, actual expenditures of budget and personnel in Olympic NP can agree more closely with the staff's management objectives for the park.
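
The AHP used in this exercise derives project priorities from a pairwise-comparison matrix, conventionally as its principal eigenvector. A small pure-Python sketch via power iteration (the example judgments are invented, not those of the Olympic NP staff):

```python
def ahp_priorities(matrix, iterations=50):
    """Priority vector of an AHP pairwise-comparison matrix,
    approximating the principal eigenvector by power iteration."""
    n = len(matrix)
    v = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        v = [x / total for x in w]  # normalize so priorities sum to 1
    return v

# Illustrative pairwise judgments for three projects: P1 is judged
# 3x as important as P2 and 6x as important as P3; P2 is 2x P3.
matrix = [[1,   3,   6],
          [1/3, 1,   2],
          [1/6, 1/2, 1]]
print([round(p, 2) for p in ahp_priorities(matrix)])  # [0.67, 0.22, 0.11]
```

With priorities in hand, the budget and personnel allocation step becomes an optimization of the objective function the article describes.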

  19. How the architect-engineer manages design objectives and restraints for optimizing sodium cooled reactors

    International Nuclear Information System (INIS)

    Roe, K.A.; Roe, K.K.

    1978-01-01

    The design objectives of low capital and operating costs and high reliability are best attained by carefully defining criteria early in the development stage. Throughout the design development, unusual attention to constructibility, reliability and availability requirements, and the early resolution of licensing issues by designated engineering specialists are some of the approaches used to minimize design restraints. Effective management of these design objectives and restraints can ensure that, on balance, LMFBR costs can be reduced, reliability increased, and maintenance made effective. (author)

  20. An Object-oriented Knowledge Link Model for General Knowledge Management

    OpenAIRE

    Xiao-hong, CHEN; Bang-chuan, LAI

    2005-01-01

    The knowledge link is the basis of knowledge sharing and an indispensable part of knowledge standardization management. In this paper, an object-oriented knowledge link model is proposed for general knowledge management, using object-oriented representation based on a system of knowledge levels. In the model, knowledge links are divided into general knowledge links and integrated knowledge links, with corresponding link properties and methods. In addition, the model's BNF syntax is described and designed.

  1. Flood risk management in Flanders: from flood risk objectives to appropriate measures through state assessment

    Directory of Open Access Journals (Sweden)

    Verbeke Sven

    2016-01-01

    Full Text Available In compliance with the EU Floods Directive to reduce flood risk, flood risk management objectives are indispensable for the delineation of necessary measures. In Flanders, flood risk management objectives are part of the environmental objectives which are judicially integrated by the Decree on Integrated Water Policy. Appropriate objectives were derived by supporting studies and extensive consultation at the local, regional and policy levels. Under a general flood risk objective, sub-objectives are formulated for different aspects: water management and safety, shipping, ecology, and water supply. By developing a risk matrix, it is possible to assess the current state of flood risk and to judge where action is needed to decrease the risk. Three different states of flood risk are distinguished: (a) acceptable risk, where no action is needed, (b) intermediate risk, where the risk should be reduced by cost-efficient actions, and (c) unacceptable risk, where action is necessary. For each particular aspect, the severity of the consequences of flooding is assessed by quantifiable indicators, such as economic risk, people at risk and ecological flood tolerance. The framework also allows evaluating the effects of the implemented measures and of autonomous developments such as climate change and land use change. This approach gives a quantifiable assessment of state, and enables a prioritization of flood risk measures for the reduction of flood risk in a cost-efficient and sustainable way.
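
The three-state risk matrix described reduces, at its simplest, to a classification over flooding probability and consequence severity. A sketch with purely illustrative thresholds (the Flemish framework scores each aspect with its own quantified indicators, such as economic risk and people at risk):

```python
def flood_risk_state(probability, consequence):
    """Classify flood risk into the three states described.
    The thresholds on the probability x consequence product
    are purely illustrative, not those of the Flemish framework."""
    risk = probability * consequence   # e.g. expected annual damage
    if risk < 0.1:
        return "acceptable"        # no action needed
    elif risk < 0.5:
        return "intermediate"      # reduce by cost-efficient actions
    else:
        return "unacceptable"      # action necessary

print(flood_risk_state(0.01, 5))   # acceptable
print(flood_risk_state(0.1, 3))    # intermediate
print(flood_risk_state(0.5, 2))    # unacceptable
```

Re-running such a classification under projected climate and land-use scenarios is what lets the framework evaluate autonomous developments as well as implemented measures.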

  2. Obesity Management in Europe: Current Status and Objectives for the Future

    Science.gov (United States)

    Uerlich, Magdalena F.; Yumuk, Volkan; Finer, Nick; Basdevant, Arnaud; Visscher, Tommy L.S.

    2016-01-01

    Objective This study aims at assessing the status of obesity management in the European region and identifying future goals and objectives of professionals working in the field of obesity. Methods Presidents of all 31 EASO-affiliated (EASO = European Association for the Study of Obesity) national associations for the study of obesity were asked to invite 5 obesity experts from their country to participate in a survey. A total of 74 obesity professionals out of 23 countries participated. Questions addressed the development of guidelines, the status of obesity management, and goals and objectives for the future in obesity management. Further, EASO's three vice-presidents participated in in-depth, semi-structured interviews, in which they were asked to provide their reflection on the survey data. Results Most countries define obesity as a clinical and chronic disease, but various differences in obesity management standards exist across Europe. Existing guidelines mainly focus on the acute treatment of obesity rather than on long-term approaches. Conclusion Multidisciplinary approaches for obesity management and the collaboration between general practitioners and hospitals as well as between professionals at the local level and networks of obesity management centers need to be improved across Europe. Good practices and evidence are available. PMID:27553443

  3. Obesity Management in Europe: Current Status and Objectives for the Future

    Directory of Open Access Journals (Sweden)

    Magdalena F. Uerlich

    2016-08-01

    Full Text Available Objective: This study aims at assessing the status of obesity management in the European region and identifying future goals and objectives of professionals working in the field of obesity. Methods: Presidents of all 31 EASO-affiliated (EASO = European Association for the Study of Obesity) national associations for the study of obesity were asked to invite 5 obesity experts from their country to participate in a survey. A total of 74 obesity professionals out of 23 countries participated. Questions addressed the development of guidelines, the status of obesity management, and goals and objectives for the future in obesity management. Further, EASO's three vice-presidents participated in in-depth, semi-structured interviews, in which they were asked to provide their reflection on the survey data. Results: Most countries define obesity as a clinical and chronic disease, but various differences in obesity management standards exist across Europe. Existing guidelines mainly focus on the acute treatment of obesity rather than on long-term approaches. Conclusion: Multidisciplinary approaches for obesity management and the collaboration between general practitioners and hospitals as well as between professionals at the local level and networks of obesity management centers need to be improved across Europe. Good practices and evidence are available.

  4. Software for radioactive wastes database

    International Nuclear Information System (INIS)

    Souza, Eros Viggiano de; Reis, Luiz Carlos Alves

    1996-01-01

    A radioactive waste database was implemented at CDTN in 1991. The objectives are to register and retrieve information about wastes generated and received at the Centre in order to improve waste management. Since 1995, the database has been under review and software has been developed to process information in a graphical environment (Windows 95 and Windows NT), minimising the possibility of errors and making user access more friendly. It was also envisaged to ease the editing of graphics and reports and to make the database available to other CNEN institutes and even to external organizations. (author)

  5. Utilization of a Clinical Trial Management System for the Whole Clinical Trial Process as an Integrated Database: System Development.

    Science.gov (United States)

    Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho; Kim, Tae Won

    2018-04-24

    Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality; it is therefore necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws.
Since December 2014, the CTMS has been successfully implemented and used by 881 internal and

  6. Review on management of horticultural plant germplasm resources and construction of related database

    Directory of Open Access Journals (Sweden)

    Pan Jingxian

    2017-02-01

    Full Text Available The advances of databases on horticultural germplasm resources from China and abroad were briefly reviewed and the key technologies were discussed in detail, especially the descriptors used in data collection for germplasm resources. The prospects and challenges of such databases were also discussed. It was evident that there is an urgent need to develop databases of horticultural germplasm resources that accommodate the increasing diversity of germplasm and provide more user-friendly, systematic access.

  7. Applying the archetype approach to the database of a biobank information management system.

    Science.gov (United States)

    Späth, Melanie Bettina; Grimson, Jane

    2011-03-01

    The purpose of this study is to investigate the feasibility of applying the openEHR archetype approach to modelling the data in the database of an existing proprietary biobank information management system. A biobank information management system stores the clinical/phenotypic data of the sample donor and sample related information. The clinical/phenotypic data is potentially sourced from the donor's electronic health record (EHR). The study evaluates the reuse of openEHR archetypes that have been developed for the creation of an interoperable EHR in the context of biobanking, and proposes a new set of archetypes specifically for biobanks. The ultimate goal of the research is the development of an interoperable electronic biomedical research record (eBMRR) to support biomedical knowledge discovery. The database of the prostate cancer biobank of the Irish Prostate Cancer Research Consortium (PCRC), which supports the identification of novel biomarkers for prostate cancer, was taken as the basis for the modelling effort. First the database schema of the biobank was analyzed and reorganized into archetype-friendly concepts. Then, archetype repositories were searched for matching archetypes. Some existing archetypes were reused without change, some were modified or specialized, and new archetypes were developed where needed. The fields of the biobank database schema were then mapped to the elements in the archetypes. Finally, the archetypes were arranged into templates specifically to meet the requirements of the PCRC biobank. A set of 47 archetypes was found to cover all the concepts used in the biobank. Of these, 29 (62%) were reused without change, 6 were modified and/or extended, 1 was specialized, and 11 were newly defined. These archetypes were arranged into 8 templates specifically required for this biobank. A number of issues were encountered in this research. Some arose from the immaturity of the archetype approach, such as immature modelling support tools

  8. Ecosystem-based management objectives for the North Sea: riding the forage fish rollercoaster

    DEFF Research Database (Denmark)

    Dickey-Collas, Mark; Engelhard, Georg H.; Rindorf, Anna

    2014-01-01

    The North Sea provides a useful model for considering forage fish (FF) within ecosystem-based management as it has a complex assemblage of FF species. This paper is designed to encourage further debate and dialogue between stakeholders about management objectives. Changing the management...... whether maintaining the reserves of prey biomass or a more integral approach of monitoring mortality rates across the trophic system is more robust under the ecosystem approach. In terms of trophic energy transfer, stability, and resilience of the ecosystem, FF should be considered as both a sized-based...... pool of biomass and as species components of the system by managers and modellers. Policy developers should not consider the knowledge base robust enough to embark on major projects of ecosystem engineering. Management plans appear able to maintain sustainable exploitation in the short term. Changes...

  9. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    Science.gov (United States)

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economic data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach. This management approach is one of the main problems in GISs for using map products of photogrammetric workstations. Also, by means of these integrated systems, providing structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, is possible at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  10. An Integrated Photogrammetric and Spatial Database Management System for Producing Fully Structured Data Using Aerial and Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Farshid Farnood Ahmadi

    2009-03-01

    Full Text Available 3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economic data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach. This management approach is one of the main problems in GISs for using map products of photogrammetric workstations. Also, by means of these integrated systems, providing structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, is possible at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  11. Harmonization Without Homogenization: The Virginia Community College System's Approach to Management By Objectives.

    Science.gov (United States)

    Puyear, Donald E.; And Others

    This panel report on the development of management by objectives (MBO) in the Virginia Community College System (VCCS) will be useful to any community college or community college system interested in changing to the MBO method of administration. Following a discussion of the history of centralized administration and funding which preceded the…

  12. An Object-Oriented Information Model for Policy-based Management of Distributed Applications

    NARCIS (Netherlands)

    Diaz, G.; Gay, V.C.J.; Horlait, E.; Hamza, M.H.

    2002-01-01

    This paper presents an object-oriented information model to support policy-based management for distributed multimedia applications. The information base contains application-level information about the users, the applications, and their profile. Our information model is described in detail and

  13. A Case Study of Resources Management Planning with Multiple Objectives and Projects

    Science.gov (United States)

    David L. Peterson; David G. Silsbee; Daniel L. Schmoldt

    1995-01-01

    Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decisionmaking tools. We have previously described a multiple objective planning...

  14. Hydro-environmental management of groundwater resources: A fuzzy-based multi-objective compromise approach

    Science.gov (United States)

    Alizadeh, Mohammad Reza; Nikoo, Mohammad Reza; Rakhshandehroo, Gholam Reza

    2017-08-01

    Sustainable management of water resources necessitates close attention to social, economic and environmental aspects such as water quality and quantity concerns and potential conflicts. This study presents a new fuzzy-based multi-objective compromise methodology to determine the socio-optimal and sustainable policies for hydro-environmental management of groundwater resources, which simultaneously considers the conflicts and negotiation of involved stakeholders, uncertainties in decision makers' preferences, existing uncertainties in the groundwater parameters and groundwater quality and quantity issues. The fuzzy multi-objective simulation-optimization model is developed based on qualitative and quantitative groundwater simulation model (MODFLOW and MT3D), multi-objective optimization model (NSGA-II), Monte Carlo analysis and Fuzzy Transformation Method (FTM). Best compromise solutions (best management policies) on trade-off curves are determined using four different Fuzzy Social Choice (FSC) methods. Finally, a unanimity fallback bargaining method is utilized to suggest the most preferred FSC method. Kavar-Maharloo aquifer system in Fars, Iran, as a typical multi-stakeholder multi-objective real-world problem is considered to verify the proposed methodology. Results showed an effective performance of the framework for determining the most sustainable allocation policy in groundwater resource management.
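
    The best-compromise selection step described in this abstract can be illustrated with one common social choice rule. The sketch below applies a plain Borda count to stakeholder rankings of Pareto-optimal policies; the stakeholder names and rankings are invented, and the paper's Fuzzy Social Choice methods additionally weight preferences by fuzzy membership, which is omitted here.

```python
def borda_choice(rankings):
    """Pick the best-compromise solution by Borda count.

    `rankings` maps each stakeholder to an ordered list of solution ids,
    most preferred first. Purely illustrative data; not the paper's
    fuzzy-weighted variant.
    """
    scores = {}
    for order in rankings.values():
        n = len(order)
        for rank, sol in enumerate(order):
            # top rank earns n-1 points, bottom rank earns 0
            scores[sol] = scores.get(sol, 0) + (n - 1 - rank)
    return max(scores, key=scores.get)
```

    With three hypothetical stakeholders ranking policies A, B and C, the policy ranked highly by the most stakeholders wins, which is the essence of a compromise rule.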

  15. Improving Educational Objectives of the Industrial and Management Systems Engineering Programme at Kuwait University

    Science.gov (United States)

    Aldowaisan, Tariq; Allahverdi, Ali

    2016-01-01

    This paper describes the process of developing programme educational objectives (PEOs) for the Industrial and Management Systems Engineering programme at Kuwait University, and the process of deployment of these PEOs. Input of the four constituents of the programme, faculty, students, alumni, and employers, is incorporated in the development and…

  16. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  17. Management by Objectives (MBO) Imperatives for Transforming Higher Education for a Globalised World

    Science.gov (United States)

    Ofojebe, Wenceslaus N.; Olibie, Eyiuche Ifeoma

    2014-01-01

    This study was conducted to determine the extent to which the stipulations and visions of Management by Objectives (MBO) would be integrated in higher education institutions in South Eastern Nigeria to enhance higher education transformation in a globalised world. Four research questions and a null hypothesis guided the study. A sample of 510…

  18. A Team Approach to Management by Objectives with Special Emphasis on Managerial Self-Evaluation.

    Science.gov (United States)

    Alvir, Howard P.

    This kit contains everything needed to explain, criticize and plan, simulate, and evaluate a management by objectives (MBO) program. The kit has been field tested in state agencies, schools, businesses, and volunteer organizations. Rather than present only the strengths of MBO, this program defines MBO, presents its strong points in discussing the…

  19. Herbicides as an alternative to prescribed burning for achieving wildlife management objectives

    Science.gov (United States)

    T. Bently Wigley; Karl V. Miller; David S. deCalesta; Mark W. Thomas

    2002-01-01

    Prescribed burning is used for many silvicultural and wildlife management objectives. However, the use of prescribed burning can be constrained due to difficulties in obtaining burning permits, concerns about liability, potential effects of scorch on growth and survival of crop trees, its sometimes ineffective results, limited burning days, and the costs of applying,...

  20. Multi objective optimization of line pack management of gas pipeline system

    International Nuclear Information System (INIS)

    Chebouba, A

    2015-01-01

    This paper addresses the Line Pack Management of the "GZ1 Hassi R'mell-Arzew" gas pipeline. For a gas pipeline system, the decision-making on the gas line pack management scenarios usually involves a delicate balance between minimization of the fuel consumption in the compression stations and maximization of the gas line pack. In order to select an acceptable line pack management scenario from these two angles for the "GZ1 Hassi R'mell-Arzew" gas pipeline, the idea of multi-objective decision-making has been introduced. The first step in developing this approach is the derivation of a numerical method to analyze the flow through the pipeline under transient isothermal conditions. In this paper, the NSGA-II solver of modeFRONTIER, coupled with a MATLAB program, was used for solving the multi-objective problem
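
    The fuel-versus-line-pack trade-off in this abstract reduces to extracting a Pareto front from candidate scenarios. The sketch below is a naive dominance filter over illustrative (fuel, line pack) pairs, minimising the first objective and maximising the second; NSGA-II reaches the same kind of front far more efficiently for large populations, and the numbers here are invented.

```python
def pareto_front(scenarios):
    """Filter (fuel_consumption, line_pack) scenarios to the Pareto set.

    Fuel consumption is minimised, line pack maximised. A scenario is
    dropped if some other scenario is at least as good on both objectives
    and different on at least one.
    """
    front = []
    for fuel, pack in scenarios:
        dominated = any(f2 <= fuel and p2 >= pack and (f2, p2) != (fuel, pack)
                        for f2, p2 in scenarios)
        if not dominated:
            front.append((fuel, pack))
    return sorted(front)
```

    The decision maker then picks one point on the returned front according to the preferred balance between compression fuel cost and stored line pack.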

  1. Hazardous waste database: Waste management policy implications for the US Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    International Nuclear Information System (INIS)

    Lazaro, M.A.; Policastro, A.J.; Antonopoulos, A.A.; Hartmann, H.M.; Koebnick, B.; Dovel, M.; Stoll, P.W.

    1994-01-01

    The hazardous waste risk assessment modeling (HaWRAM) database is being developed to analyze the risk from treatment technology operations and potential transportation accidents associated with the hazardous waste management alternatives. These alternatives are being assessed in the Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). To support the risk analysis, the current database contains complex-wide detailed information on hazardous waste shipments from 45 Department of Energy installations during FY 1992. The database is currently being supplemented with newly acquired data. This enhancement will improve database information on operational hazardous waste generation rates, and the level and type of current on-site treatment at Department of Energy installations.

  2. The database system for the management of technical documentations of PWR fuel design project using CD-ROM

    International Nuclear Information System (INIS)

    Park, Bong Sik; Lee, Won Jae; Ryu, Jae Kwon; Jo, In Hang; Chang, Jong Hwa.

    1996-12-01

    In this report, the database system developed for the management of technical documentation of PWR fuel design project using CD-ROM (compact disk - read only memory) is described. The database system, KIRDOCM (KAERI Initial and Reload Fuel project technical documentation management), is developed and installed on PC using Visual Foxpro 3.0. Descriptions are focused on the user interface of the KIRDOCM. Introduction addresses the background and concept of the development. The main chapter describes the user requirements, the analysis of computing environment, the design of KIRDOCM, the implementation of the KIRDOCM, user's manual of KIRDOCM and the maintenance of the KIRDOCM for future improvement. The implementation of KIRDOCM system provides the efficiency in the management, maintenance and indexing of the technical documents. And, it is expected that KIRDOCM may be a good reference in applying Visual Foxpro for the development of information management system. (author). 2 tabs., 13 figs., 8 refs

  3. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 6, November 2004 (last updated 2004.12.16)

    International Nuclear Information System (INIS)

    2005-03-01

    This Radioactive Waste Management Profiles report is a compilation of data collected by the Net Enabled Waste Management Database (NEWMDB) from March to July 2004. The report contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. It provides or references details of the scope of NEWMDB data collections and it explains the formats of individual NEWMDB report pages.

  4. Obesity Management in Europe: Current Status and Objectives for the Future.

    Science.gov (United States)

    Uerlich, Magdalena F; Yumuk, Volkan; Finer, Nick; Basdevant, Arnaud; Visscher, Tommy L S

    2016-01-01

    This study aims at assessing the status of obesity management in the European region and identifying future goals and objectives of professionals working in the field of obesity. Presidents of all 31 EASO-affiliated (EASO = European Association for the Study of Obesity) national associations for the study of obesity were asked to invite 5 obesity experts from their country to participate in a survey. A total of 74 obesity professionals out of 23 countries participated. Questions addressed the development of guidelines, the status of obesity management, and goals and objectives for the future in obesity management. Further, EASO's three vice-presidents participated in in-depth, semi-structured interviews, in which they were asked to provide their reflection on the survey data. Most countries define obesity as a clinical and chronic disease, but various differences in obesity management standards exist across Europe. Existing guidelines mainly focus on the acute treatment of obesity rather than on long-term approaches. Multidisciplinary approaches for obesity management and the collaboration between general practitioners and hospitals as well as between professionals at the local level and networks of obesity management centers need to be improved across Europe. Good practices and evidence are available. © 2016 The Author(s) Published by S. Karger GmbH, Freiburg.

  5. A multi-objective programming model for assessment the GHG emissions in MSW management

    International Nuclear Information System (INIS)

    Mavrotas, George; Skoulaxinou, Sotiria; Gakis, Nikos; Katsouros, Vassilis; Georgopoulou, Elena

    2013-01-01

    Highlights: • The multi-objective multi-period optimization model. • The solution approach for the generation of the Pareto front with mathematical programming. • The very detailed description of the model (decision variables, parameters, equations). • The use of IPCC 2006 guidelines for landfill emissions (first order decay model) in the mathematical programming formulation. - Abstract: In this study a multi-objective mathematical programming model is developed for taking into account GHG emissions for Municipal Solid Waste (MSW) management. Mathematical programming models are often used for structure, design and operational optimization of various systems (energy, supply chain, processes, etc.). Over the last twenty years they have been used all the more often in Municipal Solid Waste (MSW) management in order to provide optimal solutions with the cost objective being the usual driver of the optimization. In our work we consider the GHG emissions as an additional criterion, aiming at a multi-objective approach. The Pareto front (Cost vs. GHG emissions) of the system is generated using an appropriate multi-objective method. This information is essential to the decision maker because he can explore the trade-offs in the Pareto curve and select his most preferred among the Pareto optimal solutions. In the present work a detailed multi-objective, multi-period mathematical programming model is developed in order to describe the waste management problem. Apart from the bi-objective approach, the major innovations of the model are (1) the detailed modeling considering 34 materials and 42 technologies, (2) the detailed calculation of the energy content of the various streams based on the detailed material balances, and (3) the incorporation of the IPCC guidelines for the CH4 generated in the landfills (first order decay model). The equations of the model are described in full detail. Finally, the whole approach is illustrated with a case study referring to the application
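
    The IPCC first order decay model cited in this abstract can be sketched in a few lines: methane generated in a given year is the sum, over all past deposits, of an exponentially decaying emission term. The decay rate k and methane generation potential L0 below are illustrative defaults, not the calibrated values used in the paper.

```python
import math

def ch4_generated(deposits, year, k=0.05, L0=100.0):
    """First-order-decay estimate of landfill CH4 generated in `year`.

    `deposits` maps deposit year -> tonnes of waste landfilled; k (1/yr)
    and L0 (m3 CH4 per tonne) are assumed placeholder parameters.
    """
    total = 0.0
    for dep_year, tonnes in deposits.items():
        age = year - dep_year
        if age >= 0:  # waste deposited after `year` contributes nothing
            total += tonnes * L0 * k * math.exp(-k * age)
    return total
```

    Embedding this sum as linear-in-tonnage terms is what lets the paper carry the landfill CH4 contribution into the multi-period mathematical programming formulation.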

  6. A multi-objective programming model for assessment the GHG emissions in MSW management

    Energy Technology Data Exchange (ETDEWEB)

    Mavrotas, George, E-mail: mavrotas@chemeng.ntua.gr [National Technical University of Athens, Iroon Polytechniou 9, Zografou, Athens, 15780 (Greece); Skoulaxinou, Sotiria [EPEM SA, 141 B Acharnon Str., Athens, 10446 (Greece); Gakis, Nikos [FACETS SA, Agiou Isidorou Str., Athens, 11471 (Greece); Katsouros, Vassilis [Athena Research and Innovation Center, Artemidos 6 and Epidavrou Str., Maroussi, 15125 (Greece); Georgopoulou, Elena [National Observatory of Athens, Thisio, Athens, 11810 (Greece)

    2013-09-15

    Highlights: • The multi-objective multi-period optimization model. • The solution approach for the generation of the Pareto front with mathematical programming. • The very detailed description of the model (decision variables, parameters, equations). • The use of IPCC 2006 guidelines for landfill emissions (first order decay model) in the mathematical programming formulation. - Abstract: In this study a multi-objective mathematical programming model is developed for taking into account GHG emissions for Municipal Solid Waste (MSW) management. Mathematical programming models are often used for structure, design and operational optimization of various systems (energy, supply chain, processes, etc.). Over the last twenty years they have been used all the more often in Municipal Solid Waste (MSW) management in order to provide optimal solutions with the cost objective being the usual driver of the optimization. In our work we consider the GHG emissions as an additional criterion, aiming at a multi-objective approach. The Pareto front (Cost vs. GHG emissions) of the system is generated using an appropriate multi-objective method. This information is essential to the decision maker because he can explore the trade-offs in the Pareto curve and select his most preferred among the Pareto optimal solutions. In the present work a detailed multi-objective, multi-period mathematical programming model is developed in order to describe the waste management problem. Apart from the bi-objective approach, the major innovations of the model are (1) the detailed modeling considering 34 materials and 42 technologies, (2) the detailed calculation of the energy content of the various streams based on the detailed material balances, and (3) the incorporation of the IPCC guidelines for the CH4 generated in the landfills (first order decay model). The equations of the model are described in full detail. Finally, the whole approach is illustrated with a case study referring to the

  7. PARPs database: A LIMS systems for protein-protein interaction data mining or laboratory information management system

    Directory of Open Access Journals (Sweden)

    Picard-Cloutier Aude

    2007-12-01

    Full Text Available Abstract Background In the "post-genome" era, mass spectrometry (MS has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5.

  8. THE KNOWLEDGE MANAGEMENT FOR BEST PRACTICES SHARING IN A DATABASE AT THE TRIBUNAL REGIONAL FEDERAL DA PRIMEIRA REGIÃO

    Directory of Open Access Journals (Sweden)

    Márcia Mazo Santos de Miranda

    2010-08-01

    Full Text Available A quick, effective and powerful alternative for knowledge management is the systematic sharing of best practices. This study identified in the literature recommendations for structuring a best practices database and summarized the benefits of its deployment to the Tribunal Regional Federal da Primeira Região (TRF - 1ª Região). A quantitative study was then carried out, with questionnaires distributed to federal judges of the TRF - 1ª Região; the questionnaire was divided into 4 parts: magistrate profile, flow of knowledge/information, internal environment, and organizational facilitators. As a result, we identified the need for a best practices database in the Institution for the identification, transfer and sharing of organizational knowledge. The conclusion presents recommendations for the development of the database and highlights its importance for knowledge management in an organization.

  9. Recon2Neo4j: applying graph database technologies for managing comprehensive genome-scale networks.

    Science.gov (United States)

    Balaur, Irina; Mazein, Alexander; Saqi, Mansoor; Lysenko, Artem; Rawlings, Christopher J; Auffray, Charles

    2017-04-01

    The goal of this work is to offer a computational framework for exploring data from the Recon2 human metabolic reconstruction model. Advanced user access features have been developed using the Neo4j graph database technology and this paper describes key features such as efficient management of the network data, examples of the network querying for addressing particular tasks, and how query results are converted back to the Systems Biology Markup Language (SBML) standard format. The Neo4j-based metabolic framework facilitates exploration of highly connected and comprehensive human metabolic data and identification of metabolic subnetworks of interest. A Java-based parser component has been developed to convert query results (available in the JSON format) into SBML and SIF formats in order to facilitate further results exploration, enhancement or network sharing. The Neo4j-based metabolic framework is freely available from: https://diseaseknowledgebase.etriks.org/metabolic/browser/ . The java code files developed for this work are available from the following url: https://github.com/ibalaur/MetabolicFramework . ibalaur@eisbm.org. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
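The identification of metabolic subnetworks of interest described above can be sketched independently of Neo4j as a bounded breadth-first traversal over an adjacency list. The toy reaction network and metabolite names below are hypothetical illustrations, not taken from Recon2:

```python
from collections import deque

def subnetwork(graph, seed, max_depth):
    """Return the set of nodes reachable from `seed` within `max_depth` hops."""
    seen = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue  # do not expand beyond the requested neighbourhood
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append((neighbour, depth + 1))
    return seen

# Hypothetical toy reaction network (metabolite -> products)
toy = {
    "glucose": ["g6p"],
    "g6p": ["f6p"],
    "f6p": ["fbp"],
    "fbp": ["dhap", "g3p"],
}

print(sorted(subnetwork(toy, "glucose", 2)))  # → ['f6p', 'g6p', 'glucose']
```

In the actual framework this neighbourhood query would be expressed in Cypher against the Neo4j store; the sketch only illustrates the underlying graph operation.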

  10. The Net Enabled Waste Management Database in the context of radioactive waste classification

    International Nuclear Information System (INIS)

    Csullog, G.W.; Burcl, R.; Tonkay, D.; Petoe, A.

    2002-01-01

    There is an emerging, international consensus that a common, comprehensive radioactive waste classification system is needed, which derives from the fact that the implementation of radioactive waste classification within countries is highly diverse. Within IAEA Member States, implementation ranges from none to complex systems that vary a great deal from one another. Both the IAEA and the European Commission have recommended common classification schemes but only for the purpose of facilitating communication with the public and national- and international-level organizations and to serve as the basis for developing comprehensive, national waste classification schemes. In the context described above, the IAEA's newly developed Net Enabled Waste Management Database (NEWMDB) contains a feature, the Waste Class Matrix, that Member States use to describe the waste classification schemes they use and to compare them with the IAEA's proposed waste classification scheme. Member States then report waste inventories to the NEWMDB according to their own waste classification schemes, allowing traceability back to nationally based reports. The IAEA uses the information provided in the Waste Class Matrix to convert radioactive waste inventory data reported according to a wide variety of classifications into a single inventory according to the IAEA's proposed scheme. This approach allows the international community time to develop a comprehensive, common classification scheme and allows Member States time to develop and implement effective, operational waste classification schemes while, at the same time, the IAEA can collect the information needed to compile a comprehensive, international radioactive waste inventory. (author)

  11. StreetTiVo: Using a P2P XML Database System to Manage Multimedia Data in Your Living Room

    NARCIS (Netherlands)

    Zhang, Ying; de Vries, A.P.; Boncz, P.; Hiemstra, Djoerd; Ordelman, Roeland J.F.; Li, Qing; Feng, Ling; Pei, Jian; Wang, Sean X.

    StreetTiVo is a project that aims at bringing research results into the living room; in particular, a mix of current results in the areas of Peer-to-Peer XML Database Management System (P2P XDBMS), advanced multimedia analysis techniques, and advanced information retrieval techniques. The project

  12. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    Science.gov (United States)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we have used a Relational Database Management System (RDBMS) on a Structure Query Language (SQL) server which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g. location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
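The logical associations between location, date, and instrument described above can be sketched with Python's standard-library sqlite3 module. The table layout and sample rows below are illustrative assumptions, not the authors' actual geodatabase schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal relational schema linking instruments, flights, and imagery products
cur.executescript("""
CREATE TABLE instrument (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE flight (id INTEGER PRIMARY KEY, date TEXT,
                     instrument_id INTEGER REFERENCES instrument(id));
CREATE TABLE image (id INTEGER PRIMARY KEY,
                    flight_id INTEGER REFERENCES flight(id),
                    level TEXT, path TEXT);
""")
cur.execute("INSERT INTO instrument VALUES (1, 'hyperspectral')")
cur.execute("INSERT INTO flight VALUES (1, '2016-07-14', 1)")
cur.execute("INSERT INTO image VALUES (1, 1, 'L2-reflectance', '/data/f1_l2.tif')")

# Join the linked records: which images came from which instrument on which date?
cur.execute("""
SELECT instrument.name, flight.date, image.level
FROM image
JOIN flight ON image.flight_id = flight.id
JOIN instrument ON flight.instrument_id = instrument.id
""")
rows = cur.fetchall()
print(rows)  # → [('hyperspectral', '2016-07-14', 'L2-reflectance')]
```

The same joins, expressed in a production RDBMS and surfaced through a geodatabase layer, are what let the datasets be linked spatially and temporally.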

  13. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    Directory of Open Access Journals (Sweden)

    Errol A. Blake

    2007-12-01

    Full Text Available Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that traditional database security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper, a compilation of different journals, articles and classroom discussions, will focus on unifying the process of securing data or information whether it is in use, in storage or being transmitted. Promoting a change in database curriculum development trends may also play a role in helping secure databases. This paper will take the approach that a conscientious effort to unify the database security process, which includes the Database Management System (DBMS) selection process, following regulatory compliance requirements, analyzing and learning from the mistakes of others, implementing networking security technologies, and securing the database, may prevent database breaches.

  14. A framework for sustainable invasive species management: environmental, social and economic objectives

    Science.gov (United States)

    Larson, Diane L.; Phillips-Mao, Laura; Quiram, Gina; Sharpe, Leah; Stark, Rebecca; Sugita, Shinya; Weiler, Annie

    2011-01-01

    Applying the concept of sustainability to invasive species management (ISM) is challenging but necessary, given the increasing rates of invasion and the high costs of invasion impacts and control. To be sustainable, ISM must address environmental, social, and economic factors (or "pillars") that influence the causes, impacts, and control of invasive species across multiple spatial and temporal scales. Although these pillars are generally acknowledged, their implementation is often limited by insufficient control options and significant economic and political constraints. In this paper, we outline specific objectives in each of these three "pillars" that, if incorporated into a management plan, will improve the plan's likelihood of sustainability. We then examine three case studies that illustrate how these objectives can be effectively implemented. Each pillar reinforces the others, such that the inclusion of even a few of the outlined objectives will lead to more effective management that achieves ecological goals, while generating social support and long-term funding to maintain projects to completion. We encourage agency directors and policy-makers to consider sustainability principles when developing funding schemes, management agendas, and policy.

  15. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    Science.gov (United States)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing, relational database containing oceanographic metadata, along with an existing interface coded in Cold Fusion middleware, to a Drupal6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal6-powered front-end; 2) a standard SQL port (used to provide a MapServer interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive semantically-enabled faceted search capabilities planned for the site. Incorporation of semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically-enabled interfaces will help prepare the BCO-DMO database for interoperability with other ecosystem databases.

  16. Customer database for Watrec Oy

    OpenAIRE

    Melnichikhina, Ekaterina

    2016-01-01

    This thesis is a development project for Watrec Oy, a Finnish company specializing in “waste-to-energy” issues. Customer Relationship Management (CRM) strategies are now being applied within the company, and the customer database is the first, trial step towards a CRM strategy at Watrec Oy. The reasons for the database project lie in the lack of clear customer data. The main objectives are: - To integrate the customers’ and project data; - To improve the level of sales and mar...

  17. Moving beyond the MSY concept to reflect multidimensional fisheries management objectives

    DEFF Research Database (Denmark)

    Rindorf, Anna; Mumford, John; Baranowski, Paul

    2017-01-01

    to be maximised can be combined with sustainability constraints aiming specifically at one or more of these four sustainability pillars. The study was conducted as a three-year interactive process involving 290 participating science, industry, NGO and management representatives from six different European regions......, was highly preferred by participants across the project. This suggests that advice incorporating flexibility in the interpretation of objectives to leave room for meaningful inclusiveness in decision-making processes is likely to be a prerequisite for stakeholder buy-in to management decisions...

  18. Management of the database originated from individual and environment monitoring carried out in the UNIFESP/HSP complex, SP, Brazil

    International Nuclear Information System (INIS)

    Medeiros, Regina Bitelli; Daros, Kellen Adriana Curci; Almeida, Natalia Correia de; Pires, Silvio Ricardo; Jorge, Luiz Tadeu

    2005-01-01

    The Radiological Protection Sector of the Sao Paulo Hospital/Federal University of Sao Paulo, SP, Brazil manages the records of 457 dosemeters. Since users must be informed of their absorbed doses monthly, and individual records must be kept until the individual reaches the age of 75 and for at least 30 years after the end of the individual's occupational exposure, it became necessary to construct a database and a computerized control to manage the accumulated doses. Between 1991 and 1999, this control was effected by means of a relational database (Cobol 85 - Operating System GCOS 7 (ABC Telematic Bull)). After this period, when the company responsible for dosimetry began to provide computerized results, the data were stored in a Paradox database (Borland). In 2004, the databases were integrated: a third database was created, developed in Oracle (IBM), together with a system that allows institutional Intranet users to consult their accumulated annual doses and the total effective dose accumulated during their working life.

  19. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on the examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and who to contact were prime questions to each of the database program managers. Additionally, how the agency uses the accident data was of major interest

  20. Spatial processes decouple management from objectives in a heterogeneous landscape: predator control as a case study.

    Science.gov (United States)

    Mahoney, Peter J; Young, Julie K; Hersey, Kent R; Larsen, Randy T; McMillan, Brock R; Stoner, David C

    2018-04-01

    Predator control is often implemented with the intent of disrupting top-down regulation in sensitive prey populations. However, ambiguity surrounding the efficacy of predator management, as well as the strength of top-down effects of predators in general, is often exacerbated by the spatially implicit analytical approaches used in assessing data with explicit spatial structure. Here, we highlight the importance of considering spatial context in the case of a predator control study in south-central Utah. We assessed the spatial match between aerial removal risk in coyotes (Canis latrans) and mule deer (Odocoileus hemionus) resource selection during parturition using a spatially explicit, multi-level Bayesian model. With our model, we were able to evaluate spatial congruence between management action (i.e., coyote removal) and objective (i.e., parturient deer site selection) at two distinct scales: the level of the management unit and the individual coyote removal. In the case of the former, our results indicated substantial spatial heterogeneity in expected congruence between removal risk and parturient deer site selection across large areas, and is a reflection of logistical constraints acting on the management strategy and differences in space use between the two species. At the level of the individual removal, we demonstrated that the potential management benefits of a removed coyote were highly variable across all individuals removed and in many cases, spatially distinct from parturient deer resource selection. Our methods and results provide a means of evaluating where we might anticipate an impact of predator control, while emphasizing the need to weight individual removals based on spatial proximity to management objectives in any assessment of large-scale predator control. Although we highlight the importance of spatial context in assessments of predator control strategy, we believe our methods are readily generalizable in any management or large

  1. Object-oriented Approach to High-level Network Monitoring and Management

    Science.gov (United States)

    Mukkamala, Ravi

    2000-01-01

    An absolute prerequisite for the management of large computer networks is the ability to measure their performance. Unless we monitor a system, we cannot hope to manage and control its performance. In this paper, we describe a network monitoring system that we are currently designing and implementing. Keeping in mind the complexity of the task and the required flexibility for future changes, we use an object-oriented design methodology. We are investigating methods to build high-level monitoring systems that are built on top of existing monitoring tools. Due to the heterogeneous nature of the underlying systems at NASA Langley Research Center, we use an object-oriented approach for the design. First, we use UML (Unified Modeling Language) to model users' requirements. Second, we identify the existing capabilities of the underlying monitoring system. Third, we try to map the former with the latter. The system is built using the APIs offered by the HP OpenView system.

  2. A Parallel Relational Database Management System Approach to Relevance Feedback in Information Retrieval.

    Science.gov (United States)

    Lundquist, Carol; Frieder, Ophir; Holmes, David O.; Grossman, David

    1999-01-01

    Describes a scalable, parallel, relational database-driven information retrieval engine. To support portability across a wide range of execution environments, all algorithms adhere to the SQL-92 standard. By incorporating relevance feedback algorithms, accuracy is enhanced over prior database-driven information retrieval efforts. Presents…
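A minimal sketch of database-driven retrieval in the spirit described above, using only standard SQL over a postings table. The toy corpus and the plain term-frequency scoring are illustrative assumptions; the paper's actual algorithms, including relevance feedback, are considerably more elaborate:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE posting (term TEXT, doc INTEGER, tf INTEGER)")

# Hypothetical three-document corpus, indexed as (term, doc, term-frequency)
docs = {1: "database information retrieval",
        2: "parallel database engine",
        3: "feedback retrieval"}
for doc_id, text in docs.items():
    for term in text.split():
        cur.execute("INSERT INTO posting VALUES (?, ?, 1)", (term, doc_id))

# Score documents for the query {database, retrieval} entirely in SQL,
# as a sum of term frequencies over the matching postings.
cur.execute("""
SELECT doc, SUM(tf) AS score
FROM posting
WHERE term IN ('database', 'retrieval')
GROUP BY doc
ORDER BY score DESC, doc
""")
ranked = cur.fetchall()
print(ranked)  # → [(1, 2), (2, 1), (3, 1)]
```

Because the ranking is ordinary SQL, it inherits the portability and parallelism of whatever relational engine executes it, which is the core idea of the approach.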

  3. Web based Interactive 3D Learning Objects for Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Stefan Hesse

    2012-02-01

    Full Text Available In this paper, we present an approach to create and integrate interactive 3D learning objects of high quality for higher education into a learning management system. The use of these resources allows to visualize topics, such as electro-technical and physical processes in the interior of complex devices. This paper addresses the challenge of combining rich interactivity and adequate realism with 3D exercise material for distance e-learning.

  4. Towards Efficient Energy Management: Defining HEMS, AMI and Smart Grid Objectives

    DEFF Research Database (Denmark)

    Rossello Busquet, Ana; Kardaras, Georgios; Soler, José

    2011-01-01

    electricity in the grid will also help to reduce the increase of energy consumption in the future. In order to reduce energy consumption in home environments, researchers have been designing Home Energy Management Systems (HEMS). In addition, Advanced Metering Infrastructure (AMI) and smart grids are also...... being developed to distribute and produce electricity efficiently. This paper presents the high level goals and requirements of HEMS. Additionally, it gives an overview of Advanced Metering Infrastructure benefits and smart grid objectives....

  5. Occupant evaluation of commercial office lighting: Volume 3, Data archive and database management system

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, G.; Brown, M. (ed.)

    1987-08-01

    This report documents a database of measured lighting environmental data. The database contains four different types of data on more than 1000 occupied work stations: (1) subjective data on attitudes and ratings of selected lighting and other characteristics, (2) photometric and other direct environmental data, including illuminances, luminances, and contrast conditions, (3) indirect environmental measures obtained from the architectural drawings and the work station photographs, and (4) descriptive characteristics of the occupants. The work stations were sampled from thirteen office buildings located in various cities in the United States. In the database, each record contains data on a single work station with its individual fields comprising characteristics of that work station and its occupant. The relational database runs on an IBM or IBM compatible personal computer using commercially available software. As a supplement to the database, an independent ASCII-8 bit data file is available.

  6. Efficiency potential of management and technical solutions for a construction object

    Directory of Open Access Journals (Sweden)

    Lapidus Azariy Abramovich

    2014-01-01

    Full Text Available The authors investigate the models of efficiency potential of management and technical solutions for a construction object, which allows accounting for the influence of management-technological and administrative solutions in the process of implementing a construction project. The solutions are represented by various factors – solitary integral potentials. The factors which should be taken into account in the process of developing an integral model are: development of the general contracting structure, project decisions, management decisions, administrative decisions and ecological impact. It is necessary to develop a model that integrally puts together the above-mentioned factors of a construction project, to observe and investigate other factors, and thereby gain the opportunity not only to predict the endpoint of the future construction object at the stage of formulating technological requirements, but also to monitor changes in this prognosis over time. The parameters of the integral potential will allow the system to obtain flexibility, which makes it possible to adjust to the changes usually taking place on a construction object and at the same time to aim for optimization of organizational, technological and administrative solutions in the process of reaching the endpoint of construction.

  7. Integrating indigenous livelihood and lifestyle objectives in managing a natural resource.

    Science.gov (United States)

    Plagányi, Éva Elizabeth; van Putten, Ingrid; Hutton, Trevor; Deng, Roy A; Dennis, Darren; Pascoe, Sean; Skewes, Tim; Campbell, Robert A

    2013-02-26

    Evaluating the success of natural resource management approaches requires methods to measure performance against biological, economic, social, and governance objectives. In fisheries, most research has focused on industrial sectors, with the contributions to global resource use by small-scale and indigenous hunters and fishers undervalued. Globally, the small-scale fisheries sector alone employs some 38 million people who share common challenges in balancing livelihood and lifestyle choices. We used as a case study a fishery with both traditional indigenous and commercial sectors to develop a framework to bridge the gap between quantitative bio-economic models and more qualitative social analyses. For many indigenous communities, communalism rather than capitalism underlies fishers' perspectives and aspirations, and we find there are complicated and often unanticipated trade-offs between economic and social objectives. Our results highlight that market-based management options might score highly in a capitalistic society, but have negative repercussions on community coherence and equity in societies with a strong communal ethic. There are complex trade-offs between economic indicators, such as profit, and social indicators, such as lifestyle preferences. Our approach makes explicit the "triple bottom line" sustainability objectives involving trade-offs between economic, social, and biological performance, and is thus directly applicable to most natural resource management decision-making situations.

  8. An interval-parameter mixed integer multi-objective programming for environment-oriented evacuation management

    Science.gov (United States)

    Wu, C. Z.; Huang, G. H.; Yan, X. P.; Cai, Y. P.; Li, Y. P.

    2010-05-01

    Large crowds are increasingly common at political, social, economic, cultural and sports events in urban areas. This has led to attention on the management of evacuations under such situations. In this study, we optimise an approximation method for vehicle allocation and route planning in case of an evacuation. This method, based on an interval-parameter multi-objective optimisation model, has potential for use in a flexible decision support system for evacuation management. The modeling solutions are obtained by sequentially solving two sub-models corresponding to lower- and upper-bounds for the desired objective function value. The interval solutions are feasible and stable in the given decision space, and this may reduce the negative effects of uncertainty, thereby improving decision makers' estimates under different conditions. The resulting model can be used for a systematic analysis of the complex relationships among evacuation time, cost and environmental considerations. The results of a case study used to validate the proposed model show that the model does generate useful solutions for planning evacuation management and practices. Furthermore, these results are useful for evacuation planners, not only in making vehicle allocation decisions but also for providing insight into the tradeoffs among evacuation time, environmental considerations and economic objectives.
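The two-sub-model idea described above, solving once with the lower-bound and once with the upper-bound interval coefficients, can be sketched on a deliberately tiny allocation problem. The routes, unit costs and demand below are hypothetical, and brute-force enumeration stands in for the paper's actual optimisation machinery:

```python
# Hypothetical interval unit costs (low, high) per vehicle for two evacuation routes
cost = [(2.0, 3.0), (1.5, 3.5)]
demand = 10  # vehicles to allocate across the two routes

best = {}
for bound in (0, 1):  # 0 = lower-bound sub-model, 1 = upper-bound sub-model
    # Enumerate every split of the demand and keep the cheapest under this bound
    candidates = [(a, demand - a) for a in range(demand + 1)]
    best[bound] = min(candidates,
                      key=lambda x: sum(cost[i][bound] * x[i] for i in range(2)))

print(best)  # → {0: (0, 10), 1: (10, 0)}
```

Note that the optimal allocation differs between the two sub-models, which is exactly the situation in which reporting an interval solution, rather than a single point, conveys useful information to the decision maker.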

  9. A multi-objective genetic approach to domestic load scheduling in an energy management system

    International Nuclear Information System (INIS)

    Soares, Ana; Antunes, Carlos Henggeler; Oliveira, Carlos; Gomes, Álvaro

    2014-01-01

    In this paper a multi-objective genetic algorithm is used to solve a multi-objective model to optimize the time allocation of domestic loads within a planning period of 36 h, in a smart grid context. The management of controllable domestic loads is aimed at minimizing the electricity bill and the end-user’s dissatisfaction concerning two different aspects: the preferred time slots for load operation and the risk of interruption of the energy supply. The genetic algorithm is similar to the Elitist NSGA-II (Nondominated Sorting Genetic Algorithm II), in which some changes have been introduced to adapt it to the physical characteristics of the load scheduling problem and improve usability of results. The mathematical model explicitly considers economical, technical, quality of service and comfort aspects. Illustrative results are presented and the characteristics of different solutions are analyzed. - Highlights: • A genetic algorithm similar to the NSGA-II is used to solve a multi-objective model. • The optimized time allocation of domestic loads in a smart grid context is achieved. • A variable preference profile for the operation of the managed loads is included. • A safety margin is used to account for the quality of the energy services provided. • A non-dominated front with the solutions in the two-objective space is obtained
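The non-dominated sorting at the heart of NSGA-II-style algorithms can be sketched as a Pareto-front extraction over candidate schedules scored on the two objectives (electricity cost, dissatisfaction), both to be minimised. The schedule points below are hypothetical:

```python
def dominates(a, b):
    """True if a is no worse than b in every objective and strictly
    better in at least one (minimisation)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep the points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (electricity cost, dissatisfaction) pairs for candidate schedules
schedules = [(10, 5), (8, 7), (12, 3), (12, 6), (11, 4)]
print(pareto_front(schedules))  # → [(10, 5), (8, 7), (12, 3), (11, 4)]
```

In the full algorithm this sorting is repeated to build successive fronts, combined with a crowding-distance measure to preserve diversity; the sketch shows only the dominance test that defines the trade-off front.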

  10. Multi-objective, multiple participant decision support for water management in the Andarax catchment, Almeria

    Science.gov (United States)

    van Cauwenbergh, N.; Pinte, D.; Tilmant, A.; Frances, I.; Pulido-Bosch, A.; Vanclooster, M.

    2008-04-01

    Water management in the Andarax river basin (Almeria, Spain) is a multi-objective, multi-participant, long-term decision-making problem that faces several challenges. Adequate water allocation needs informed decisions to meet increasing socio-economic demands while respecting the environmental integrity of this basin. Key players in the Andarax water sector include the municipality of Almeria, the irrigators involved in the intensive greenhouse agricultural sector, and booming second residences. A decision support system (DSS) is developed to rank different sustainable planning and management alternatives according to their socio-economic and environmental performance. The DSS is intimately linked to sustainability indicators and is designed through a public participation process. Indicators are linked to criteria reflecting stakeholders concerns in the 2005 field survey, such as fulfilling water demand, water price, technical and economical efficiency, social and environmental impacts. Indicators can be partly quantified after simulating the operation of the groundwater reservoir over a 20-year planning period and partly through a parallel expert evaluation process. To predict the impact of future water demand in the catchment, several development scenarios are designed to be evaluated in the DSS. The successive multi-criteria analysis of the performance indicators permits the ranking of the different management alternatives according to the multiple objectives formulated by the different sectors/participants. This allows more informed and transparent decision-making processes for the Andarax river basin, recognizing both the socio-economic and environmental dimensions of water resources management.
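The final multi-criteria ranking step can be sketched as a simple weighted-sum aggregation, one of many possible multi-criteria methods. The alternatives, criterion scores and weights below are hypothetical, not taken from the Andarax study:

```python
def rank(alternatives, weights):
    """Rank alternatives by weighted sum of normalised criterion scores
    (higher is better)."""
    scored = {name: sum(w * s for w, s in zip(weights, scores))
              for name, scores in alternatives.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Hypothetical management alternatives scored (0-1) on three criteria:
# demand fulfilment, cost (inverted, so higher = cheaper), environmental impact.
alts = {
    "new wellfield":  (0.9, 0.4, 0.3),
    "reuse scheme":   (0.7, 0.6, 0.8),
    "demand control": (0.5, 0.9, 0.9),
}
weights = (0.5, 0.2, 0.3)  # stakeholder-derived criterion weights

order = rank(alts, weights)
print(order)  # → ['reuse scheme', 'demand control', 'new wellfield']
```

Because the ranking depends on the weights, eliciting those weights from the different sectors and participants is itself part of the DSS design, which is why the survey and expert-evaluation steps matter.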

  11. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    Science.gov (United States)

    Lemaitre, D; Sauquet, D; Fofol, I; Tanguy, L; Jean, F C; Degoulet, P

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy that has been developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new up-to-date software. A two-phase evolution pathway is described. The first phase consists in separating the interface from the data storage and application control and in using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described.

  12. Management of cyber physical objects in the future Internet of Things methods, architectures and applications

    CERN Document Server

    Loscri, Valeria; Rovella, Anna; Fortino, Giancarlo

    2016-01-01

    This book focuses on new methods, architectures, and applications for the management of Cyber Physical Objects (CPOs) in the context of the Internet of Things (IoT). It covers a wide range of topics related to CPOs, such as resource management, hardware platforms, communication and control, and control and estimation over networks. It also discusses decentralized, distributed, and cooperative optimization as well as effective discovery, management, and querying of CPOs. Other chapters outline the applications of control, real-time aspects, and software for CPOs and introduce readers to agent-oriented CPOs, communication support for CPOs, real-world deployment of CPOs, and CPOs in Complex Systems. There is a focus on the importance of application of IoT technologies for Smart Cities.

  13. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    Science.gov (United States)

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the need to manage medical case information and biospecimens simultaneously, we developed a novel medical case information system integrating biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of patients; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on the Client/Server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, querying and summarization of cases and related biospecimen information, and can automatically synthesize case records from the database. It supports the management not only of long-term follow-up of individuals, but also of cases grouped according to the aims of research. This system can improve the efficiency and quality of clinical research in which biospecimens are used coordinately. It realizes the synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.

  14. Quality standards for DNA sequence variation databases to improve clinical management under development in Australia

    Directory of Open Access Journals (Sweden)

    B. Bennetts

    2014-09-01

    Despite the routine nature of comparing sequence variations identified during clinical testing to database records, few databases meet quality requirements for clinical diagnostics. To address this issue, the Royal College of Pathologists of Australasia (RCPA), in collaboration with the Human Genetics Society of Australasia (HGSA) and the Human Variome Project (HVP), is developing standards for DNA sequence variation databases intended for use in the Australian clinical environment. The outputs of this project will be promoted to other health systems and accreditation bodies by the Human Variome Project to support the development of similar frameworks in other jurisdictions.

  15. Knowledge base technology for CT-DIMS: Report 1. [CT-DIMS (Cutting Tool - Database and Information Management System)

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, E.E.

    1993-05-01

    This report discusses progress on the Cutting Tool-Database and Information Management System (CT-DIMS) project being conducted by the University of Illinois Urbana-Champaign (UIUC) under contract to the Department of Energy. This project was initiated in October 1991 by UIUC. The Knowledge-Based Engineering Systems Research Laboratory (KBESRL) at UIUC is developing knowledge base technology and prototype software for the presentation and manipulation of the cutting tool databases at Allied-Signal Inc., Kansas City Division (KCD). The graphical tool selection capability being developed for CT-DIMS in the Intelligent Design Environment for Engineering Automation (IDEEA) will provide a concurrent environment for simultaneous access to tool databases, tool standard libraries, and cutting tool knowledge.

  16. Spatial access method for urban geospatial database management: An efficient approach of 3D vector data clustering technique

    DEFF Research Database (Denmark)

    Azri, Suhaibah; Ujang, Uznir; Rahman, Alias Abdul

    2014-01-01

    In the last few years, 3D urban data and its associated information have increased rapidly due to the growth of urban areas and the urbanization phenomenon. These datasets are maintained and managed in a 3D spatial database system. However, performance deterioration is likely to occur due to the massiveness of 3D … datasets. As a solution, a 3D spatial index structure is used as a booster to increase the performance of data retrieval. In commercial databases, the commonly and widely used index structure for 3D spatial databases is the 3D R-Tree, owing to its simplicity and its promise in handling spatial data. However, … 3D geospatial data clustering to be used in the construction of the 3D R-Tree, which could accordingly reduce the overlap among nodes. The proposed method is tested on a 3D urban dataset for the application of urban infill development, using several cases of data updating operations such as building …
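The node-overlap problem this abstract targets can be made concrete with a small sketch. The code below (an illustration of the general R-Tree overlap issue, not the authors' actual clustering method) computes 3D minimum bounding boxes (MBBs) and the pairwise overlap volume among them, which is the quantity a good spatial grouping drives toward zero.

```python
from itertools import combinations

def mbb(points):
    """Minimum bounding box of a set of 3D points:
    (min_x, min_y, min_z, max_x, max_y, max_z)."""
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs), max(xs), max(ys), max(zs))

def overlap_volume(a, b):
    """Volume of the intersection of two MBBs (0 if disjoint)."""
    dx = min(a[3], b[3]) - max(a[0], b[0])
    dy = min(a[4], b[4]) - max(a[1], b[1])
    dz = min(a[5], b[5]) - max(a[2], b[2])
    return dx * dy * dz if dx > 0 and dy > 0 and dz > 0 else 0.0

def total_overlap(boxes):
    """Total pairwise overlap among sibling R-Tree node boxes."""
    return sum(overlap_volume(a, b) for a, b in combinations(boxes, 2))

# Two spatially separated groups of points (e.g. building vertices):
# grouping by proximity yields disjoint node MBBs; mixing the groups
# into arbitrary nodes produces heavily overlapping MBBs.
near = [(0, 0, 0), (1, 1, 1), (2, 1, 0)]
far = [(10, 10, 10), (11, 12, 10), (12, 11, 11)]
good = [mbb(near), mbb(far)]                              # clustered grouping
bad = [mbb(near[:2] + far[:1]), mbb(near[2:] + far[1:])]  # mixed grouping
print(total_overlap(good), total_overlap(bad))
```

With overlapping node boxes, a point query must descend into every node whose box contains the point, which is why reducing overlap directly improves retrieval performance.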

  17. Event driven software package for the database of Integrated Coastal and Marine Area Management (ICMAM) (Developed in 'C')

    Digital Repository Service at National Institute of Oceanography (India)

    Sadhuram, Y.; Murty, T.V.R.; Chandramouli, P.; Murthy, K.S.R.

    National Institute of Oceanography (NIO, RC, Visakhapatnam, India) had taken up the Integrated Coastal and Marine Area Management (ICMAM) project funded by Department of Ocean Development (DOD), New Delhi, India. The main objective of this project...

  18. 'The surface management system' (SuMS) database: a surface-based database to aid cortical surface reconstruction, visualization and analysis

    Science.gov (United States)

    Dickson, J.; Drury, H.; Van Essen, D. C.

    2001-01-01

    Surface reconstructions of the cerebral cortex are increasingly widely used in the analysis and visualization of cortical structure, function and connectivity. From a neuroinformatics perspective, dealing with surface-related data poses a number of challenges. These include the multiplicity of configurations in which surfaces are routinely viewed (e.g. inflated maps, spheres and flat maps), plus the diversity of experimental data that can be represented on any given surface. To address these challenges, we have developed a surface management system (SuMS) that allows automated storage and retrieval of complex surface-related datasets. SuMS provides a systematic framework for the classification, storage and retrieval of many types of surface-related data and associated volume data. Within this classification framework, it serves as a version-control system capable of handling large numbers of surface and volume datasets. With built-in database management system support, SuMS provides rapid search and retrieval capabilities across all the datasets, while also incorporating multiple security levels to regulate access. SuMS is implemented in Java and can be accessed via a Web interface (WebSuMS) or using downloaded client software. Thus, SuMS is well positioned to act as a multiplatform, multi-user 'surface request broker' for the neuroscience community.
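The version-control behavior described above, storing many versions of a named dataset and retrieving either the latest or a specific one, can be sketched roughly as follows. SuMS itself is implemented in Java on a database management system; this simplified Python illustration is an assumption about the access pattern, and all names in it are hypothetical.

```python
from collections import defaultdict

class SurfaceStore:
    """Toy version-controlled store for surface-related datasets,
    loosely in the spirit of the SuMS description (not its real API)."""

    def __init__(self):
        # {dataset name: list of payloads; index i holds version i + 1}
        self._versions = defaultdict(list)

    def store(self, name, payload):
        """Save a new version of a dataset; returns its version number."""
        self._versions[name].append(payload)
        return len(self._versions[name])

    def retrieve(self, name, version=None):
        """Fetch a specific version, or the latest when none is given."""
        versions = self._versions[name]
        return versions[-1] if version is None else versions[version - 1]

store = SurfaceStore()
store.store("atlas.L.inflated", "mesh-v1")
store.store("atlas.L.inflated", "mesh-v2")
print(store.retrieve("atlas.L.inflated"))     # mesh-v2 (latest)
print(store.retrieve("atlas.L.inflated", 1))  # mesh-v1 (pinned version)
```

Keeping every version addressable by (name, version) while defaulting to the newest is what lets many users share one repository without overwriting each other's reference surfaces.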

  19. [Cystic Fibrosis Cloud database: An information system for storage and management of clinical and microbiological data of cystic fibrosis patients].

    Science.gov (United States)

    Prieto, Claudia I; Palau, María J; Martina, Pablo; Achiary, Carlos; Achiary, Andrés; Bettiol, Marisa; Montanaro, Patricia; Cazzola, María L; Leguizamón, Mariana; Massillo, Cintia; Figoli, Cecilia; Valeiras, Brenda; Perez, Silvia; Rentería, Fernando; Diez, Graciela; Yantorno, Osvaldo M; Bosch, Alejandra

    2016-01-01

    The epidemiological and clinical management of cystic fibrosis (CF) patients suffering from acute pulmonary exacerbations or chronic lung infections demands continuous updating of the medical and microbiological processes associated with the constant evolution of pathogens during host colonization. In order to monitor the dynamics of these processes, it is essential to have expert systems capable of storing and subsequently extracting the information generated from different studies of the patients and the microorganisms isolated from them. In this work we designed and developed an online database, based on an information system, that allows storing, managing and visualizing data from clinical studies and microbiological analyses of bacteria obtained from the respiratory tract of patients suffering from cystic fibrosis. The information system, named Cystic Fibrosis Cloud database, is available at http://servoy.infocomsa.com/cfc_database and is composed of a main database and a web-based interface that uses Servoy's product architecture based on Java technology. Although the CFC database system can be implemented as a local program for private use in CF centers, it can also be used, updated and shared by different users, who can access the stored information in a systematic, practical and safe manner. The implementation of the CFC database could have a significant impact on the monitoring of respiratory infections, the prevention of exacerbations, the detection of emerging organisms, and the adequacy of control strategies for lung infections in CF patients. Copyright © 2015 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.

  20. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    National Research Council Canada - National Science Library

    Ortega-Binderberger, Michael

    2002-01-01

    ... as a critical area of research. This thesis explores how to enhance database systems with content based search over arbitrary abstract data types in a similarity based framework with query refinement...
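Similarity-based retrieval with query refinement, the framework this thesis names, can be illustrated with a minimal sketch: rank stored feature vectors by similarity to a query, then pull the query toward items the user marked relevant and re-rank. This is a generic Rocchio-style illustration under assumed names, not the thesis's actual design.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query, items, k=2):
    """Return the k item names most similar to the query vector."""
    return sorted(items, key=lambda name: cosine(query, items[name]), reverse=True)[:k]

def refine(query, relevant, alpha=0.5, beta=0.5):
    """Rocchio-style refinement: move the query toward the centroid of
    the vectors the user marked as relevant in the previous round."""
    centroid = [sum(v) / len(relevant) for v in zip(*relevant)]
    return [alpha * q + beta * c for q, c in zip(query, centroid)]

# Hypothetical image feature vectors stored in the database.
items = {"img_a": [1.0, 0.0], "img_b": [0.8, 0.6], "img_c": [0.0, 1.0]}
query = [1.0, 0.1]
first = search(query, items)              # initial similarity ranking
query2 = refine(query, [items["img_c"]])  # user marks img_c as relevant
second = search(query2, items)
print(first, second)  # ['img_a', 'img_b'] ['img_b', 'img_c']
```

The point of the refinement loop is that the second ranking reflects user feedback rather than the raw query alone, which is what distinguishes this style of retrieval from a one-shot similarity search.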