WorldWideScience

Sample records for database creation management

  1. Discussions about acceptance of the free software for management and creation of referential databases for papers

    Directory of Open Access Journals (Sweden)

    Flavio Ribeiro Córdula

    2016-03-01

    Full Text Available Objective. This research aimed to determine, through the Technology Acceptance Model (TAM), the degree of acceptance of software developed to support the construction and management of databases of scientific articles, with the goal of assisting the dissemination and retrieval of scientific production stored in digital media. Method. The research is characterized as quantitative, since the TAM, which guided this study, is essentially quantitative. A questionnaire developed according to TAM guidelines was used as the data collection instrument. Results. It was possible to verify that this software, despite needing the fixes and improvements inherent to this type of tool, obtained a relevant degree of acceptance from the sample studied. Considerations. It should also be noted that, although this research was directed at scholars in the field of information science, the idea that justified the creation of the software used in this study may contribute to the development of science in any field of knowledge, by optimizing the results that a search conducted in a specialized database can provide.

  2. From document to database: modernizing requirements management

    International Nuclear Information System (INIS)

    Giajnorio, J.; Hamilton, S.

    2007-01-01

    The creation, communication, and management of design requirements are central to the successful completion of any large engineering project, both technically and commercially. Design requirements in the Canadian nuclear industry are typically numbered lists in multiple documents created using word processing software. As an alternative, GE Nuclear Products implemented a central requirements management database for a major project at Bruce Power, built by configuring the off-the-shelf software product Telelogic Doors to GE's requirements structure. This paper describes the advantages realized by this scheme. Examples include traceability from customer requirements through to test procedures, concurrent engineering, and automated change history. (author)
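
    The traceability chain the abstract describes (customer requirements through to test procedures) can be pictured with a toy relational model. The table and column names below are invented for illustration; they are not Telelogic Doors' actual data model.

```python
import sqlite3

# Illustrative sketch of a requirements-traceability schema
# (hypothetical table names; not the Telelogic Doors data model).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE requirement (
    req_id    TEXT PRIMARY KEY,
    text      TEXT NOT NULL,
    parent_id TEXT REFERENCES requirement(req_id)  -- customer req -> derived req
);
CREATE TABLE test_procedure (
    test_id TEXT PRIMARY KEY,
    req_id  TEXT NOT NULL REFERENCES requirement(req_id),
    title   TEXT NOT NULL
);
""")
conn.executemany("INSERT INTO requirement VALUES (?, ?, ?)", [
    ("CR-1", "Reactor trip within 2 s of signal", None),
    ("DR-7", "Trip relay actuation <= 500 ms", "CR-1"),
])
conn.execute("INSERT INTO test_procedure VALUES ('TP-42', 'DR-7', 'Relay timing test')")

# Trace a customer requirement down to its derived requirements and tests.
rows = conn.execute("""
    SELECT r.req_id, t.test_id
    FROM requirement r JOIN test_procedure t ON t.req_id = r.req_id
    WHERE r.parent_id = 'CR-1'
""").fetchall()
print(rows)  # [('DR-7', 'TP-42')]
```

    In a layout like this, change history and traceability reports become simple queries rather than manual cross-document checks.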

  3. TRENDS: The aeronautical post-test database management system

    Science.gov (United States)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed, and TRENDS' solutions to database management problems are described. Requirements, constraints, and other considerations which led to the system's configuration are discussed, and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  4. Use of SQL Databases to Support Human Resource Management

    OpenAIRE

    Zeman, Jan

    2011-01-01

    This bachelor's thesis focuses on the design of an SQL database to support human resource management and its subsequent creation in MS SQL Server.

  5. Clever generation of rich SPARQL queries from annotated relational schema: application to Semantic Web Service creation for biological databases.

    Science.gov (United States)

    Wollbrett, Julien; Larmande, Pierre; de Lamotte, Frédéric; Ruiz, Manuel

    2013-04-15

    In recent years, a large amount of "-omics" data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic.
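
    The automatic SPARQL generation step can be illustrated with a much-simplified sketch: given column-to-ontology annotations over a relational schema, a query is assembled mechanically. All identifiers here (the `obo:` properties, table and column names) are hypothetical, and this is not BioSemantic's actual algorithm.

```python
# Toy sketch: build a SPARQL SELECT from relational-column annotations,
# in the spirit of (but far simpler than) the BioSemantic approach.
# All ontology property names below are invented for illustration.
annotations = {
    "gene.name":       "obo:gene_symbol",
    "gene.chromosome": "obo:chromosome",
}

def build_sparql(table, columns, annotations):
    """Emit a SELECT query whose triple patterns follow the annotations."""
    vars_ = [f"?{c}" for c in columns]
    patterns = [
        f"  ?{table} {annotations[f'{table}.{c}']} ?{c} ."
        for c in columns
    ]
    return f"SELECT {' '.join(vars_)} WHERE {{\n" + "\n".join(patterns) + "\n}"

query = build_sparql("gene", ["name", "chromosome"], annotations)
print(query)
```

    The generated query can then be exposed behind a Web Service endpoint, which is the role the framework's SAWSDL-annotated services play.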

  6. Clever generation of rich SPARQL queries from annotated relational schema: application to Semantic Web Service creation for biological databases

    Science.gov (United States)

    2013-01-01

    Background In recent years, a large amount of “-omics” data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. Results We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. Conclusions BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic. PMID:23586394

  7. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    Science.gov (United States)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is a hybrid approach that combines the best practices of the relational model, with its SQL queries, and the object-oriented, semantic paradigm for supporting complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to manage the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
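
    One way to picture the schema-less idea is to shred a semi-structured document into generic node rows, so that a single SQL query can search context (the tag path) and content (the text) together. This is only a sketch of the concept; the abstract does not show NETMARK's actual storage model.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Sketch: decompose an XML document into (path, content) rows so keyword
# search spans both structure and text. Table layout is illustrative only.
doc = ET.fromstring(
    "<report><title>Flight test</title><body>rotor data</body></report>")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (path TEXT, content TEXT)")

def shred(elem, prefix=""):
    """Recursively store each element as one row keyed by its tag path."""
    path = f"{prefix}/{elem.tag}"
    conn.execute("INSERT INTO node VALUES (?, ?)", (path, (elem.text or "").strip()))
    for child in elem:
        shred(child, path)

shred(doc)
hits = conn.execute(
    "SELECT path FROM node WHERE content LIKE '%rotor%'").fetchall()
print(hits)  # [('/report/body',)]
```

    Because no document-specific schema is declared, arbitrary hierarchies (XML, HTML) can be loaded without migrations, which is the "schema-less extension" idea in the title.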

  8. PACSY, a relational database management system for protein structure and chemical shift analysis.

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
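
    The linkage of table types by key identification numbers can be pictured with a miniature example: two tables from different source databases joined on a shared key, so a single query combines coordinates with chemical shifts. The table and column names are invented for illustration; PACSY's real schema has six table types.

```python
import sqlite3

# Hypothetical miniature of the PACSY linkage idea: tables from different
# sources share a key id, so one query combines structure and shift data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE coord  (key_id INTEGER, atom TEXT, x REAL, y REAL, z REAL);
CREATE TABLE cshift (key_id INTEGER, atom TEXT, shift_ppm REAL);
""")
conn.execute("INSERT INTO coord  VALUES (1, 'CA', 12.1, 3.4, -7.9)")
conn.execute("INSERT INTO cshift VALUES (1, 'CA', 56.2)")

# One query pulls coordinates and the chemical shift for the same atom.
row = conn.execute("""
    SELECT c.atom, c.x, s.shift_ppm
    FROM coord c JOIN cshift s ON s.key_id = c.key_id AND s.atom = c.atom
""").fetchone()
print(row)  # ('CA', 12.1, 56.2)
```

    The same join pattern, scaled up under MySQL or PostgreSQL, is what lets users search for combinations of information from different database sources.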

  9. PACSY, a relational database management system for protein structure and chemical shift analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Woonghee, E-mail: whlee@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States); Yu, Wookyung [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Kim, Suhkmann [Pusan National University, Department of Chemistry and Chemistry Institute for Functional Materials (Korea, Republic of); Chang, Iksoo [Center for Proteome Biophysics, Pusan National University, Department of Physics (Korea, Republic of); Lee, Weontae, E-mail: wlee@spin.yonsei.ac.kr [Yonsei University, Structural Biochemistry and Molecular Biophysics Laboratory, Department of Biochemistry (Korea, Republic of); Markley, John L., E-mail: markley@nmrfam.wisc.edu [University of Wisconsin-Madison, National Magnetic Resonance Facility at Madison, and Biochemistry Department (United States)

    2012-10-15

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  10. PACSY, a relational database management system for protein structure and chemical shift analysis

    Science.gov (United States)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636

  11. PACSY, a relational database management system for protein structure and chemical shift analysis

    International Nuclear Information System (INIS)

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L.

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.

  12. An Interoperable Cartographic Database

    OpenAIRE

    Slobodanka Ključanin; Zdravko Galić

    2007-01-01

    The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and visualizing them on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  13. Database management systems understanding and applying database technology

    CERN Document Server

    Gorman, Michael M

    1991-01-01

    Database Management Systems: Understanding and Applying Database Technology focuses on the processes, methodologies, techniques, and approaches involved in database management systems (DBMSs). The book first takes a look at ANSI database standards and DBMS applications and components. The discussion focuses on application components and DBMS components, implementing the dynamic relationship application, problems and benefits of dynamic relationship DBMSs, the nature of a dynamic relationship application, ANSI/NDL, and DBMS standards. The manuscript then ponders the logical database, interrogation, and phy...

  14. Value Co-creation as Part of an Integrative Vision of Innovation Management

    DEFF Research Database (Denmark)

    Gerstlberger, Wolfgang; Knudsen, Mette Præst; Tanev, Stoyan

    2009-01-01

    Value co-creation is an emerging concept in business, marketing and innovation management. Its growing interest points to the emergence of a new semantic wave in innovation research that requires the adoption of new terminology, frameworks and fields of research exploration. There is a number ... an attempt to position the value co-creation paradigm within an integrative vision for innovation management research and practices. This positioning is a challenging task, as the meanings of the terms "value co-creation" and "integrative" innovation management need to be more fully clarified. We attempt to identify an appropriate plane of conceptual integrity that could be used to describe the innovation management field within the context of its relation to value co-creation.

  15. An Interoperable Cartographic Database

    Directory of Open Access Journals (Sweden)

    Slobodanka Ključanin

    2007-05-01

    Full Text Available The concept of producing a prototype of an interoperable cartographic database is explored in this paper, including the possibilities of integrating different geospatial data into the database management system and visualizing them on the Internet. The implementation includes vectorization of the concept of a single map page, creation of the cartographic database in an object-relational database, spatial analysis, and definition and visualization of the database content in the form of a map on the Internet.

  16. The Role of Knowledge Creation and Its Dimensions in Management Skills of Managers of Tabriz University of Medical Sciences

    Directory of Open Access Journals (Sweden)

    Mohammad-Ali Hemmati

    2016-01-01

    Full Text Available Background and Objectives: The purpose of this study was to examine the role of knowledge creation and its dimensions in the management skills of managers at Tabriz University of Medical Sciences. Material and Methods: This was a descriptive correlational study. The statistical population consisted of all managers (140 managers) at Tabriz University of Medical Sciences. The census sampling method was used due to the limited statistical population. The data were collected through the management skills and knowledge creation questionnaires developed by Goudarzi (2002). The reliabilities were 0.933 and 0.792, respectively, using Cronbach's alpha. The validity of the questionnaire was verified by management faculty members. Pearson correlation and multiple regression analysis were used to analyze the data. Results: Results showed that there was a positive relationship between knowledge creation and the management skills of the managers. In addition, there was a positive and significant relationship between the management skills indicators (human, conceptual and technical) and the knowledge creation variables. Multiple regression results indicated that the knowledge creation dimensions had a predictive role in human, conceptual and technical skills. The significant relationship between knowledge creation and management skills indicates that managers should have access to up-to-date knowledge and promote it at all levels within the organization, in order to improve staff and organizational creativity. Conclusion: The results demonstrated that enhancing organizational knowledge creation and its dimensions leads to improvement of management skills. Managers need dynamic capabilities to move towards knowledge creation, and to achieve these capabilities they must make the best use of the organization's available and potential resources to identify, acquire, apply, integrate and combine information, knowledge and skills.
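
    The reliability figures quoted in the record (0.933 and 0.792) are Cronbach's alpha values, which can be computed directly from item scores. The scores in this sketch are made up for illustration; they are not the study's data.

```python
from statistics import pvariance

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
def cronbach_alpha(items):
    """items: one inner list of scores per questionnaire item (same respondents)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# 3 items x 4 respondents, invented Likert-style scores.
items = [[3, 4, 5, 4], [2, 4, 5, 3], [3, 5, 5, 4]]
print(round(cronbach_alpha(items), 3))  # 0.953
```

    Values near 0.9, as reported for the management skills questionnaire, indicate high internal consistency among the items.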

  17. Measuring and Managing Value Co-Creation Process: Overview of Existing Theoretical Models

    Directory of Open Access Journals (Sweden)

    Monika Skaržauskaitė

    2013-08-01

    Full Text Available Purpose. The purpose of the article is to provide a holistic view of the concept of value co-creation and of existing models for measuring and managing it, through a theoretical analysis of scientific literature sources aimed at integrating the various approaches. The most important and relevant results of the literature study are presented, with a focus on the changed roles of organizations and consumers. The article aims to contribute theoretically to the research stream on measuring the co-creation of value, in order to gain knowledge for improving organizational performance and enabling new and innovative means of value creation. Design/methodology/approach. The nature of this research is exploratory: a theoretical analysis and synthesis of scientific literature sources targeting the integration of various approaches was performed. This approach was chosen due to the absence of an established theory on models of co-creation, their possible uses in organizations, and a systematic overview of tools measuring (or suggesting how to measure) co-creation. Findings. While the principles of managing and measuring co-creation with regard to consumer motivation and involvement are widely researched, little attempt has been made to identify critical factors and create models dealing with the organizational capabilities and managerial implications of value co-creation. A systematic analysis of the literature revealed a gap not only in empirical research concerning the organization's role in the co-creation process, but at the theoretical and conceptual levels, too. Research limitations/implications. The limitation of this work as a literature review lies in its nature: the complete reliance on previously published research papers and the availability of these studies. For a deeper understanding of co-creation management, and for developing models that can be used in real-life organizations, broader theoretical as well as empirical research is necessary. Practical implications. Analysis of the

  18. ARTISTIC AND SCIENTIFIC CREATION PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioan Tudor

    2013-12-01

    Full Text Available Cultural creation is not ex nihilo; it is a conversion of old "material" into a new form, a re-signifying, or a new combination of preexisting elements. Both art and science have been characterized as acts of dominating and transforming nature: in art the process is real, while in science it is virtual. This process is characterized by an efficient management of the means of expression, which must be subordinated to the message that the work of art is meant to transmit; likewise, the scientist exploits facts in order to make them significant as support, expression and exemplification of the laws of nature. There is also an ontogenesis of management: individual evolution takes place by virtue of a program, thanks to devices with self-regulating capabilities. This aspect may be particularly interesting for cyberneticists. In contemporary civilization, the transfer of certain aspects of the creation process to automated machines implies programming and algorithms. Man is an algorithmic being.

  19. Database development and management

    CERN Document Server

    Chao, Lee

    2006-01-01

    Introduction to Database Systems; Functions of a Database; Database Management System; Database Components; Database Development Process; Conceptual Design and Data Modeling; Introduction to Database Design Process; Understanding Business Process; Entity-Relationship Data Model; Representing Business Process with Entity-Relationship Model; Table Structure and Normalization; Introduction to Tables; Table Normalization; Transforming Data Models to Relational Databases; DBMS Selection; Enforcing Constraints; Creating Database for Business Process; Physical Design and Database...

  20. Column-oriented database management systems

    OpenAIRE

    Možina, David

    2013-01-01

    In this thesis I present column-oriented databases. Among other things, I answer the question of why there is a need for a column-oriented database. In recent years there has been a lot of attention paid to column-oriented databases, even though columnar database management systems date back to the early seventies of the last century. I compare both approaches to database management – a column-oriented database system and a row-oriented database system ...
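
    The row-store/column-store contrast the thesis compares can be pictured with a toy example: the same records stored both ways. An aggregate over one attribute must touch whole tuples in the row layout but only a single sequence in the column layout, which is why column stores suit analytical scans.

```python
# Toy illustration of the two physical layouts (data invented).
rows = [("alice", 30, "vienna"), ("bob", 25, "graz"), ("carol", 35, "linz")]

# Column-oriented layout: one contiguous sequence per attribute.
columns = {
    "name": [r[0] for r in rows],
    "age":  [r[1] for r in rows],
    "city": [r[2] for r in rows],
}

avg_row_store = sum(r[1] for r in rows) / len(rows)        # scans whole tuples
avg_col_store = sum(columns["age"]) / len(columns["age"])  # scans one column
print(avg_row_store, avg_col_store)  # 30.0 30.0
```

    Both layouts give the same answer; the difference is how much data each must read, which dominates performance at scale.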

  1. How to manage enterprise? From creation to rational continuation

    NARCIS (Netherlands)

    Hans Broekhuis; Louise van Weerden

    2009-01-01

    There is a difference between enterprise and management. Enterprise is about creation, and management is the rational continuation of enterprise. Being rational comes naturally to entrepreneurs, but a good entrepreneur has to develop both aspects. Achieving this is an important aspect of management.

  2. Applying the knowledge creation model to the management of ...

    African Journals Online (AJOL)

    In present-day society, the need to manage indigenous knowledge is widely recognised. However, there is a debate in progress on whether or not indigenous knowledge can be easily managed. The purpose of this paper is to examine the possibility of using knowledge management models like knowledge creation theory ...

  3. A Curriculum of Value Creation and Management in Engineering

    Science.gov (United States)

    Yannou, Bernard; Bigand, Michel

    2004-01-01

    As teachers and researchers belonging to two sister French engineering schools, we are convinced that the processes of value creation and management are essential in today's teaching of industrial engineering and project managers. We believe that such processes may be embedded in a three-part curriculum composed of value management and innovation…

  4. The Role of External Involvement in the Creation of Management Innovations

    DEFF Research Database (Denmark)

    Mol, Michael J.; Birkinshaw, Julian

    2014-01-01

    There has recently been renewed scholarly interest in management innovating, the creation of new organizational practices, structures, processes and techniques. We suggest that external involvement in the process of management innovating can transpire in three different ways: direct input from ... no clear effect. Furthermore, the three forms of involvement act to a large degree as substitutes. We contribute new theoretical arguments for the facilitators of management innovation, demonstrate the usefulness of an open innovation lens to the study of management innovation, show that management innovating is a relatively complex form of strategic process, and highlight how the creation of management innovations is similar to and different from the genesis of other types of innovation.

  5. Applications of GIS and database technologies to manage a Karst Feature Database

    Science.gov (United States)

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications, in both GIS and database management system (DBMS) environments, have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to the development of GIS-based databases for analyzing and managing geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst, and the long-term goal is to expand this database to manage and study karst features at national and global scales.
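
    The transactional manipulation via SQL mentioned in the abstract can be sketched as follows. The schema is invented for illustration, and SQLite stands in for the production DBMS; the point is only that grouped updates either commit together or roll back together.

```python
import sqlite3

# Sketch of transactional updates of karst feature records (schema invented).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE karst_feature (id INTEGER PRIMARY KEY, kind TEXT, county TEXT)")

try:
    with conn:  # commits on success, rolls back on exception
        conn.execute("INSERT INTO karst_feature VALUES (1, 'sinkhole', 'Fillmore')")
        conn.execute("INSERT INTO karst_feature VALUES (2, 'spring', 'Winona')")
except sqlite3.Error:
    pass  # on failure, neither row would be present

count = conn.execute("SELECT COUNT(*) FROM karst_feature").fetchone()[0]
print(count)  # 2
```

    The data logs and regular backups described in the abstract complement this: transactions protect consistency within a session, while logs and backups support recovery across sessions.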

  6. Generalized Database Management System Support for Numeric Database Environments.

    Science.gov (United States)

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  7. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    Science.gov (United States)

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  8. [Creation and management of organizational knowledge].

    Science.gov (United States)

    Shinyashiki, Gilberto Tadeu; Trevizan, Maria Auxiliadora; Mendes, Isabel Amélia

    2003-01-01

    With a view to creating and establishing a sustainable position of competitive advantage, the best organizations are increasingly investing in the application of concepts such as learning, knowledge and competency. An organization's creation or acquisition of knowledge about its actions represents an intangible resource capable of conferring a competitive advantage upon it. This knowledge derives from interactions developed in the learning processes that occur in the organizational environment. The more specific the characteristics this knowledge demonstrates in relation to the organization, the more it will become the foundation of its core competencies and, consequently, an important strategic asset. This article emphasizes nurses' role in the process of knowledge management, placing them at the intersection between horizontal and vertical information levels as well as in the creation of a sustainable competitive advantage. The authors believe that this contribution may represent an opportunity for reflection about its implications for the scenarios of health and nursing practices.

  9. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). It provides a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System, manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  10. Value Creation Logics and Internationalization of Service Firms

    DEFF Research Database (Denmark)

    Ørberg Jensen, Peter D.; Petersen, Bent

    2014-01-01

    be based on a thorough understanding of the fundamental nature of these firms. Design/methodology/approach - Theoretical study. Findings - We put forward propositions concerning the pace of internationalization and the default foreign operation modes in service firms. Research limitations/implications - The use of value creation logics can be a useful complement to the conventional approaches to the study of service firms’ internationalization. However, the fact that most firms encompass more than one value creation logic complicates the use of firm databases and industry statistics. Practical implications - We suggest that managers in service firms should consider primarily the nature of the value creation logic(s) in their firms when deciding and designing an internationalization strategy. Originality/value - The study presents a novel theoretical approach and a set of propositions on service firm ...

  11. Selection of nuclear power information database management system

    International Nuclear Information System (INIS)

    Zhang Shuxin; Wu Jianlei

    1996-01-01

    Given the present state of database technology, in order to build the Chinese nuclear power information database (NPIDB) for the nuclear industry efficiently and from a high starting point, an important task is to select a proper database management system (DBMS), which is the key to building the database successfully. Therefore, this article explains how to build a practical nuclear power information database, the functions of different database management systems, the reasons for selecting a relational database management system (RDBMS), the principles of selecting an RDBMS, and the recommendation of the ORACLE management system as the software with which to build the database

  12. Records Management Database

    Data.gov (United States)

    US Agency for International Development — The Records Management Database is a tool created in Microsoft Access specifically for USAID use. It contains metadata in order to access and retrieve the information...

  13. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  14. Management of virtualized infrastructure for physics databases

    International Nuclear Information System (INIS)

    Topurov, Anton; Gallerani, Luigi; Chatal, Francois; Piorkowski, Mariusz

    2012-01-01

    Demands for information storage of physics metadata are rapidly increasing together with the requirements for its high availability. Most of the HEP laboratories are struggling to squeeze more from their computer centers, thus focus on virtualizing available resources. CERN started investigating database virtualization in early 2006, first by testing database performance and stability on native Xen. Since then we have been closely evaluating the constantly evolving functionality of virtualisation solutions for database and middle tier together with the associated management applications – Oracle's Enterprise Manager and VM Manager. This session will detail our long experience in dealing with virtualized environments, focusing on newest Oracle OVM 3.0 for x86 and Oracle Enterprise Manager functionality for efficiently managing your virtualized database infrastructure.

  15. Database design and database administration for a kindergarten

    OpenAIRE

    Vítek, Daniel

    2009-01-01

    The bachelor thesis deals with the creation of a database design for a standard kindergarten, the installation of the designed database into the database system Oracle Database 10g Express Edition, and a demonstration of the administration tasks in this database system. The database design was verified by means of a purpose-developed access application.

  16. Microcomputer Database Management Systems for Bibliographic Data.

    Science.gov (United States)

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  17. A Support Database System for Integrated System Health Management (ISHM)

    Science.gov (United States)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between
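A self-referencing table is one common way to realize the kind of "system hierarchy model" the HADS abstract describes. The sketch below, using Python's built-in sqlite3, is illustrative only: the table, component names, and query are invented for this example, not taken from the actual HADS schema.

```python
import sqlite3

# Hypothetical sketch of a "system hierarchy model" table: each component row
# points to its parent, mirroring physical containment on the test stand.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE component (
        id        INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        parent_id INTEGER REFERENCES component(id)  -- NULL for the root system
    )
""")
conn.executemany("INSERT INTO component VALUES (?, ?, ?)", [
    (1, "MTTP test stand", None),
    (2, "Methane thruster", 1),
    (3, "Chamber pressure sensor", 2),
])

# Walk the hierarchy from a leaf sensor back to the root using a recursive CTE.
path = [name for (name,) in conn.execute("""
    WITH RECURSIVE lineage(id, name, parent_id) AS (
        SELECT id, name, parent_id FROM component WHERE id = 3
        UNION ALL
        SELECT c.id, c.name, c.parent_id
        FROM component c JOIN lineage l ON c.id = l.parent_id
    )
    SELECT name FROM lineage
""")]
print(path)  # ['Chamber pressure sensor', 'Methane thruster', 'MTTP test stand']
```

The recursive query lets support applications (e.g., anomaly detection) locate any sensor's position in the physical system without duplicating structural information.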

  18. Managing the BABAR Object Oriented Database

    International Nuclear Information System (INIS)

    Hasan, Adil

    2002-01-01

    The BaBar experiment stores its data in an Object Oriented federated database supplied by Objectivity/DB(tm). This database is currently 350TB in size and is expected to increase considerably as the experiment matures. Management of this database requires careful planning and specialized tools in order to make the data available to physicists in an efficient and timely manner. We discuss the operational issues and management tools that were developed during the previous run to deal with this vast quantity of data at SLAC

  19. Creation of the NaSCoRD Database

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stuart, William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    This report was written as part of a United States Department of Energy (DOE), Office of Nuclear Energy, Advanced Reactor Technologies program funded project to re-create the capabilities of the legacy Centralized Reliability Database Organization (CREDO) database. The CREDO database provided a record of component design and performance documentation across various systems that used sodium as a working fluid. Regaining this capability will allow the DOE complex and the domestic sodium reactor industry to better understand how previous systems were designed and built, in order to improve the design and operation of future loops. The contents of this report include: an overview of the current state of domestic sodium reliability databases; a summary of the ongoing effort to improve, understand, and process the CREDO information; a summary of the initial efforts to develop a unified sodium reliability database called the Sodium System Component Reliability Database (NaSCoRD); and an explanation of how potential users can access the domestic sodium reliability databases and the type of information that can be retrieved from them.

  20. Development of operation management database for research reactors

    International Nuclear Information System (INIS)

    Zhang Xinjun; Chen Wei; Yang Jun

    2005-01-01

    An operation management database for a pulsed reactor has been developed on the Microsoft Visual C++ 6.0 platform. The database includes four function modules: fuel element management, incident management, experiment management, and file management. It is essential for reactor security and information management. (authors)

  1. Creation and use of knowledge management in academic libraries ...

    African Journals Online (AJOL)

    Creation and use of knowledge management in academic libraries: Usmanu ... one of the important topics today in both the industrial and information research world. Huge amount of data and information are being processed and needed to ... the overall progress and development of academic libraries as well as its impact ...

  2. Co-Creation

    DEFF Research Database (Denmark)

    Degnegaard, Rex

    2012-01-01

    Co-creation as a concept has gained ground over the past 10 years. In practice as well as in the literature, co-creation is climbing the agenda in relation to contemporary opportunities and challenges within management, organization development, and change initiatives. However, there is very little research-based literature on the development of co-creation. This paper aims to build an overview of the literature on co-creation, to explore what the existing literature relates to, and to pinpoint whether any patterns or streams can be identified. A main finding from the analysis is how co-creation tends...

  3. Critical processes of knowledge management: An approach toward the creation of customer value

    Directory of Open Access Journals (Sweden)

    Ignacio Cepeda-Carrion

    2017-01-01

    The aim of this article is to contribute to the literature by identifying and analyzing possible combinations of critical knowledge management processes (absorptive capacity, knowledge transfer and knowledge application) that result in the creation of superior customer value. The main research question this work addresses is: given that customers demand greater value every day, how can organizations create more value for customers from their knowledge management processes and the combination of them? We propose that the combination of the three knowledge management processes builds a dynamic or higher-order capability that results in the creation of superior value for customers.

  4. Content And Multimedia Database Management Systems

    NARCIS (Netherlands)

    de Vries, A.P.

    1999-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. The main characteristic of the ‘database approach’ is that it increases the value of data by its emphasis on data

  5. [The future of clinical laboratory database management system].

    Science.gov (United States)

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  6. Legacy2Drupal - Conversion of an existing oceanographic relational database to a semantically enabled Drupal content management system

    Science.gov (United States)

    Maffei, A. R.; Chandler, C. L.; Work, T.; Allen, J.; Groman, R. C.; Fox, P. A.

    2009-12-01

    Content Management Systems (CMSs) provide powerful features that can be of use to oceanographic (and other geo-science) data managers. However, in many instances, geo-science data management offices have previously designed customized schemas for their metadata. The WHOI Ocean Informatics initiative and the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) have jointly sponsored a project to port an existing relational database containing oceanographic metadata, along with an existing interface coded in ColdFusion middleware, to a Drupal 6 Content Management System. The goal was to translate all the existing database tables, input forms, website reports, and other features present in the existing system to employ Drupal CMS features. The replacement features include Drupal content types, CCK node-reference fields, themes, RDB, SPARQL, workflow, and a number of other supporting modules. Strategic use of some Drupal 6 CMS features enables three separate but complementary interfaces that provide access to oceanographic research metadata via the MySQL database: 1) a Drupal 6-powered front-end; 2) a standard SQL port (used to provide a MapServer interface to the metadata and data); and 3) a SPARQL port (feeding a new faceted search capability being developed). Future plans include the creation of science ontologies, by scientist/technologist teams, that will drive the semantically enabled faceted search capabilities planned for the site. Incorporation of the semantic technologies included in the future Drupal 7 core release is also anticipated. Using a public-domain CMS as opposed to proprietary middleware, and taking advantage of the many features of Drupal 6 that are designed to support semantically enabled interfaces, will help prepare the BCO-DMO database for interoperability with other ecosystem databases.
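The faceted search the record mentions boils down to counting metadata records per facet value so the interface can offer drill-down filters. A minimal sketch of that idea, with an invented toy schema and sample values (the real BCO-DMO system sits on MySQL with a SPARQL port, not on this table):

```python
import sqlite3

# Toy metadata table standing in for an oceanographic metadata database.
# Schema, program names, and parameters are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE dataset (id INTEGER PRIMARY KEY, program TEXT, parameter TEXT)")
db.executemany("INSERT INTO dataset VALUES (?, ?, ?)", [
    (1, "US GLOBEC", "chlorophyll"),
    (2, "US GLOBEC", "temperature"),
    (3, "US JGOFS",  "chlorophyll"),
])

# One facet: how many datasets fall under each program?
facet = dict(db.execute("SELECT program, COUNT(*) FROM dataset GROUP BY program"))
print(facet)  # {'US GLOBEC': 2, 'US JGOFS': 1}
```

In a SPARQL-backed deployment the same counts would come from a `GROUP BY` aggregate query over the triple store rather than SQL, but the facet-count structure presented to the user is the same.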

  7. Reexamining Operating System Support for Database Management

    OpenAIRE

    Vasil, Tim

    2003-01-01

    In 1981, Michael Stonebraker [21] observed that database management systems written for commodity operating systems could not effectively take advantage of key operating system services, such as buffer pool management and process scheduling, due to expensive overhead and lack of customizability. The “not quite right” fit between these kernel services and the demands of database systems forced database designers to work around such limitations or re-implement some kernel functionality in user ...

  8. Study on managing EPICS database using ORACLE

    International Nuclear Information System (INIS)

    Liu Shu; Wang Chunhong; Zhao Jijiu

    2007-01-01

    EPICS is used as the development toolkit of the BEPCII control system. The core of EPICS is a distributed database residing in front-end machines. The distributed database is usually created with tools such as VDCT or a text editor on the host, then loaded into the front-end target IOCs through the network. In the BEPCII control system there are about 20,000 signals, distributed across more than 20 IOCs. All the databases are developed by device control engineers using VDCT or a text editor; there is no uniform tool providing transparent management. The paper first presents the current status of EPICS database management in many labs. Second, it studies the EPICS database and the interface between ORACLE and the EPICS database. Finally, it introduces the software development and its application in the BEPCII control system. (authors)

  9. Wireless Sensor Networks Database: Data Management and Implementation

    Directory of Open Access Journals (Sweden)

    Ping Liu

    2014-04-01

    Data management and processing are the core applications of wireless sensor network technology and have become a research hotspot for new databases. This article mainly studies data management in wireless sensor networks. In connection with the characteristics of the data in wireless sensor networks, it discusses wireless sensor network data query and data integration technology in depth, proposes a mobile database structure based on wireless sensor networks, and carries out the overall design and implementation of the data management system. In order to implement the communication rules of the routing trees described above, the network manager uses a simple routing-tree maintenance algorithm. The design covers the ordinary node end, the server end of the mobile database at gathering nodes, and the mobile client end that implement the system, focusing on the design of the query manager, the storage module, and the synchronization module at the server end of the mobile database at gathering nodes.
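A simple routing-tree maintenance scheme of the kind the abstract alludes to can be sketched as a breadth-first flood from the gathering (sink) node, with each sensor adopting the first neighbor it hears from as its parent. The topology and node ids below are invented for illustration; real maintenance must also handle node failure and re-parenting.

```python
from collections import deque

# Invented neighbor lists for a tiny sensor network; "sink" is the gathering node.
neighbors = {
    "sink": ["a", "b"],
    "a": ["sink", "c"],
    "b": ["sink", "c"],
    "c": ["a", "b", "d"],
    "d": ["c"],
}

def build_routing_tree(neighbors, sink="sink"):
    """Flood a tree-construction message outward; first message heard wins."""
    parent = {sink: None}
    queue = deque([sink])
    while queue:
        node = queue.popleft()
        for n in neighbors[node]:
            if n not in parent:
                parent[n] = node
                queue.append(n)
    return parent

def route_to_sink(parent, node):
    """Path along which a reading is forwarded up to the gathering node."""
    path = [node]
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path

parent = build_routing_tree(neighbors)
print(route_to_sink(parent, "d"))  # ['d', 'c', 'a', 'sink']
```

Because parents are assigned on first contact, each flood also serves as a cheap maintenance pass: re-running it after a topology change rebuilds the tree.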

  10. Value Creation Logics and Internationalization of Service Firms

    DEFF Research Database (Denmark)

    Ørberg Jensen, Peter D.; Petersen, Bent

    2014-01-01

    While mainstream theories in international business and management are founded either explicitly or implicitly on studies of manufacturing firms, prior attempts to develop theory on the internationalization of service firms are sparse and have yet to establish solid and comprehensive frameworks based on a thorough understanding of the fundamental nature of these firms. We put forward ten propositions concerning the pace of internationalization in service firms and the dominant foreign operation modes. The use of value creation logics can be a useful complement to the conventional approaches to the study of service firms’ internationalization. However, the fact that most firms encompass more than one value creation logic complicates the use of firm databases and industry statistics. The study presents a novel theoretical approach and a set of propositions on service firm internationalization founded on a thorough understanding of the fundamental nature of these firms.

  11. Managing the BaBar object oriented database

    International Nuclear Information System (INIS)

    Hasan, A.; Trunov, A.

    2001-01-01

    The BaBar experiment stores its data in an Object Oriented federated database supplied by Objectivity/DB(tm). This database is currently 350TB in size and is expected to increase considerably as the experiment matures. Management of this database requires careful planning and specialized tools in order to make the data available to physicists in an efficient and timely manner. The authors discuss the operational issues and management tools that were developed during the previous run to deal with this vast quantity of data at SLAC

  12. Access database application in medical treatment management platform

    International Nuclear Information System (INIS)

    Wu Qingming

    2014-01-01

    For timely, accurate and flexible access to medical expense data, we applied Microsoft Access 2003 database management software and established a management platform for medical expenses. With this platform, overall hospital medical expenses can be controlled, achieving real-time monitoring of medical expenses. Using the Access database management platform for medical expenses not only changes the management model but also promotes a sound management system for medical expenses. (authors)

  13. Database basic design for safe management radioactive waste

    International Nuclear Information System (INIS)

    Son, D. C.; Ahn, K. I.; Jung, D. J.; Cho, Y. B.

    2003-01-01

    As the amount of radioactive waste and related information to be managed increases, some organizations are attempting or planning to computerize the management of radioactive waste. Considering that information on the safe management of radioactive waste should be used in association with the national radioactive waste management project, standardization of data formats and protocols is required. The Korea Institute of Nuclear Safety (KINS) will establish and operate a nationwide integrated database in order to effectively manage the large amount of information on national radioactive waste. This database allows not only the tracing and management of trends in radioactive waste generation and storage but also the production of reliable analyses of the quantities accumulated. Consequently, it can provide the information necessary for national radioactive waste management policy and related industry planning. This study explains the database design, which is the essential element of information management

  14. Performance Enhancements for Advanced Database Management Systems

    OpenAIRE

    Helmer, Sven

    2000-01-01

    New applications have emerged, demanding database management systems with enhanced functionality. However, high performance is a necessary precondition for the acceptance of such systems by end users. In this context we developed, implemented, and tested algorithms and index structures for improving the performance of advanced database management systems. We focused on index structures and join algorithms for set-valued attributes.

  15. The ATLAS Distributed Data Management System & Databases

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Barisits, M; Beermann, T; Vigne, R; Serfon, C

    2013-01-01

    The ATLAS Distributed Data Management (DDM) System is responsible for the global management of petabytes of high energy physics data. The current system, DQ2, has a critical dependency on Relational Database Management Systems (RDBMS), like Oracle. RDBMS are well-suited to enforcing data integrity in online transaction processing applications, however, concerns have been raised about the scalability of its data warehouse-like workload. In particular, analysis of archived data or aggregation of transactional data for summary purposes is problematic. Therefore, we have evaluated new approaches to handle vast amounts of data. We have investigated a class of database technologies commonly referred to as NoSQL databases. This includes distributed filesystems, like HDFS, that support parallel execution of computational tasks on distributed data, as well as schema-less approaches via key-value stores, like HBase. In this talk we will describe our use cases in ATLAS, share our experiences with various databases used ...

  16. Views of Community Managers on Knowledge Co-creation in Online Communities for People With Disabilities: Qualitative Study.

    Science.gov (United States)

    Amann, Julia; Rubinelli, Sara

    2017-10-10

    The use of online communities to promote end user involvement and co-creation in the product and service innovation process is well documented in the marketing and management literature. Whereas online communities are widely used for health care service provision and peer-to-peer support, little is known about how they could be integrated into the health care innovation process. The overall objective of this qualitative study was to explore community managers' views on and experiences with knowledge co-creation in online communities for people with disabilities. A descriptive qualitative research design was used. Data were collected through semi-structured interviews with nine community managers. To complement the interview data, additional information was retrieved from the communities in the form of structural information (number of registered users, number and names of topic areas covered by the forum) and administrative information (terms and conditions and privacy statements, forum rules). Data were analyzed using thematic analysis. Our results highlight two main aspects: peer-to-peer knowledge co-creation and types of collaboration with external actors. Although community managers strongly encouraged peer-to-peer knowledge co-creation, our findings indicated that these activities were not common practice in the communities under investigation. In fact, much of what related to co-creation, prototyping, and product development was still perceived to be directed by professionals and experts. Community managers described the role of their respective communities as informing this process rather than as a driving force. The role of community members as advisors to researchers, health care professionals, and businesses was discussed in the context of types of collaboration with external actors. According to the community managers, most of the external inquiries related to research projects of students or health care professionals in training, who often joined a

  17. The Politics of Information: Building a Relational Database To Support Decision-Making at a Public University.

    Science.gov (United States)

    Friedman, Debra; Hoffman, Phillip

    2001-01-01

    Describes creation of a relational database at the University of Washington supporting ongoing academic planning at several levels and affecting the culture of decision making. Addresses getting started; sharing the database; questions, worries, and issues; improving access to high-demand courses; the advising function; management of instructional…

  18. Database management in the new GANIL control system

    International Nuclear Information System (INIS)

    Lecorche, E.; Lermine, P.

    1993-01-01

    At the start of the new control system design, the decision was made to manage the huge amount of data by means of a database management system. The first implementations, built on the INGRES relational database, are described. Real-time and data management domains are shown, and problems induced by Ada/SQL interfacing are briefly discussed. Database management concerns the whole hardware and software configuration of the GANIL pieces of equipment and the alarm system, both for the alarm configuration and for the alarm logs. Another field of application encompasses beam parameter archiving as a function of the various kinds of beams accelerated at GANIL (ion species, energies, charge states). (author) 3 refs., 4 figs

  19. Distributed Database Management Systems A Practical Approach

    CERN Document Server

    Rahimi, Saeed K

    2010-01-01

    This book addresses issues related to managing data across a distributed database system. It is unique because it covers traditional database theory and current research, explaining the difficulties in providing a unified user interface and global data dictionary. The book gives implementers guidance on hiding discrepancies across systems and creating the illusion of a single repository for users. It also includes three sample frameworks (implemented using J2SE with JMS, J2EE, and Microsoft .NET) that readers can use to learn how to implement a distributed database management system. IT and

  20. NGNP Risk Management Database: A Model for Managing Risk

    International Nuclear Information System (INIS)

    Collins, John

    2009-01-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool's design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  1. NGNP Risk Management Database: A Model for Managing Risk

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.
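The RMS abstracts describe a relational layout in which each risk carries an SSC, an Area, a baseline rating, and a current status. A minimal sketch of that organization, using sqlite3 with table names, ratings, and sample risks that are entirely invented (the real Access schema is not public):

```python
import sqlite3

# Hypothetical risk register: baseline vs. current rating lets reduction
# progress be tracked per reference-configuration SSC and per Area.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE risk (
        id              INTEGER PRIMARY KEY,
        title           TEXT NOT NULL,
        ssc             TEXT NOT NULL,     -- system/subsystem/component
        area            TEXT NOT NULL,
        baseline_rating INTEGER NOT NULL,  -- risk at baseline (1 low .. 5 high)
        current_rating  INTEGER NOT NULL   -- risk after mitigation to date
    )
""")
db.executemany("INSERT INTO risk VALUES (?, ?, ?, ?, ?, ?)", [
    (1, "Fuel qualification delay", "Reactor core",   "Fuel",      5, 3),
    (2, "IHX material creep",       "Heat transport", "Materials", 4, 4),
    (3, "Graphite supply",          "Reactor core",   "Fuel",      3, 2),
])

# Reduction status per Area: total baseline risk vs. total current risk.
status = list(db.execute("""
    SELECT area, SUM(baseline_rating), SUM(current_rating)
    FROM risk GROUP BY area ORDER BY area
"""))
print(status)  # [('Fuel', 8, 5), ('Materials', 4, 4)]
```

Grouping by `ssc` instead of `area` gives the complementary view the abstract mentions, with no change to the stored data.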

  2. Computerized nuclear material database management system for power reactors

    International Nuclear Information System (INIS)

    Cheng Binghao; Zhu Rongbao; Liu Daming; Cao Bin; Liu Ling; Tan Yajun; Jiang Jincai

    1994-01-01

    The software packages for nuclear material database management for power reactors are described. The database structure, data flow and model for management of the database are analysed. Also mentioned are the main functions and characteristics of the software packages, which are successfully installed and used at both the Daya Bay Nuclear Power Plant and the Qinshan Nuclear Power Plant for the purpose of handling the nuclear material database automatically

  3. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    The development of new and the modernization of existing aviation equipment of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the development challenges of automated control systems for aviation equipment test operations are presented. Such automated control systems are, in essence, automated data banks. The key role of the development of a flight test automated control system in the process of creating automated control systems for aviation equipment test operations is substantiated. The approach of integrating mobile modular measuring complexes and the need for national methodologies and technological standards for database system design concepts are grounded. The database system, as the central element in this scheme, provides collection, storage and updating of the values of the elements described above at the pace and frequency required for monitoring the state of the controlled object. It is the database system that provides the supervisory unit with actual data corresponding to specific moments of time concerning the state processes and assessments of the progress and results of flight experiments, creating the necessary environment for managing and testing aviation equipment as a whole. The basis for the development of subsystems of automated control systems for aviation equipment test operations is the conceptual design process of the respective database system, the implementation effectiveness of which largely determines the level of success and the ability to develop the systems being created. The conclusions and suggestions introduced can be used in the

  4. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
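The "data write permissions tied to user roles and responsibilities" idea in the GOPDb abstract reduces to a role-to-permission mapping checked before any write. The role names and permission map below are invented for illustration; the real application manages access through NAMS and two-factor authentication.

```python
# Hypothetical role-based access check. Each role grants a set of actions;
# a user may hold several roles, and any granting role suffices.
ROLE_PERMISSIONS = {
    "viewer":  {"read"},
    "planner": {"read", "edit_timeline"},
    "admin":   {"read", "edit_timeline", "approve_revision"},
}

def can(user_roles, action):
    """Return True if any of the user's roles grants the requested action."""
    return any(action in ROLE_PERMISSIONS.get(role, set()) for role in user_roles)

print(can(["planner"], "edit_timeline"))    # True
print(can(["viewer"], "approve_revision"))  # False
```

Keeping the mapping in data rather than code is what lets multiple program instances share one deployment while enforcing different responsibilities per user.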

  5. VALOIR 2012 2nd Workshop on Managing the Client Value Creation Process in Agile Projects: Message from the Chairs

    NARCIS (Netherlands)

    Pérez, Jennifer; Buglione, Luigi; Daneva, Maia; Dieste, Oscar; Jedlitschka, Andreas; Juristo, Natalia

    2012-01-01

    Welcome to the 2nd Workshop on Managing the Client Value Creation Process in Agile Projects (VALOIR) at the PROFES 2012 conference! The overall goal of VALOIR is to make the knowledge on value creation and management explicit, encouraging the discussion on the use of measurement and estimation

  6. The Enterprise Value Creation Management Process and Intangible Assets - the CERM/ROIAM Approach (Japanese)

    OpenAIRE

    KARIYA Takeaki

    2006-01-01

    As Robert Redfield has stated, culture is a shared common knowledge and understanding to be manifested in behavior and processed goods. Intangible assets are the fundamental source of enterprise value creation. In this paper we first aim to clarify the compound, complex and multilayered processes whereby intangible assets contribute to value creation, and then aim to formulate the framework of an effective and comprehensive management process referred to as CERM/ROIAM. This approach makes it ...

  7. The cost of wetland creation and restoration. Final report

    Energy Technology Data Exchange (ETDEWEB)

    King, D.; Bohlen, C.

    1995-08-01

    This report examines the economics of wetland creation, restoration, and enhancement projects, especially as they are used within the context of mitigation for unavoidable wetland losses. Complete engineering-cost-accounting profiles of over 90 wetland projects were developed in collaboration with leading wetland restoration and creation practitioners around the country to develop a primary source database. Data on the costs of over 1,000 wetland projects were gathered from published sources and other available databases to develop a secondary source database. Cases in both databases were carefully analyzed and a set of baseline cost per acre estimates were developed for wetland creation, restoration, and enhancement. Observations of costs varied widely, ranging from $5 per acre to $1.5 million per acre. Differences in cost were related to the target wetland type, and to site-specific and project-specific factors that affected the preconstruction, construction, and post-construction tasks necessary to carry out each particular project. Project-specific and site-specific factors had a much larger effect on project costs than wetland type for non-agricultural projects. Costs of wetland creation and restoration were also shown to differ by region, but not by as much as expected, and in response to the regulatory context. The costs of wetland creation, restoration, and enhancement were also analyzed in a broader economic context through examination of the market for wetland mitigation services, and through the development of a framework for estimating compensation ratios-the number of acres of created, restored, or enhanced wetland required to compensate for an acre of lost natural wetland. The combination of per acre creation, restoration, and enhancement costs and the compensation ratio determine the overall mitigation costs associated with alternative mitigation strategies.

  8. Insertion algorithms for network model database management systems

    Science.gov (United States)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and query comparisons are expensive, the efficiency requirement for managing algorithms is to minimize the number of query comparisons. We consider the updating operation for network model database management systems. We develop a new sequential algorithm for the updating operation. We also suggest a distributed version of the algorithm.
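Minimizing query comparisons in a partially ordered schema can exploit transitivity: once an element is known to lie below the new one, none of its descendants need to be compared. The sketch below is an illustration of that pruning idea, not the authors' actual algorithm; the divisibility example and all names are invented:

```python
def maximal_below(roots, children, leq, new):
    """Find the maximal existing elements x with leq(x, new), traversing the
    schema DAG from its roots and skipping descendants of any element already
    known to be below `new` (transitivity makes those comparisons redundant)."""
    found, seen = [], set()
    stack = list(roots)
    while stack:
        x = stack.pop()
        if x in seen:
            continue
        seen.add(x)
        if leq(x, new):                      # one query comparison per visit
            found.append(x)                  # descendants of x are dominated
        else:
            stack.extend(children.get(x, []))
    # drop elements dominated by another found element
    return [x for x in found if not any(y != x and leq(x, y) for y in found)]

# Example: integers ordered by divisibility; edges point from an element to
# the immediate divisors stored below it in the graph.
children = {12: [6, 4], 8: [4], 6: [3, 2], 4: [2], 3: [], 2: []}
divides = lambda x, y: y % x == 0
```

For instance, inserting 6 into this graph needs no comparison against 3 or 2 once 6 itself matches, which is the kind of saving that matters when each comparison is an expensive query.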

  9. Database management system for large container inspection system

    International Nuclear Information System (INIS)

    Gao Wenhuan; Li Zheng; Kang Kejun; Song Binshan; Liu Fang

    1998-01-01

    Large Container Inspection System (LCIS) based on radiation imaging technology is a powerful tool for the Customs to check the contents inside a large container without opening it. The author has discussed a database application system, as a part of the Signal and Image System (SIS), for the LCIS. The basic requirements analysis was done first. Then the selections of computer hardware, operating system, and database management system were made according to the prevailing technology and market conditions. Based on the above considerations, a database application system with central management and distributed operation features has been implemented

  10. Management system of instrument database

    International Nuclear Information System (INIS)

    Zhang Xin

    1997-01-01

    The author introduces a management system of an instrument database. This system has been developed with FoxPro on a network. The system has characteristics such as a clear structure, easy operation, and flexible and convenient querying, as well as data safety and reliability

  11. The use of modern databases in managing nuclear material inventories

    International Nuclear Information System (INIS)

    Behrens, R.G.

    1994-01-01

    The need for a useful nuclear materials database to assist in the management of nuclear materials within the Department of Energy (DOE) Weapons Complex is becoming significantly more important as the mission of the DOE Complex changes and both international safeguards and storage issues become drivers in determining how these materials are managed. A well designed nuclear material inventory database can provide the Nuclear Materials Manager with an essential cost effective tool for timely analysis and reporting of inventories. This paper discusses the use of databases as a management tool to meet increasing requirements for accurate and timely information on nuclear material inventories and related information. From the end user perspective, this paper discusses the rationale, philosophy, and technical requirements for an integrated database to meet the needs for a variety of users such as those working in the areas of Safeguards, Materials Control and Accountability (MC&A), Nuclear Materials Management, Waste Management, materials processing, packaging and inspection, and interim/long term storage

  12. Environment: General; Grammar & Usage; Money Management; Music History; Web Page Creation & Design.

    Science.gov (United States)

    Web Feet, 2001

    2001-01-01

    Describes Web site resources for elementary and secondary education in the topics of: environment, grammar, money management, music history, and Web page creation and design. Each entry includes an illustration of a sample page on the site and an indication of the grade levels for which it is appropriate. (AEF)

  13. Dictionary as Database.

    Science.gov (United States)

    Painter, Derrick

    1996-01-01

    Discussion of dictionaries as databases focuses on the digitizing of The Oxford English dictionary (OED) and the use of Standard Generalized Mark-Up Language (SGML). Topics include the creation of a consortium to digitize the OED, document structure, relational databases, text forms, sequence, and discourse. (LRW)

  14. The Network Configuration of an Object Relational Database Management System

    Science.gov (United States)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.

  15. BDVC (Bimodal Database of Violent Content): A database of violent audio and video

    Science.gov (United States)

    Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro

    2017-09-01

    Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization and retrieval applications of a single type of content like text, voice and images; bimodal databases, by contrast, allow two different types of content, such as audio-video or image-text, to be associated semantically. The generation of a bimodal audio-video database implies the creation of a connection between the multimedia content through the semantic relation that associates the actions of both types of information. This paper describes in detail the characteristics and methodology used for the creation of the bimodal database of violent content; the semantic relationship is established by the proposed concepts that describe the audiovisual information. The use of bimodal databases in applications related to audiovisual content processing increases semantic performance if and only if these applications process both types of content. This bimodal database contains 580 annotated audiovisual segments, with a duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool in the generation of applications for the semantic web.
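One plausible way to store annotated audiovisual segments of the kind described (580 segments across 41 classes) is a relational table keyed by clip, time span and class label. The sketch below is an assumption about the layout, not the BDVC authors' actual schema; all table, column and file names are invented:

```python
import sqlite3

# In-memory database with a hypothetical segment table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE segment (
        id INTEGER PRIMARY KEY,
        video_file TEXT NOT NULL,
        start_s REAL NOT NULL,        -- segment start, seconds
        end_s REAL NOT NULL,          -- segment end, seconds
        class_label TEXT NOT NULL     -- one of the annotated classes
    )""")
conn.executemany(
    "INSERT INTO segment (video_file, start_s, end_s, class_label) VALUES (?,?,?,?)",
    [("clip01.mp4", 0.0, 3.5, "gunshot"),
     ("clip01.mp4", 3.5, 7.0, "scream"),
     ("clip02.mp4", 1.0, 4.0, "gunshot")],
)
# Segments per class, the kind of count needed for class-balance statistics.
per_class = dict(conn.execute(
    "SELECT class_label, COUNT(*) FROM segment GROUP BY class_label"))
```

A layout like this keeps the audio and video annotations for a segment joined through the shared time span, which is what makes the bimodal semantic relation queryable.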

  16. Value Creation by Process-Oriented Project Management

    NARCIS (Netherlands)

    Geijtenbeek, W.; Eekelen, van A.L.M.; Kleine, A.J.; Favie, R.; Maas, G.J.; Milford, R.

    2007-01-01

    The start of a design process based on value creation requires a different approach and new models. The aim of this study is to provide insight into how a design process based on value creation can be initiated. The intended result of the study is the design of a collaboration model that can

  17. The Future of Co-Creation

    DEFF Research Database (Denmark)

    Seppa, Marko; Tanev, Stoyan

    2011-01-01

    The objective of this article is to provide a brief summary of the key directions in value co-creation research that have emerged in the last 10 years. It points to several emerging streams in value co-creation research including: i) general management perspective; ii) new product development...... on business co-creation. The development of business co-creation frameworks integrating the participatory role of both universities and vibrantly emerging business ecosystems represents a valuable alternative to traditional technology transfer and business administration approaches....

  18. Establishment of database system for management of KAERI wastes

    International Nuclear Information System (INIS)

    Shon, J. S.; Kim, K. J.; Ahn, S. J.

    2004-07-01

    Radioactive wastes generated by KAERI have various types, nuclides and characteristics. To manage and control these kinds of radioactive wastes, systematic record management, efficient searching and quick statistics are needed. Information about the radioactive waste generated and stored by KAERI is the basic factor in constructing a rapid information system for national cooperative management of radioactive waste. In this study, the Radioactive Waste Management Integration System (RAWMIS) was developed. It is aimed at managing radioactive waste records, improving management efficiency, and supporting WACID (Waste Comprehensive Integration Database System), which is the national radioactive waste integrated safety management system of Korea. The major information of RAWMIS, driven by user requirements, covers generation, gathering, transfer, treatment, and storage information for solid waste, liquid waste, gaseous waste and waste related to spent fuel. RAWMIS is composed of a database, software (the interface between user and database), and software for a manager, and it was designed with a client/server structure. RAWMIS will be a useful tool for analyzing radioactive waste management and radiation safety management. This system is also developed to share information with associated companies. Moreover, it can be expected to support research and development on radioactive waste treatment technology

  19. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    Science.gov (United States)

    Freeman, Carla; And Others

    In order to understand how the database software or online database functioned in the overall curricula, the use of database management (DBMs) systems was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  20. A Database Management Assessment Instrument

    Science.gov (United States)

    Landry, Jeffrey P.; Pardue, J. Harold; Daigle, Roy; Longenecker, Herbert E., Jr.

    2013-01-01

    This paper describes an instrument designed for assessing learning outcomes in data management. In addition to assessment of student learning and ABET outcomes, we have also found the instrument to be effective for determining database placement of incoming information systems (IS) graduate students. Each of these three uses is discussed in this…

  1. Development of the severe accident risk information database management system SARD

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies

  2. Development of the severe accident risk information database management system SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Kim, Dong Ha

    2003-01-01

    The main purpose of this report is to introduce essential features and functions of a severe accident risk information management system, SARD (Severe Accident Risk Database Management System) version 1.0, which has been developed in Korea Atomic Energy Research Institute, and database management and data retrieval procedures through the system. The present database management system has powerful capabilities that can store automatically and manage systematically the plant-specific severe accident analysis results for core damage sequences leading to severe accidents, and search intelligently the related severe accident risk information. For that purpose, the present database system mainly takes into account the plant-specific severe accident sequences obtained from the Level 2 Probabilistic Safety Assessments (PSAs), base case analysis results for various severe accident sequences (such as code responses and summary for key-event timings), and related sensitivity analysis results for key input parameters/models employed in the severe accident codes. Accordingly, the present database system can be effectively applied in supporting the Level 2 PSA of similar plants, for fast prediction and intelligent retrieval of the required severe accident risk information for the specific plant whose information was previously stored in the database system, and development of plant-specific severe accident management strategies.

  3. Empirical Essays on the Stock Returns, Risk Management, and Liquidity Creation of Banks

    NARCIS (Netherlands)

    Y. Xu (Ying)

    2010-01-01

    textabstractThis thesis consists of three studies that respectively investigate the stock returns, risk management, and liquidity creation of banks. Chapter 2 focuses on the cross-section of bank stock returns and found that pricing factors such as leverage and beta, shown to be irrelevant to

  4. The higher school teaching staff professional development system creation on the adaptive management principles

    Directory of Open Access Journals (Sweden)

    Borova T.A.

    2012-03-01

    Full Text Available The article deals with a theoretical analysis of the creation of a professional development system for higher school teaching staff on adaptive management principles. The background and components of the adaptive management system for higher school teaching staff professional development are determined. The mechanisms for adaptive management of higher school teaching staff professional development - monitoring and coaching - are specified, and their place in the professional development system built on adaptive management principles is shown. The results demonstrating the efficiency of the system are singled out.

  5. Database system selection for marketing strategies support in information systems

    Directory of Open Access Journals (Sweden)

    František Dařena

    2007-01-01

    Full Text Available In today’s dynamically changing environment marketing has a significant role. Creating successful marketing strategies requires a large amount of high quality information of various kinds and data types. A powerful database management system is a necessary condition for supporting marketing strategy creation. The paper briefly describes the field of marketing strategies and specifies the features that database systems should provide in connection with supporting these strategies. Major commercial (Oracle, DB2, MS SQL, Sybase) and open-source (PostgreSQL, MySQL, Firebird) databases are then examined from the point of view of accordance with these characteristics, and a comparison is made. The results are useful for making the decision before acquiring a database system during the specification of an information system’s hardware architecture.
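The kind of accordance check the record above describes - scoring candidate DBMSs against a set of required features - can be sketched as a simple weighted comparison. The features, weights and scores below are invented examples, not the paper's actual criteria or results:

```python
# Hypothetical required features with weights reflecting their importance
# for marketing-strategy support; purely illustrative values.
REQUIRED = {"views": 3, "triggers": 2, "full_text_search": 2, "replication": 1}

def score(candidate_features):
    """Sum the weights of the required features a candidate supports."""
    return sum(w for f, w in REQUIRED.items() if f in candidate_features)

# Invented feature sets; real capabilities must be taken from vendor docs.
candidates = {
    "PostgreSQL": {"views", "triggers", "full_text_search", "replication"},
    "MySQL": {"views", "triggers", "replication"},
}
ranking = sorted(candidates, key=lambda c: score(candidates[c]), reverse=True)
```

A scoring matrix like this makes the acquisition decision auditable: each feature's weight documents why one system was preferred over another.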

  6. Interconnecting heterogeneous database management systems

    Science.gov (United States)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding the effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, the users must have uniform, integrated access to the different DBMs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  7. Kingfisher: a system for remote sensing image database management

    Science.gov (United States)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing amount of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases, with an innovative search tool based on image similarity. This methodology is quite innovative for this application; at present many systems exist for photographic images, for example QBIC and IKONA, but they are not able to properly extract and describe remote sensing image content. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the DBMS (Database Management System) to process a low amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of dimensions (width and height). Local and global content descriptors are compared, during the retrieval phase, with the query image, and the results seem to be very encouraging.
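The similarity search described above reduces, at its core, to ranking archived images by the distance between their content descriptors and those of the query image. The sketch below assumes Euclidean distance over small texture feature vectors; Kingfisher's actual descriptors and metric are not specified here, and all names and values are invented:

```python
import math

def retrieve(query_vec, archive, k=2):
    """Return the k archive image ids whose descriptor vectors are closest
    to the query descriptor (Euclidean distance, smallest first)."""
    dist = lambda v: math.dist(query_vec, v)
    return [img_id for img_id, v in
            sorted(archive.items(), key=lambda kv: dist(kv[1]))[:k]]

# Invented texture descriptors for three archived SAR quick-look images.
archive = {
    "sar_001": [0.9, 0.1, 0.4],
    "sar_002": [0.2, 0.8, 0.5],
    "sar_003": [0.85, 0.15, 0.35],
}
```

Because such descriptors can be computed from quick-look images, the DBMS only ever compares short vectors rather than full-resolution scenes, which is the time and memory saving the abstract points to.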

  8. International database on ageing management and life extension

    International Nuclear Information System (INIS)

    Ianko, L.; Lyssakov, V.; McLachlan, D.; Russell, J.; Mukhametshin, V.

    1995-01-01

    International database on ageing management and life extension for reactor pressure vessel materials (RPVM) is described with the emphasis on the following issues: requirements of the system; design concepts for RPVM database system; data collection, processing and storage; information retrieval and dissemination; RPVM information assessment and evaluation. 1 fig

  9. Nuclear data processing using a database management system

    International Nuclear Information System (INIS)

    Castilla, V.; Gonzalez, L.

    1991-01-01

    A database management system that permits the design of relational models was used to create an integrated database with experimental and evaluated nuclear data. A system that reduces the time and cost of processing was created for EC-type computers or compatibles. A set of programs for the conversion from nuclear calculated data output format to the EXFOR format was developed. A dictionary for performing retrospective searches in the ENDF database was also created

  10. Report of the SRC working party on databases and database management systems

    International Nuclear Information System (INIS)

    Crennell, K.M.

    1980-10-01

    An SRC working party, set up to consider the subject of support for databases within the SRC, was asked to identify interested individuals and user communities, establish which features of database management systems they felt were desirable, arrange demonstrations of possible systems and then make recommendations for systems, funding and likely manpower requirements. This report describes the activities and lists the recommendations of the working party, and contains a list of databases maintained or proposed by those who replied to a questionnaire. (author)

  11. Managing Multiuser Database Buffers Using Data Mining Techniques

    NARCIS (Netherlands)

    Feng, L.; Lu, H.J.

    2004-01-01

    In this paper, we propose a data-mining-based approach to public buffer management for a multiuser database system, where database buffers are organized into two areas – public and private. While the private buffer areas contain pages to be updated by particular users, the public

  12. A user's manual for managing database system of tensile property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kim, D. H.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the tensile database system for managing tensile property test data. The database constructed from the data produced by tensile property tests can increase the application of test results. We can also easily retrieve basic data from the database when preparing a new experiment, and can produce better results by comparing them with previous data. To develop the database we must carefully analyze and design the application; after that, we can offer the best quality in response to customers' various requirements. The tensile database system was developed as a web application using the Java, PL/SQL and JSP (Java Server Pages) tools

  13. An Introduction to the DB Relational Database Management System

    OpenAIRE

    Ward, J.R.

    1982-01-01

    This paper is an introductory guide to using the Db programs to maintain and query a relational database on the UNIX operating system. In the past decade, increasing interest has been shown in the development of relational database management systems. Db is an attempt to incorporate a flexible and powerful relational database system within the user environment presented by the UNIX operating system. The family of Db programs is useful for maintaining a database of information that i...

  14. Development of Krsko Severe Accident Management Database (SAMD)

    International Nuclear Information System (INIS)

    Basic, I.; Kocnar, R.

    1996-01-01

    Severe Accident Management is a framework to identify and implement the Emergency Response Capabilities that can be used to prevent or mitigate severe accidents and their consequences. Krsko Severe Accident Management Database documents the severe accident management activities which are developed in the NPP Krsko, based on the Krsko IPE (Individual Plant Examination) insights and Generic WOG SAMGs (Westinghouse Owners Group Severe Accident Management Guidance). (author)

  15. Managing Database Services: An Approach Based in Information Technology Services Availabilty and Continuity Management

    Directory of Open Access Journals (Sweden)

    Leonardo Bastos Pontes

    2017-01-01

    Full Text Available This paper is set in the context of information technology services management, with some ideas from information technology governance, and proposes the implementation of a hybrid model to manage the services of a database, based on the principles of information technology services management, in a supplementary health operator. This approach draws on fundamental nuances of service management guides such as CMMI for Services, COBIT, ISO 20000, ITIL and MPS.BR for Services; it studies Availability and Continuity Management together, as most of the guides also do. This work is important because it maintains a good flow in the database and improves the agility of the systems in the clinics accredited with the health plan.

  16. SPIRE Data-Base Management System

    Science.gov (United States)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) is based on the relational model of data bases. Data bases are typically used for engineering and mission analysis tasks and, unlike in most commercially available systems, data items and data structures are stored in forms suitable for direct analytical computation. SPIRE DBMS is designed to support data requests from interactive users as well as applications programs.

  17. Design of database management system for 60Co container inspection system

    International Nuclear Information System (INIS)

    Liu Jinhui; Wu Zhifang

    2007-01-01

    The functions of the database management system have been designed according to the features of the cobalt-60 container inspection system, and the corresponding software has been constructed. Database querying and searching are included in the software. The database operation program is built on Microsoft SQL Server and Visual C++ under Windows 2000. The software realizes database querying, image and graph displaying, statistics, report forms and their printing, interface design, etc. The software is powerful and flexible for operation and information querying, and it has been successfully used in the real database management system of the cobalt-60 container inspection system. (authors)

  18. Nuclear database management systems

    International Nuclear Information System (INIS)

    Stone, C.; Sutton, R.

    1996-01-01

    The authors are developing software tools for accessing and visualizing nuclear data. MacNuclide was the first software application produced by their group. This application incorporates novel database management and visualization tools into an intuitive interface. The nuclide chart is used to access properties and to display results of searches. Selecting a nuclide in the chart displays a level scheme with tables of basic, radioactive decay, and other properties. All level schemes are interactive, allowing the user to modify the display, move between nuclides, and display entire daughter decay chains

  19. DOE technology information management system database study report

    Energy Technology Data Exchange (ETDEWEB)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.; Jusko, M.J.; Keisler, J.M.; Love, R.J.; Robinson, G.L. [Argonne National Lab., IL (United States). Decision and Information Sciences Div.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  20. 'Ethos' Enabling Organisational Knowledge Creation

    Science.gov (United States)

    Matsudaira, Yoshito

    This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company, and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds first, second, and third-person viewpoints to the theory of knowledge creation. Embodied knowledge, observed in the actions of organisational members who enable knowledge creation, is the continued practice of 'ethos' (in Greek) founded in the Nissan Production Way as an ethical basis. Ethos is a knowledge (intangible) asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, the team and the organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research shows the indispensability of ethos - the new concept of knowledge assets, which enables knowledge creation - for future knowledge-based management in the knowledge society.

  1. Computerized database management system for breast cancer patients.

    Science.gov (United States)

    Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah

    2014-01-01

    Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL (My Structured Query Language) was selected as the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in the system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that controls the MySQL database was developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is from 50 to 59 years. Results suggest that the chance of developing breast cancer increases in older women and is reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
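As an illustration of the kind of analysis such a system automates, the sketch below uses Python's built-in sqlite3 standing in for MySQL; the schema, column names, and records are illustrative assumptions, not the actual hospital data described in the abstract.

```python
import sqlite3

# Minimal stand-in for the risk-factor database (illustrative schema and rows).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patients (
    id INTEGER PRIMARY KEY, age INTEGER, race TEXT,
    breastfeeding INTEGER, family_history INTEGER)""")
rows = [(1, 55, "Malay", 0, 1), (2, 62, "Chinese", 1, 0),
        (3, 48, "Malay", 1, 0), (4, 71, "Indian", 0, 1)]
conn.executemany("INSERT INTO patients VALUES (?, ?, ?, ?, ?)", rows)

# Incidence count by race, mirroring the case-study breakdown.
by_race = dict(conn.execute(
    "SELECT race, COUNT(*) FROM patients GROUP BY race").fetchall())

# Bucket ages into decades to find the peak incidence band (e.g. 50-59).
by_decade = dict(conn.execute(
    "SELECT (age / 10) * 10, COUNT(*) FROM patients GROUP BY age / 10").fetchall())
print(by_race, by_decade)
```

An embedded "automatic calculation tool" of the kind the abstract mentions would run queries like these behind the GUI and feed the results to a plotting layer.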

  2. Nuclear power plant reliability database management

    International Nuclear Information System (INIS)

    Meslin, Th.; Aufort, P.

    1996-04-01

    In the framework of an on-site probabilistic safety project (the notion of a living PSA), the Saint Laurent des Eaux NPP implements a specific EDF reliability database. The main goals of this project at Saint Laurent des Eaux are: to expand risk analysis and build an effective local basis for thinking about operating safety by involving all departments of the plant (analysis of all potential operating transients, unavailability consequences...), which means going beyond a simple culture of applying operating rules; to involve nuclear power plant operators in experience feedback and its analysis, especially by following up the behaviour of components and safety functions; and to allow plant safety managers to substantiate their decisions before the safety authorities regarding exemptions, the preventive maintenance programme, and operating incident evaluation. Achieving these goals requires feedback data, tools, techniques, and the development of skills. The first step is to obtain site-specific reliability data. Raw data come from the plant maintenance management system, which processes all maintenance activities and keeps a record of all component failures and maintenance activities. Plant-specific reliability data are estimated with a Bayesian model that combines these validated raw data with corporate generic data. This approach makes it possible to provide reliability data for the main components modelled in the PSA, to check the consistency of the maintenance programme (RCM), and to verify hypotheses made at the design stage about component reliability. A number of studies, related to component reliability as well as the decision-making process for specific incident risk evaluation, have been carried out. This paper also provides an overview of the process management set up on site, from the raw database to the specific reliability database, in compliance with established corporate objectives. (authors). 4 figs
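The Bayesian combination of corporate generic data with validated site data described above can be sketched with a conjugate Gamma-Poisson model for a component failure rate; the abstract does not specify EDF's actual model, so the model choice and the numbers below are illustrative assumptions.

```python
# Gamma prior (from generic corporate data) updated with plant-specific
# failure counts under a Poisson likelihood; conjugacy makes the update
# a simple parameter addition. Illustrative sketch, not EDF's model.
def posterior_failure_rate(alpha0, beta0, failures, exposure_hours):
    """Return the posterior mean failure rate (per hour)."""
    alpha = alpha0 + failures        # prior pseudo-failures + observed failures
    beta = beta0 + exposure_hours    # prior pseudo-exposure + observed exposure
    return alpha / beta

# Generic prior alone: ~2 failures per 1e5 component-hours.
generic = posterior_failure_rate(2.0, 1e5, 0, 0)
# Site evidence (1 failure in 8e4 hours) pulls the estimate toward the site.
updated = posterior_failure_rate(2.0, 1e5, 1, 8e4)
print(generic, updated)
```

The more site exposure accumulates, the more the posterior reflects plant-specific behaviour rather than the corporate prior, which is exactly the "validated raw data plus generic data" behaviour the abstract describes.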

  3. Nuclear power plant reliability database management

    Energy Technology Data Exchange (ETDEWEB)

    Meslin, Th [Electricite de France (EDF), 41 - Saint-Laurent-des-Eaux (France); Aufort, P

    1996-04-01

    In the framework of an on-site probabilistic safety project (the notion of a living PSA), the Saint Laurent des Eaux NPP implements a specific EDF reliability database. The main goals of this project at Saint Laurent des Eaux are: to expand risk analysis and build an effective local basis for thinking about operating safety by involving all departments of the plant (analysis of all potential operating transients, unavailability consequences...), which means going beyond a simple culture of applying operating rules; to involve nuclear power plant operators in experience feedback and its analysis, especially by following up the behaviour of components and safety functions; and to allow plant safety managers to substantiate their decisions before the safety authorities regarding exemptions, the preventive maintenance programme, and operating incident evaluation. Achieving these goals requires feedback data, tools, techniques, and the development of skills. The first step is to obtain site-specific reliability data. Raw data come from the plant maintenance management system, which processes all maintenance activities and keeps a record of all component failures and maintenance activities. Plant-specific reliability data are estimated with a Bayesian model that combines these validated raw data with corporate generic data. This approach makes it possible to provide reliability data for the main components modelled in the PSA, to check the consistency of the maintenance programme (RCM), and to verify hypotheses made at the design stage about component reliability. A number of studies, related to component reliability as well as the decision-making process for specific incident risk evaluation, have been carried out. This paper also provides an overview of the process management set up on site, from the raw database to the specific reliability database, in compliance with established corporate objectives. (authors). 4 figs.

  4. A user's manual for the database management system of impact property

    International Nuclear Information System (INIS)

    Ryu, Woo Seok; Park, S. J.; Kong, W. S.; Jun, I.

    2003-06-01

    This manual is written for the management and maintenance of the impact database system, which manages impact property test data. A database built from impact property test data increases the usefulness of the test results. It also makes it easy to retrieve baseline data from the database when preparing a new experiment, and comparison with previous data can produce better results. To develop the database, the application must be analysed and designed carefully; only then can the varied requirements of customers be met with the best quality. The impact database system was developed as a web application using JSP (Java Server Pages).

  5. Dockomatic - automated ligand creation and docking.

    Science.gov (United States)

    Bullock, Casey W; Jacob, Reed B; McDougal, Owen M; Hampikian, Greg; Andersen, Tim

    2010-11-08

    The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand-to-receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs and to collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand-to-receptor interactions. DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates the creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  6. Dockomatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Full Text Available Abstract Background The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand-to-receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs and to collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand-to-receptor interactions. Results DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates the creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  7. Managing XML Data to optimize Performance into Object-Relational Databases

    Directory of Open Access Journals (Sweden)

    Iuliana BOTHA

    2011-06-01

    Full Text Available This paper proposes some possibilities for managing XML data in order to optimize performance in object-relational databases. It details how XML data can be stored in such databases, using an Oracle database for exemplification, and tests some techniques for optimizing queries over XMLType tables, such as indexing and partitioning.

  8. Institutional Co-Creation Interfaces for Innovation Diffusion during Disaster Management

    Directory of Open Access Journals (Sweden)

    Adrian SOLOMON

    2017-03-01

    Full Text Available This paper discusses the concept of Resilient and Green Supply Chain Management (RGSCM) implementation in South Eastern Europe (SEE) from the point of view of understanding the structure of the inter-organizational (institutional) interfaces involved in this process, as well as how these interfaces evolve and transform over time. As social and environmental concerns grow in importance through normative and coercive directions, all the regional actors (triple/quadruple/quintuple helix) that supply chains interact with need to bridge their inter-organizational interfaces to properly ensure co-creation at the entire stakeholder level, increasing the chances of a homogeneous implementation of RGSCM. In this context, this paper adopts a three-stage mixed methodology of interviews, a survey, focus groups, and modelling and simulation case studies. The results show that the key pillars of inter-organizational interface integration and evolution reside in the proper identification of the key goals (performance indicators) of the involved institutions, which will maintain market-optimized competition levels. Institutions will then steadily adhere to market trends, as explained by ST and INT, and in the process of adopting the RGSCM eco-innovation (DIT), new entrant institutions will transform their inter-organizational interfaces to properly bridge with the core market stakeholder group. Finally, the key driver of interface alteration resides in the ability of disruptive (eco) innovators to set new standards. This research has core academic implications by extending INT, DIT and ST in the context of RGSCM, policy implications in terms of proper policy making to support the required co-creation, as well as practical implications by helping organizations to manage their inter-organizational interfaces.

  9. Using Online Databases in Corporate Issues Management.

    Science.gov (United States)

    Thomsen, Steven R.

    1995-01-01

    Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)

  10. Palaeo sea-level and ice-sheet databases: problems, strategies and perspectives

    Science.gov (United States)

    Rovere, Alessio; Düsterhus, André; Carlson, Anders; Barlow, Natasha; Bradwell, Tom; Dutton, Andrea; Gehrels, Roland; Hibbert, Fiona; Hijma, Marc; Horton, Benjamin; Klemann, Volker; Kopp, Robert; Sivan, Dorit; Tarasov, Lev; Törnqvist, Torbjorn

    2016-04-01

    Databases of palaeoclimate data have driven many major developments in understanding the Earth system. The measurement and interpretation of palaeo sea-level and ice-sheet data that form such databases pose considerable challenges to the scientific communities that use them for further analyses. In this paper, we build on the experience of the PALSEA (PALeo constraints on SEA level rise) community, a working group within the PAGES (Past Global Changes) project, to describe the challenges and best strategies that can be adopted to build a self-consistent and standardised database of geological and geochemical data related to palaeo sea levels and ice sheets. Our aim in this paper is to identify key points that need attention and subsequent funding when undertaking the task of database creation. We conclude that any sea-level or ice-sheet database must be divided into three instances: i) measurement; ii) interpretation; iii) database creation. Measurement should include position, age, description of geological features, and quantification of uncertainties, all described as objectively as possible. Interpretation can be subjective, but it should always include uncertainties and all the possible interpretations, without unjustified a priori exclusions. We propose that, in the creation of a database, an approach based on Accessibility, Transparency, Trust, Availability, Continued updating, Completeness and Communication of content (ATTAC3) must be adopted. It is also essential to consider the community structure that creates and benefits from a database. We conclude that funding sources should consider addressing not only the creation of original data in specific research-question-oriented projects, but also allowing part of the funding to be used for IT-related and database-creation tasks, which are essential to guarantee accessibility and maintenance of the collected data.

  11. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    Science.gov (United States)

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  12. Managing innovation processes through value co-creation: a process case from business-to-business service practise

    DEFF Research Database (Denmark)

    Nardelli, Giulia; Broumels, Marcel

    2018-01-01

    Value co-creation is a specific type of collaboration that is considered to be an innovative and interactive process between end users and organisations; it aims to increase the value of a product or service. This study investigates how a network of stakeholders collaborating to manage innovation...... openly co-creates value over time; it contributes to the existing literature on value co-creation by taking the perspective of the network as a whole. The study follows a case in which value co-creation unfolds over time across a network of stakeholders within the business-to-business facility service...... context. The in-depth longitudinal investigation of a network composed of a corporate customer and its external facility service providers revealed that a network of stakeholders co-creates value over time by 1) offering an adaptable structure for the network to organise innovation activities...

  13. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    Science.gov (United States)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m on each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record January 2011 - April 2015. Before the received database could be used, however, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC, and has been modified specifically for use with the LPS tower database. The QC process first checks each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements; the selection process implemented a study of tower-induced turbulence. This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological database.
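The per-sensor checks described above (removing unrealistic values, temporal consistency, stuck-sensor detection) can be sketched as follows; the function, thresholds, and flag convention are illustrative assumptions, not the EV44 implementation.

```python
def qc_flags(series, lo, hi, max_step, const_run=30):
    """Flag one-minute samples: range check, temporal (step) consistency,
    and runs of identical values suggesting a stuck sensor.
    Thresholds are illustrative, not the EV44 values."""
    flags = [False] * len(series)
    run = 1
    for i, v in enumerate(series):
        if not (lo <= v <= hi):                    # unrealistic value
            flags[i] = True
        if i > 0:
            if abs(v - series[i - 1]) > max_step:  # implausible jump
                flags[i] = True
            run = run + 1 if v == series[i - 1] else 1
            if run >= const_run:                   # stuck-sensor run
                for j in range(i - run + 1, i + 1):
                    flags[j] = True
    return flags

# A spike to 45 degC fails both the range and the step check.
flags = qc_flags([20.1, 20.2, 45.0, 20.3], lo=-10, hi=40, max_step=5)
print(flags)
```

Redundancy checks (sensor-vs-sensor at the same height, vertical consistency, climatology) would then operate on the samples that survive this first pass.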

  14. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    Science.gov (United States)

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
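The modeling difference underlying the comparison can be sketched with a dbSNP-like record; the field names are illustrative assumptions, with SQLite standing in for the relational store and a JSON map standing in for a document store such as MongoDB.

```python
import json
import sqlite3

# One variant with a repeated sub-structure (alleles) - illustrative fields.
variant = {"rsid": "rs123", "chrom": "1", "pos": 12345,
           "alleles": [{"base": "A", "freq": 0.92}, {"base": "G", "freq": 0.08}]}

# Relational model: the repeated alleles are normalized into a child table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE variant (rsid TEXT PRIMARY KEY, chrom TEXT, pos INT)")
db.execute("CREATE TABLE allele (rsid TEXT, base TEXT, freq REAL)")
db.execute("INSERT INTO variant VALUES (?, ?, ?)",
           (variant["rsid"], variant["chrom"], variant["pos"]))
db.executemany("INSERT INTO allele VALUES (?, ?, ?)",
               [(variant["rsid"], a["base"], a["freq"]) for a in variant["alleles"]])

# Retrieving the full record relationally needs a join ...
row = db.execute("""SELECT v.rsid, a.base, a.freq FROM variant v
                    JOIN allele a ON a.rsid = v.rsid""").fetchall()

# ... while the document model returns it in one keyed lookup.
doc_store = {variant["rsid"]: json.dumps(variant)}
doc = json.loads(doc_store["rs123"])
print(len(row), doc["pos"])
```

The join-versus-lookup difference in this sketch is one reason document stores can win on retrieval speed for annotation-style records, which is consistent with the direction of the results reported above.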

  15. The creation, management, and use of data quality information for life cycle assessment.

    Science.gov (United States)

    Edelen, Ashley; Ingwersen, Wesley W

    2018-04-01

    Despite growing access to data, questions of "best fit" data and the appropriate use of results in supporting decision making still plague the life cycle assessment (LCA) community. This discussion paper addresses revisions to assessing data quality captured in a new US Environmental Protection Agency guidance document as well as additional recommendations on data quality creation, management, and use in LCA databases and studies. Existing data quality systems and approaches in LCA were reviewed and tested. The evaluations resulted in a revision to a commonly used pedigree matrix, for which flow and process level data quality indicators are described, more clarity for scoring criteria, and further guidance on interpretation are given. Increased training for practitioners on data quality application and its limits are recommended. A multi-faceted approach to data quality assessment utilizing the pedigree method alongside uncertainty analysis in result interpretation is recommended. A method of data quality score aggregation is proposed and recommendations for usage of data quality scores in existing data are made to enable improved use of data quality scores in LCA results interpretation. Roles for data generators, data repositories, and data users are described in LCA data quality management. Guidance is provided on using data with data quality scores from other systems alongside data with scores from the new system. The new pedigree matrix and recommended data quality aggregation procedure can now be implemented in openLCA software. Additional ways in which data quality assessment might be improved and expanded are described. Interoperability efforts in LCA data should focus on descriptors to enable user scoring of data quality rather than translation of existing scores. 
Developing and using data quality indicators for additional dimensions of LCA data, and automation of data quality scoring through metadata extraction and comparison to goal and scope are needed.
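The abstract does not specify the proposed aggregation formula; as one plausible reading, the sketch below rolls per-flow pedigree scores up to a process-level score by weighting each flow by its contribution to the result (an assumption for illustration, not the guidance document's method).

```python
# Hypothetical pedigree-score aggregation: scores use the common 1 (best)
# to 5 (worst) scale, and each flow is weighted by its share of the
# impact result so that dominant flows dominate the rolled-up score.
def aggregate_pedigree(flow_scores, weights):
    """flow_scores: per-flow pedigree scores for one quality indicator.
    weights: each flow's share of the result (must sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(flow_scores, weights))

# A dominant, well-characterized flow and two poorer ones.
score = aggregate_pedigree([1, 3, 5], [0.7, 0.2, 0.1])
print(round(score, 2))
```

Because the dominant flow scores well here, the process-level score stays near the best end of the scale even though one flow is poorly characterized.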

  16. Creative participation: collective sentiment in online co-creation communities

    NARCIS (Netherlands)

    Lee, H.H.M.; van Dolen, W.

    2015-01-01

    Co-creation communities allow companies to utilize consumers’ creative thinking in the innovation process. This paper seeks to understand the role of sentiment in user co-creation. The results suggest that management style can affect the success of co-creation communities. Specific employees’

  17. Mobile, Collaborative Situated Knowledge Creation for Urban Planning

    Directory of Open Access Journals (Sweden)

    Nelson Baloian

    2012-05-01

    Full Text Available Geo-collaboration is an emerging research area in computer science studying the way spatial, geographically referenced information and communication technologies can support collaborative activities. Scenarios in which information associated with its physical location is of paramount importance are often referred to as situated knowledge creation scenarios. To date, there are few computer systems supporting knowledge creation that explicitly incorporate physical context as part of the knowledge being managed in mobile face-to-face scenarios. This work presents a collaborative software application supporting visually geo-referenced knowledge creation in mobile working scenarios while the users are interacting face-to-face. The system manages data and information associated with specific physical locations for knowledge creation processes in the field, such as urban planning, identification of specific physical locations, and territorial management, using Tablet PCs and GPS to geo-reference data and information. It presents a model for developing mobile applications supporting situated knowledge creation in the field, introducing the requirements for such an application and the functionalities it should have in order to fulfill them. The paper also presents the results of utility and usability evaluations.

  18. Development of the ageing management database of PUSPATI TRIGA reactor

    Energy Technology Data Exchange (ETDEWEB)

    Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila; Farid, Mohd Fairus Abd; Ramli, Shaharum [Reactor Technology Centre, Malaysian Nuclear Agency, MOSTI, Bangi, 43000 Kajang, Selangor (Malaysia); Maskin, Mazleha [Science Program, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, Selangor (Malaysia); Adnan, Amirul Syazwan; Abidin, Nurul Husna Zainal [Faculty of Petroleum and Renewable Energy Engineering, Universiti Teknologi Malaysia (Malaysia)

    2016-01-22

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have emerged as prominent issues. To address the ageing issues, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all RTP major systems, structures and components (SSCs) and the ageing mechanisms of these SSCs through the system surveillance program.

  19. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program which performs obstetric calculations using data from ultrasonography was developed in the Clipper language for personal computers. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, including biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance and proved very useful in patient management, with its convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women.

  20. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    International Nuclear Information System (INIS)

    Viegas, F; Nairz, A; Goossens, L; Malon, D; Cranshaw, J; Dimitrov, G; Nowak, M; Gamboa, C; Gallas, E; Wong, A; Vinek, E

    2010-01-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. This data will be produced at a nominal rate of 200 Hz, and is uploaded into a relational database for access from websites and other tools. The estimated database volume is 6TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production makes this application a challenge to data and resource management, in many aspects. This paper will focus on the operational challenges of this system. These include: uploading the data from files to the CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; controlling resource usage of the database, from the user query load to the strategy of cleaning and archiving of old TAG data.

  1. What does existing research say about value co-creation?

    DEFF Research Database (Denmark)

    Thomsen, Merethe Stjerne; Tanev, Stoyan; Pedrosa, Alex

    2010-01-01

    The paper presents a literature review on co-creation, which is summarized into emerging research areas and insights as a basis for a future research agenda for value co-creation. The search methodology is based on a keywords search on ISI Web of Knowledge, leading to 82 articles with a summary...... of four emerging subject areas within marketing science, service management, new product development & innovation and general business and management. The four subject areas lead to new key driving forces of value co-creation by involving the customers in experience networks, where both creating......-customer interaction events, which are extremely personal with unique products, services and experiences. In general the paper is starting up a conceptual refinement on value co-creation by addressing the key characteristics of current literature and driving forces of co-creation....

  2. Of creation of up-to-date system for liquid radwaste management at Ukraine's NPPs. Problem statement

    International Nuclear Information System (INIS)

    Andronov, O.B.

    2015-01-01

    The main aspects of the liquid radwaste (LRW) management problem at Ukrainian NPPs are addressed, together with approaches to its solution and the proposals of NNEGC Energoatom SE STC specialists concerning the above issue. The conceptual principle behind the creation of an up-to-date, high-tech complex for LRW management is considered.

  3. Concierge: Personal database software for managing digital research resources

    Directory of Open Access Journals (Sweden)

    Hiroyuki Sakai

    2007-11-01

    Full Text Available This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is its high level of extensibility: by installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge therefore offers comprehensive support, from the management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp).

  4. Military Personnel: DOD Has Processes for Operating and Managing Its Sexual Assault Incident Database

    Science.gov (United States)

    2017-01-01

    MILITARY PERSONNEL: DOD Has Processes for Operating and Managing Its Sexual Assault Incident Database. Report to... DSAID's system speed and ease of use; interfaces with MCIO databases; utility as a case management tool; and users' ability to query data... What GAO Found: As of October 2013, the Department of Defense's (DOD) Defense Sexual Assault Incident Database...

  5. Knowledge Creation in Constructivist Learning

    Science.gov (United States)

    Jaleel, Sajna; Verghis, Alie Molly

    2015-01-01

    In today's competitive global economy characterized by knowledge acquisition, the concept of knowledge management has become increasingly prevalent in academic and business practices. Knowledge creation is an important factor and remains a source of competitive advantage over knowledge management. Constructivism holds that learners learn actively…

  6. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    Science.gov (United States)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which depends - among other factors - on both the quantity and quality of the available data and on an optimal storage mechanism. This requires the design of purpose-built databases that take into account data format and availability, afford easy data storage and sharing, and provide for a more complete risk assessment that combines different analyses while avoiding any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. The crux of the matter is to ensure that VERDI serves as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown here through its application to El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool to assist in conducting volcanic risk assessment and management.

  7. Using a database to manage resolution of comments on standards

    International Nuclear Information System (INIS)

    Holloran, R.W.; Kelley, R.P.

    1995-01-01

    Features of production systems that would enhance the development and implementation of procedures and other standards were first suggested in 1988; subsequent work described how a database could provide the features sought for managing the content of structured documents such as standards and procedures. This paper describes enhancements of the database that manage the more complex links associated with the resolution of comments. Displaying the linked information on a computer display aids comment resolvers. A hardcopy report generated by the database permits others to independently evaluate the resolution of comments in context with the original text of the standard, the comment, and the revised text of the standard. Because the links are maintained by the database, consistency between the agreed-upon resolutions and the text of the standard can be maintained throughout subsequent reviews of the standard. Each of the links is bidirectional; i.e., the relationships between any two documents can be viewed from the perspective of either document.

  8. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    Energy Technology Data Exchange (ETDEWEB)

    Viegas, F; Nairz, A; Goossens, L [CERN, CH-1211 Geneve 23 (Switzerland); Malon, D; Cranshaw, J [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States); Dimitrov, G [DESY, D-22603 Hamburg (Germany); Nowak, M; Gamboa, C [Brookhaven National Laboratory, PO Box 5000 Upton, NY 11973-5000 (United States); Gallas, E [University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Wong, A [Triumf, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Vinek, E [University of Vienna, Dr.-Karl-Lueger-Ring 1, 1010 Vienna (Austria)

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. These data are produced at a nominal rate of 200 Hz and uploaded into a relational database for access from websites and other tools. The estimated database volume is 6 TB per year, making it the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production make this application a challenge to data and resource management in many respects. This paper will focus on the operational challenges of this system, which include: uploading the data from files to the CERN and remote-site databases; distributing the TAG metadata that is essential to guide the user through event selection; and controlling resource usage of the database, from the user query load to the strategy for cleaning and archiving old TAG data.
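
    The quoted figures can be sanity-checked with simple arithmetic (the accelerator live time per year is our assumption, not stated in the abstract):

```python
# At 200 Hz, a year of data-taking yields billions of TAG rows, so
# kilobyte-scale rows are enough to reach the quoted 6 TB per year.
rate_hz = 200                          # nominal TAG production rate
live_seconds_per_year = 1.0e7          # assumed data-taking time per year
events_per_year = rate_hz * live_seconds_per_year   # 2e9 events
volume_bytes = 6e12                    # 6 TB per year, as quoted
bytes_per_event = volume_bytes / events_per_year
print(round(bytes_per_event))  # -> 3000 (about 3 kB per event row)
```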

  9. Integrated Space Asset Management Database and Modeling

    Science.gov (United States)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of many types of data related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing those data into useful technical information. Using the universal NORAD number as a unique identifier, SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata related to functional status, country of ownership, and application category. SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
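
    The TLE-processing step mentioned above can be sketched as follows (the column slices follow the public two-line element format; the function name, return structure and example line are our illustration, not SAM-D's internals):

```python
import math

# Turn line 2 of a two-line element set into basic orbital
# characteristics, keyed by the NORAD catalog number.
MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def tle_line2_to_elements(line2):
    """Parse line 2 of a TLE and derive basic orbital characteristics."""
    mean_motion = float(line2[52:63])               # revolutions per day
    period_s = 86400.0 / mean_motion                # orbital period, s
    # Kepler's third law: a = (mu * (T / 2*pi)^2)^(1/3)
    semi_major_km = (MU_EARTH * (period_s / (2 * math.pi)) ** 2) ** (1 / 3)
    return {
        "norad": int(line2[2:7]),                   # unique identifier
        "inclination_deg": float(line2[8:16]),
        "eccentricity": float("0." + line2[26:33]), # implied decimal point
        "period_s": period_s,
        "semi_major_axis_km": semi_major_km,
    }

# An ISS-like line 2 used purely as illustrative input:
line2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"
el = tle_line2_to_elements(line2)
print(el["norad"], round(el["semi_major_axis_km"]))  # ~6730 km for this orbit
```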

  10. Development of a Relational Database for Learning Management Systems

    Science.gov (United States)

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-Based Distance Education Systems have a great importance. Web-based Distance Education Systems are usually known as Learning Management Systems (LMS). In this article, a database design, which was developed to create an educational institution as a Learning Management System, is described. In this sense, developed Learning…

  11. Computer application for database management and networking of service radio physics

    International Nuclear Information System (INIS)

    Ferrando Sanchez, A.; Cabello Murillo, E.; Diaz Fuentes, R.; Castro Novais, J.; Clemente Gutierrez, F.; Casa de Juan, M. A. de la; Adaimi Hernandez, P.

    2011-01-01

    Databases in quality control prove to be a powerful tool for recording, management and statistical process control. Developed in a Windows environment under Access (Microsoft Office), our service implements this philosophy on the center's computer network. A computer that acts as the server provides the database to the treatment units, which record daily quality control measurements and incidents. To avoid common problems such as shortcuts that stop working after data migration, duplicate entries, and data loss caused by errors in network connections, we proceeded to manage connections and database access centrally, easing maintenance and use for all service personnel.

  12. The Cocoa Shop: A Database Management Case

    Science.gov (United States)

    Pratt, Renée M. E.; Smatt, Cindi T.

    2015-01-01

    This is an example of a real-world applicable case study, which includes background information on a small local business (i.e., TCS), description of functional business requirements, and sample data. Students are asked to design and develop a database to improve the management of the company's customers, products, and purchases by emphasizing…

  13. Residue Management of Biodiesel Industry: A Study of Value Creation in the Supply Chain

    Directory of Open Access Journals (Sweden)

    Stella Maris Lima Altoé

    2014-06-01

    Full Text Available Residues, whether solid or liquid, are inherent to many industrial processes and require specialized treatment. The purpose of this research is to evaluate the process of creating value in the supply chain through the sustainable management of residues in the biodiesel industry. The methodological approach was a multiple case study, using bibliographic data, documents and discourse analysis. Data were collected through interviews with managers of the companies analyzed. The findings suggest that residue management enables the creation of value in the supply chain of biodiesel. It is also noted that this management brings about environmental preservation, reduces or even eliminates the incidence of fines, and fosters economic cooperation between companies that have different activities but are part of the biodiesel supply chain.

  14. Design and implementation of component reliability database management system for NPP

    International Nuclear Information System (INIS)

    Kim, S. H.; Jung, J. K.; Choi, S. Y.; Lee, Y. H.; Han, S. H.

    1999-01-01

    KAERI is constructing a component reliability database for Korean nuclear power plants. This paper describes the development of the data management tool that operates on the component reliability database. It runs in an intranet environment and is used to analyze failure modes and failure severities in order to compute component failure rates. We are now developing additional modules to manage operation and test histories, together with algorithms for calculating component failure history and reliability.

  15. Logistics potentials in business competitive advantage creation

    Directory of Open Access Journals (Sweden)

    Rafał Matwiejczuk

    2013-12-01

    Full Text Available Background: Companies constantly search for ways to achieve and sustain long-term competitive advantage. Among the factors influencing competitive advantage creation are the so-called logistics potentials, which constitute a component of a business's strategic potentials. Logistics resources, logistics capabilities and logistics competences are the main components of the logistics potentials structure and hierarchy. Methods: In order to recognize the logistics potentials which determine competitive advantage creation, one may use the assumptions and elements of contemporary management concepts, including strategic management. In particular the article deals with the Resource-Based View (RBV), the Dynamic Capabilities Concept (DCC) and - first of all - Competence-Based Management (CBM). Results and conclusions: Several significant research projects have presented a wide scope and a large number of possibilities for logistics potentials (and logistics competences in particular) to influence business competitive advantage creation. The article briefly presents the research results obtained by: (1) Michigan State University (USA), (2) the European Logistics Association (ELA) in cooperation with A.T. Kearney, (3) Computer Sciences Corporation and (4) Capgemini. The research results have pointed to differentiated but at the same time distinctive symptoms of the influence of logistics competences on competitive advantage creation. The article also refers to the results of research carried out by the Chair of Logistics & Marketing at Opole University (Poland) in companies operating in Poland, dealing mainly with the significance of logistics competences in competitive advantage creation.

  16. Pengembangan Content Management System pada Admisi Online Binus University

    Directory of Open Access Journals (Sweden)

    Karto Iskandar

    2012-12-01

    Full Text Available Technically, the registration form for each new Binus institution must be re-created by hand and is not dynamic. This research therefore aims to simplify the creation of the registration process for new Binus institutions and to provide a solution for when Binus establishes other new institutions. The methodology used is analysis and database-oriented design. Analysis was done by inquiring into the problems of existing systems in the IT Directorate, whereas the design uses UML 2.0 diagram notation. The results are front-end and back-end applications for the Content Management System of the registration form. This design can be used to simplify new student registration across the many Binus institutions by grouping similar fields. With some changes to the front-end and back-end applications of the Content Management System, a new online admission application form can be set up faster, since the creation of admission registration forms is managed in the back-end application. As a suggestion for future development, online admission registration could also run in a mobile version.

  17. DOG-SPOT database for comprehensive management of dog genetic research data

    Directory of Open Access Journals (Sweden)

    Sutter Nathan B

    2010-12-01

    Full Text Available Abstract Research laboratories studying the genetics of companion animals have no database tools specifically designed to aid in the management of the many kinds of data that are generated, stored and analyzed. We have developed a relational database, "DOG-SPOT," to provide such a tool. Implemented in MS-Access, the database is easy to extend or customize to suit a lab's particular needs. With DOG-SPOT a lab can manage data relating to dogs, breeds, samples, biomaterials, phenotypes, owners, communications, amplicons, sequences, markers, genotypes and personnel. Such an integrated data structure helps ensure high quality data entry and makes it easy to track physical stocks of biomaterials and oligonucleotides.
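
    The integrity guarantees such a relational layout provides can be illustrated with a minimal sketch (the table names are our assumptions based on the entities listed; DOG-SPOT itself is an MS-Access database, so SQLite here is only a stand-in):

```python
import sqlite3

# Foreign keys tie samples to dogs and dogs to breeds, which is what
# "helps ensure high quality data entry" means in relational terms.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")        # off by default in SQLite
conn.executescript("""
    CREATE TABLE breed  (breed_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE dog    (dog_id INTEGER PRIMARY KEY,
                         breed_id INTEGER REFERENCES breed);
    CREATE TABLE sample (sample_id INTEGER PRIMARY KEY,
                         dog_id INTEGER NOT NULL REFERENCES dog);
""")
conn.execute("INSERT INTO breed VALUES (1, 'Border Collie')")
conn.execute("INSERT INTO dog VALUES (10, 1)")
conn.execute("INSERT INTO sample VALUES (100, 10)")      # dog 10 exists: OK
try:
    conn.execute("INSERT INTO sample VALUES (101, 99)")  # no dog 99
except sqlite3.IntegrityError:
    print("rejected")  # the bad row never enters the table
```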

  18. Protein-Protein Interaction Databases

    DEFF Research Database (Denmark)

    Szklarczyk, Damian; Jensen, Lars Juhl

    2015-01-01

    Years of meticulous curation of scientific literature and increasingly reliable computational predictions have resulted in creation of vast databases of protein interaction data. Over the years, these repositories have become a basic framework in which experiments are analyzed and new directions...

  19. Updating and improving the National Population Database to National Population Database 2

    OpenAIRE

    SMITH, Graham; FAIRBURN, Jonathan

    2008-01-01

    In 2004 Staffordshire University delivered the National Population Database for use in estimating populations at risk under the Control of Major Accident Hazards Regulations (COMAH). In 2006 an assessment of the updating and potential improvements to NPD was delivered to HSE. Between Autumn 2007 and Summer 2008 an implementation of the feasibility report led to the creation of National Population Database 2 which both updated and expanded the datasets contained in the original NPD. This repor...

  20. Small and Medium Enterprises, Job Creation, and Sustainability ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Small and Medium Enterprises, Job Creation, and Sustainability: Maximizing ... job creation, human capital, and green production and technologies will only materialize if ... IWRA/IDRC webinar on climate change and adaptive water management. International Water Resources Association, in close collaboration with IDRC, ...

  1. A Root-Level Password Reset Method for the MySQL Relational Database Management System (RDBMS)

    Directory of Open Access Journals (Sweden)

    Taqwa Hariguna

    2011-08-01

    Full Text Available A database is essential for storing data; with a database, an organization gains benefits such as faster access and reduced paper use. However, after a database has been implemented it is not uncommon for the database administrator to forget the password, which complicates database maintenance. This study aims to explore how to reset the root-level password on the MySQL relational database management system.

  2. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as queries and visualization, across many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, providing the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for users.
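
    The partitioning idea can be illustrated with a minimal sketch (our own toy variant, not TSDSystem's actual implementation): rows are routed to one physical table per signal and year, so no single table accumulates an unbalanced share of the data and range queries touch few partitions.

```python
import sqlite3

def partition_name(signal, ts):
    # One table per signal and year, e.g. ts_tremor_2013. In a real
    # loader the names would come from a validated signal registry.
    return f"ts_{signal}_{ts[:4]}"

def insert(conn, signal, ts, value):
    table = partition_name(signal, ts)
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} "
                 "(ts TEXT PRIMARY KEY, value REAL)")
    conn.execute(f"INSERT INTO {table} VALUES (?, ?)", (ts, value))

conn = sqlite3.connect(":memory:")
insert(conn, "tremor", "2012-12-31T23:59:00", 0.8)
insert(conn, "tremor", "2013-01-01T00:00:00", 1.2)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # -> ['ts_tremor_2012', 'ts_tremor_2013']
```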

  3. The Erasmus insurance case and a related questionnaire for distributed database management systems

    NARCIS (Netherlands)

    S.C. van der Made-Potuijt

    1990-01-01

    textabstractThis is the third report concerning transaction management in the database environment. In the first report the role of the transaction manager in protecting the integrity of a database has been studied [van der Made-Potuijt 1989]. In the second report a model has been given for a

  4. Redefining the Practice of Peer Review Through Intelligent Automation Part 1: Creation of a Standardized Methodology and Referenceable Database.

    Science.gov (United States)

    Reiner, Bruce I

    2017-10-01

    Conventional peer review practice is compromised by a number of well-documented biases, which in turn limit standard-of-care analysis, fundamental to the determination of medical malpractice. Beyond these intrinsic biases, current peer review has other deficiencies, including a lack of standardization, objectivity, and automation, and its retrospective practice. An alternative model to address these deficiencies would be one that is completely blinded to the peer reviewer, requires independent reporting from both parties, utilizes automated data mining techniques for neutral and objective report analysis, and provides data reconciliation for resolution of finding-specific report differences. If properly implemented, this peer review model could result in the creation of a standardized, referenceable peer review database, which could further assist in customizable education, technology refinement, and implementation of real-time context- and user-specific decision support.

  5. The Mayo Clinic Value Creation System.

    Science.gov (United States)

    Swensen, Stephen J; Dilling, James A; Harper, C Michel; Noseworthy, John H

    2012-01-01

    The authors present Mayo Clinic's Value Creation System, a coherent systems engineering approach to delivering a single high-value practice. There are 4 tightly linked, interdependent phases of the system: alignment, discovery, managed diffusion, and measurement. The methodology is described and examples of the results to date are presented. The Value Creation System has been demonstrated to improve the quality of patient care while reducing costs and increasing productivity.

  6. The cost of wetland creation and restoration. Final report, [February 12, 1992--April 30, 1994]- Draft

    Energy Technology Data Exchange (ETDEWEB)

    King, D.; Costanza, R.

    1994-07-11

    This report examines the economics of wetland creation, restoration, and enhancement projects, especially as they are used within the context of mitigation for unavoidable wetland losses. Complete engineering-cost-accounting profiles of over 90 wetland projects were developed in collaboration with leading wetland restoration and creation practitioners around the country to develop a primary source database. Data on the costs of over 1,000 wetland projects were gathered from published sources and other available databases to develop a secondary source database. Cases in both databases were carefully analyzed and a set of baseline cost-per-acre estimates was developed for wetland creation, restoration, and enhancement. Observed costs varied widely, ranging from $5 per acre to $1.5 million per acre. Differences in cost were related to the target wetland type, and to site-specific and project-specific factors that affected the preconstruction, construction, and post-construction tasks necessary to carry out each particular project. Project-specific and site-specific factors had a much larger effect on project costs than wetland type for non-agricultural projects. Costs of wetland creation and restoration were also shown to differ by region, but not by as much as expected, and in response to the regulatory context. The costs of wetland creation, restoration, and enhancement were also analyzed in a broader economic context through examination of the market for wetland mitigation services, and through the development of a framework for estimating compensation ratios - the number of acres of created, restored, or enhanced wetland required to compensate for an acre of lost natural wetland. The combination of per-acre creation, restoration, and enhancement costs and the compensation ratio determines the overall mitigation costs associated with alternative mitigation strategies.
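
    The closing sentence amounts to a simple calculation, sketched here with purely illustrative numbers (none are taken from the report's database):

```python
# Overall mitigation cost = acres lost x compensation ratio x per-acre
# cost of creation/restoration. All numbers below are illustrative.
def mitigation_cost(acres_lost, compensation_ratio, cost_per_acre):
    acres_required = acres_lost * compensation_ratio  # acres to create
    return acres_required * cost_per_acre

# Losing 10 acres under a 2:1 ratio at $25,000/acre restoration cost:
print(mitigation_cost(10, 2.0, 25_000))  # -> 500000.0
```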

  7. Updated Palaeotsunami Database for Aotearoa/New Zealand

    Science.gov (United States)

    Gadsby, M. R.; Goff, J. R.; King, D. N.; Robbins, J.; Duesing, U.; Franz, T.; Borrero, J. C.; Watkins, A.

    2016-12-01

    The updated configuration, design, and implementation of a national palaeotsunami (pre-historic tsunami) database for Aotearoa/New Zealand (A/NZ) is near completion. This tool enables the correlation of events along different stretches of the NZ coastline, provides information on the frequency and extent of local-, regional- and distant-source tsunamis, and delivers detailed information on the science and proxies used to identify the deposits. In A/NZ a plethora of data, scientific research and experience surrounds palaeotsunami deposits, but much of this information has been difficult to locate, has had variable reporting standards, and lacked quality assurance. The original database was created by Professor James Goff while working at the National Institute of Water & Atmospheric Research in A/NZ, and has subsequently been updated during his tenure at the University of New South Wales. The updating and establishment of the national database were funded by the Ministry of Civil Defence and Emergency Management (MCDEM), led by Environment Canterbury Regional Council, and supported by all 16 regions of A/NZ's local government. Creation of a single database has consolidated a wide range of published and unpublished research contributions from many science providers on palaeotsunamis in A/NZ. The information is now easily accessible and quality assured, and allows examination of the frequency, extent and correlation of events. This provides authoritative scientific support for coastal-marine planning and risk management. The database will complement the GNS New Zealand Historical Database and contributes to heightened public awareness of tsunami by being a "one-stop-shop" for information on past tsunami impacts. There is scope for this to become an international database, enabling the Pacific-wide correlation of large events as well as the identification of smaller regional ones. The Australian research community has already expressed an interest, and the database is also compatible with a

  8. Development of an Internet-based data explorer for a samples database: the example of the STRATFEED project

    Directory of Open Access Journals (Sweden)

    Dardenne P.

    2004-01-01

    Full Text Available A key aspect of the European STRATFEED project on developing and validating analytical methods to detect animal meal in feed was the creation of a samples bank. To manage the 2,500 samples that were stored in the samples bank, another important objective was to build a database and develop an Internet-based data explorer – the STRATFEED explorer – to enable all laboratories and manufacturers working in the feed sector to make use of the database. The concept developed for the STRATFEED project could be used for samples management in other projects and it is easily adapted to meet a variety of requirements. The STRATFEED explorer can now be run from the public website http://stratfeed.cra.wallonie.be. Each webpage of this application is described in a documentation file aimed at helping the user to explore the database.

  9. CALCOM Database for managing California Commercial Groundfish sample data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The CALCOM database is used by the California Cooperative Groundfish Survey to store and manage Commercial market sample data. This data is ultimately used to...

  10. The use of database management systems in particle physics

    CERN Document Server

    Stevens, P H; Read, B J; Rittenberg, Alan

    1979-01-01

    Examines data-handling needs and problems in particle physics and looks at three very different efforts at resolving these problems by the Particle Data Group (PDG), the CERN-HERA Group in Geneva, and groups cooperating with ZAED in Germany. The ZAED effort does not use a database management system (DBMS), the CERN-HERA Group uses an existing, limited-capability DBMS, and PDG uses the Berkeley Database Management System (BDMS), which PDG itself designed and implemented with scientific data-handling needs in mind. The range of problems each group tried to resolve was influenced by whether or not a DBMS was available and by what capabilities it had. Only PDG has been able to systematically address all the problems. The authors discuss the BDMS-centered system PDG is now building in some detail. (12 refs).

  11. How to Manage and Plan Terminology: Creating Management TDBs

    Directory of Open Access Journals (Sweden)

    Gordana Jakić

    2016-09-01

    Full Text Available Scientific and technical terminology represents a very topical issue in economically and technologically dependent countries with small languages, such as Serbia. The current terminological problems in the Serbian language, especially in specialized areas experiencing dynamic development, are: Anglicization of the language for special purposes, underdeveloped and unstable terminology, and a lack of adequate, modern terminological and lexical resources. On the one hand, these terminological problems concern subject-field specialists, since inadequate or non-existent terminology significantly affects the representation, transfer and management of specialized knowledge and information. On the other hand, terminology and language planners point to the growing need for immediate, systematic intervention aimed at terminology harmonization, consolidation and standardization. In spite of this awareness, there is no systematic approach to solving terminological problems in Serbian, and practical activities regarding the collection and organization of terminology are few and reduced to individual initiatives. Under the paradigm of language-planning (LP) oriented terminology management, this paper addresses a practical activity of terminology management: the creation of a Serbian management terminology database (TDB) with equivalent terms in English. The paper discusses the methodology of terminology work, potential obstacles in termbase creation, and the potential benefits that such a resource would offer all its prospective users: management specialists and practitioners, professional translators, and language and terminology planners. A particular focus is placed on the potential significance of such a database for terminology policy and planning in the Serbian language, on the one hand, and for knowledge transfer and management, on the other.
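
    A concept-oriented termbase of the kind described can be sketched minimally (the structure, field names and the single entry are our illustration, not the actual Serbian TDB design):

```python
# Each concept groups equivalent terms per language, so lookups work in
# either direction between Serbian (sr) and English (en).
termbase = [
    {"concept": "supply chain management",
     "domain": "management",
     "terms": {"en": ["supply chain management"],
               "sr": ["upravljanje lancem snabdevanja"]}},
]

def equivalents(term, src, dst):
    """Return dst-language equivalents of a src-language term."""
    return [t for entry in termbase
            if term in entry["terms"].get(src, [])
            for t in entry["terms"].get(dst, [])]

print(equivalents("supply chain management", "en", "sr"))
# -> ['upravljanje lancem snabdevanja']
```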

  12. A database system for the management of severe accident risk information, SARD

    International Nuclear Information System (INIS)

    Ahn, K. I.; Kim, D. H.

    2003-01-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at the Korea Atomic Energy Research Institute for the automatic management and search of severe accident risk information. The main functions of the present database system are implemented by three closely related but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from representative severe accident codes (e.g., base case and sensitivity analysis results, and summaries of key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used to support the Level 2 PSA of similar plants and the development of plant-specific severe accident management strategies.

  13. A database system for the management of severe accident risk information, SARD

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, K. I.; Kim, D. H. [KAERI, Taejon (Korea, Republic of)

    2003-10-01

    The purpose of this paper is to introduce the main features and functions of a PC Windows-based database management system, SARD, which has been developed at the Korea Atomic Energy Research Institute for the automatic management and search of severe accident risk information. The main functions of the present database system are implemented by three closely related but distinctive modules: (1) fixing of an initial environment for data storage and retrieval, (2) automatic loading and management of accident information, and (3) automatic search and retrieval of accident information. For this, the present database system manipulates various forms of plant-specific severe accident risk information, such as dominant severe accident sequences identified from the plant-specific Level 2 Probabilistic Safety Assessment (PSA) and accident sequence-specific information obtained from representative severe accident codes (e.g., base case and sensitivity analysis results, and summaries of key plant responses). The present database system makes it possible to implement fast prediction and intelligent retrieval of the required severe accident risk information for various accident sequences, and in turn it can be used to support the Level 2 PSA of similar plants and the development of plant-specific severe accident management strategies.

  14. A database system for enhancing fuel records management capabilities

    International Nuclear Information System (INIS)

    Rieke, Phil; Razvi, Junaid

    1994-01-01

The need to modernize the system of managing a large variety of fuel-related data at the TRIGA Reactors Facility at General Atomics, as well as the need to improve NRC nuclear material reporting requirements, prompted the development of a database to cover all aspects of fuel records management. The TRIGA Fuel Database replaces (a) an index card system used for recording fuel movements, (b) hand calculations for uranium burnup, and (c) a somewhat aged and cumbersome system of recording fuel inspection results. It was developed using Microsoft Access, a relational database system for Windows. Instead of relying on various sources for element information, users may now review individual element statistics, record inspection results, calculate element burnup and more, all from within a single application. Taking full advantage of the ease-of-use features designed into Windows and Access, the user can enter and extract information easily through a number of customized on-screen forms, with a wide variety of reporting options available. All forms are accessed through a main 'Options' screen, with the options broken down by categories, including 'Elements', 'Special Elements/Devices', 'Control Rods' and 'Areas'. Relational integrity and data validation rules are enforced to help ensure that accurate and meaningful data are entered. Among other items, the database lets the user define: element types (such as FLIP or standard) and subtypes (such as fuel follower, instrumented, etc.), various inspection codes for standardizing inspection results, areas within the facility where elements are located, and the power factors associated with element positions within a reactor. Using fuel moves, power history, power factors and element types, the database tracks uranium burnup and plutonium buildup on a quarterly basis.
The Fuel Database was designed with end-users in mind and does not force an operations-oriented user to learn any programming or relational database theory in
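
The quarterly burnup bookkeeping described above can be sketched in a few lines; the power factors, element positions, and linear burnup coefficient below are illustrative assumptions, not the actual TRIGA data model:

```python
# Illustrative sketch of quarterly uranium burnup tracking from fuel moves,
# power history and position power factors (hypothetical data model).

power_factors = {"B1": 1.20, "C4": 0.95, "F10": 0.60}  # relative power by core position

# element -> core position occupied during the quarter (from recorded fuel moves)
positions = {"E-101": "B1", "E-102": "C4", "E-103": "F10"}

quarter_energy_mwd = 90.0   # integrated reactor power for the quarter (MWd)
burnup_coeff = 1.25         # grams U-235 consumed per MWd per unit power factor (assumed)

def quarterly_burnup(element):
    """Burnup accrued by one element this quarter, in grams U-235 (toy model)."""
    pf = power_factors[positions[element]]
    return burnup_coeff * quarter_energy_mwd * pf

total = {e: quarterly_burnup(e) for e in positions}
```

A real implementation would sum such terms over every (position, power-history interval) pair recorded in the fuel-move tables.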

  15. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    Science.gov (United States)

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

Precision medicine is a new medical concept and medical model, based on personalized medicine, the rapid progress of genome sequencing technology, and the cross application of biological information and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis and other core issues. A cancer clinical database is important to promote the development of precision medicine; therefore, it is necessary to pay close attention to the construction and management of such a database. The clinical database of Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank and a medical imaging database. In order to ensure the good quality of the database, its design and management should follow a strict standard operating procedure (SOP) model. Data sharing is an important way to advance medical research in the era of medical big data. The construction and management of clinical databases must also be strengthened and innovated.

  16. Reactor pressure vessel embrittlement management through EPRI-Developed material property databases

    International Nuclear Information System (INIS)

    Rosinski, S.T.; Server, W.L.; Griesbach, T.J.

    1997-01-01

Uncertainties and variability in U.S. reactor pressure vessel (RPV) material properties have caused the U.S. Nuclear Regulatory Commission (NRC) to request information from all nuclear utilities in order to assess the impact of this data scatter and these uncertainties on compliance with existing regulatory criteria. Resolving the vessel material uncertainty issues requires compiling all available data into a single integrated database to develop a better understanding of irradiated material property behavior. EPRI has developed two comprehensive databases for utility implementation to compile and evaluate available material property and surveillance data. RPVDATA is a comprehensive reactor vessel materials database and data management program that combines data from many different sources into one common database. Searches of the data can be easily performed to identify plants with similar materials, sort through measured test results, compare the 'best-estimates' for reported chemistries with licensing basis values, quantify variability in measured weld qualification and test data, identify relevant surveillance results for characterizing embrittlement trends, and resolve uncertainties in vessel material properties. PREP4 has been developed to assist utilities in evaluating existing unirradiated and irradiated data for plant surveillance materials; PREP4 evaluations can be used to assess the accuracy of new trend curve predictions. In addition, searches of the data can be easily performed to identify available Charpy shift and upper shelf data, review surveillance material chemistry and fabrication information, review general capsule irradiation information, and identify applicable source reference information. In support of utility evaluations to consider thermal annealing as a viable embrittlement management option, EPRI is also developing a database to evaluate material response to thermal annealing. Efforts are underway to develop an irradiation

  17. NM WAIDS: A PRODUCED WATER QUALITY AND INFRASTRUCTURE GIS DATABASE FOR NEW MEXICO OIL PRODUCERS

    Energy Technology Data Exchange (ETDEWEB)

    Martha Cather; Robert Lee; Ibrahim Gundiler; Andrew Sung; Naomi Davidson; Ajeet Kumar Reddy; Mingzhen Wei

    2003-04-01

    The New Mexico Water and Infrastructure Data System (NM WAIDS) seeks to alleviate a number of produced water-related issues in southeast New Mexico. The project calls for the design and implementation of a Geographical Information System (GIS) and integral tools that will provide operators and regulators with necessary data and useful information to help them make management and regulatory decisions. The major components of this system are: (1) databases on produced water quality, cultural and groundwater data, oil pipeline and infrastructure data, and corrosion information, (2) a web site capable of displaying produced water and infrastructure data in a GIS or accessing some of the data by text-based queries, (3) a fuzzy logic-based, site risk assessment tool that can be used to assess the seriousness of a spill of produced water, and (4) a corrosion management toolkit that will provide operators with data and information on produced waters that will aid them in deciding how to address corrosion issues. The various parts of NM WAIDS will be integrated into a website with a user-friendly interface that will provide access to previously difficult-to-obtain data and information. Primary attention during the first six months of this project has been focused on creating the water quality databases for produced water and surface water, along with collection of corrosion information and building parts of the corrosion toolkit. Work on the project to date includes: (1) Creation of a water quality database for produced water analyses. The database was compiled from a variety of sources and currently has over 4000 entries for southeast New Mexico. (2) Creation of a web-based data entry system for the water quality database. This system allows a user to view, enter, or edit data from a web page rather than having to directly access the database. (3) Creation of a semi-automated data capturing system for use with standard water quality analysis forms. 
This system improves the

  18. Creation of the First French Database in Primary Care Using the ICPC2: Feasibility Study.

    Science.gov (United States)

    Lacroix-Hugues, V; Darmon, D; Pradier, C; Staccini, P

    2017-01-01

The objective of our study was to assess the feasibility of gathering data stored in primary care Electronic Health Records (EHRs) in order to create a research database (PRIMEGE PACA project). The EHR models of two office and patient data management software systems were analyzed; anonymized data were extracted and imported into a MySQL database. An ETL procedure to code free text into ICPC2 codes was implemented. Eleven general practitioners (GPs) were enrolled as "data producers" and data were extracted from 2012 to 2015. In this paper, we explain how this process was made feasible and illustrate its utility for estimating epidemiological indicators and for professional practice assessments. Other software is currently being analyzed for integration and expansion of this panel of GPs. This experimentation is recognized as a robust framework and is considered to be the technical foundation of the first regional observatory of primary care data.
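
The coding step of such an ETL procedure can be sketched as a keyword lookup; the mapping entries and function names below are illustrative stand-ins, not the PRIMEGE implementation:

```python
# Minimal sketch of an ETL step mapping free-text diagnosis labels to
# ICPC-2 rubrics via a lookup table (the mapping entries are illustrative).

ICPC2_LOOKUP = {
    "hypertension": "K86",                    # uncomplicated hypertension
    "diabetes": "T90",                        # non-insulin-dependent diabetes
    "upper respiratory infection": "R74",
}

def normalize(text):
    """Lowercase and collapse whitespace before matching."""
    return " ".join(text.lower().split())

def code_text(free_text):
    """Return the first ICPC-2 code whose keyword appears in the note, else None."""
    t = normalize(free_text)
    for key, code in ICPC2_LOOKUP.items():
        if key in t:
            return code
    return None
```

A production ETL would of course use a full terminology table and handle synonyms, negations and misspellings rather than bare substring matches.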

  19. Plant operation data collection and database management using NIC system

    International Nuclear Information System (INIS)

    Inase, S.

    1990-01-01

The Nuclear Information Center (NIC), a division of the Central Research Institute of Electric Power Industry, collects nuclear power plant operation and maintenance information both in Japan and abroad and transmits the information to all domestic utilities so that it can be effectively utilized for safe plant operation and reliability enhancement. The collected information is entered into the database system after being key-worded by NIC. The database system, the Nuclear Information database/Communication System (NICS), has been developed by NIC for the storage and management of collected information. The keywords serve two objectives: retrieval, and classification by keyword category.

  20. Database implementation to fluidized cracking catalytic-FCC process

    International Nuclear Information System (INIS)

    Santana, Antonio Otavio de; Dantas, Carlos Costa; Santos, Valdemir A. dos

    2009-01-01

A Fluidized Cracking Catalytic (FCC) process was developed by our research group. A cold-model FCC unit, at laboratory scale, was used to obtain data for the following parameters: air flow, system pressure, riser inlet pressure, riser outlet pressure, pressure drop in the riser, motor speed of catalyst injection and density. Density is measured by gamma-ray transmission. Because the FCC process had no database until then, the present work filled this gap with the implementation of a database connected to the Matlab software. The data from the FCC unit (laboratory model) are obtained as spreadsheets in the MS-Excel software. These spreadsheets were treated before being imported as database tables. Applying database normalization and analyzing the treated spreadsheets with MS-Access revealed that a single relation (table) is needed to represent the database. MS-Access was chosen as the Database Management System (DBMS) because it satisfies our data flow. The next step was the creation of the database, building the data table, the action query, the selection query and the macro to import data from the FCC unit under study. An interface between the 'Database Toolbox' application (Matlab 2008a) and the database was also created, through ODBC (Open Database Connectivity) drivers. This interface allows users working in Matlab to manipulate the database. (author)
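
As a rough sketch of the spreadsheet-to-single-table workflow described above, with Python's sqlite3 standing in for the MS-Access/ODBC pair and invented column names:

```python
# Sketch: import treated spreadsheet rows into one relation and run a
# "selection query" against it (sqlite3 in place of MS-Access; columns invented).
import csv
import io
import sqlite3

# A treated spreadsheet export, one row per measurement run.
raw = io.StringIO(
    "air_flow,system_pressure,riser_dp,density\n"
    "1.2,101.3,4.5,0.82\n"
    "1.4,101.1,4.9,0.79\n"
)

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE fcc_run (
    air_flow REAL, system_pressure REAL, riser_dp REAL, density REAL)""")

# Load every spreadsheet row into the single normalized table.
reader = csv.DictReader(raw)
con.executemany(
    "INSERT INTO fcc_run VALUES (:air_flow, :system_pressure, :riser_dp, :density)",
    list(reader))

# A selection query analogous to the one built in MS-Access.
rows = con.execute("SELECT density FROM fcc_run WHERE riser_dp > 4.6").fetchall()
```

The same SELECT could be issued from Matlab's Database Toolbox once the ODBC connection is configured.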

  1. Database implementation to fluidized cracking catalytic-FCC process

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Antonio Otavio de; Dantas, Carlos Costa, E-mail: aos@ufpe.b [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear; Santos, Valdemir A. dos, E-mail: valdemir.alexandre@pq.cnpq.b [Universidade Catolica de Pernambuco, Recife, PE (Brazil). Centro de Ciencia e Tecnologia

    2009-07-01

A Fluidized Cracking Catalytic (FCC) process was developed by our research group. A cold-model FCC unit, at laboratory scale, was used to obtain data for the following parameters: air flow, system pressure, riser inlet pressure, riser outlet pressure, pressure drop in the riser, motor speed of catalyst injection and density. Density is measured by gamma-ray transmission. Because the FCC process had no database until then, the present work filled this gap with the implementation of a database connected to the Matlab software. The data from the FCC unit (laboratory model) are obtained as spreadsheets in the MS-Excel software. These spreadsheets were treated before being imported as database tables. Applying database normalization and analyzing the treated spreadsheets with MS-Access revealed that a single relation (table) is needed to represent the database. MS-Access was chosen as the Database Management System (DBMS) because it satisfies our data flow. The next step was the creation of the database, building the data table, the action query, the selection query and the macro to import data from the FCC unit under study. An interface between the 'Database Toolbox' application (Matlab 2008a) and the database was also created, through ODBC (Open Database Connectivity) drivers. This interface allows users working in Matlab to manipulate the database. (author)

  2. Management of venous hypertension following arteriovenous fistula creation for hemodialysis access

    Directory of Open Access Journals (Sweden)

    Varun Mittal

    2016-01-01

Full Text Available Introduction: Venous hypertension (VH) is a distressing complication following the creation of an arteriovenous fistula (AVF). The aim of management is to relieve edema with preservation of the AVF. Extensive edema increases surgical morbidity with the loss of hemodialysis access. We present our experience in the management of VH. Methods: A retrospective study was conducted on 37 patients with VH managed between July 2005 and May 2014. Patient demographics, evaluation, and procedures performed were noted. A successful outcome of management with surgical ligation (SL), angioembolization (AE), balloon dilatation (BD) or endovascular stent (EVS) was defined, respectively, by immediate disappearance of thrill and murmur with resolution of edema in the next 48-72 h; no demonstrable flow during check angiogram; and resolution of edema with preservation of the AVF. Results: All 8 distal AVFs had peripheral venous stenosis and were managed with SL in 7 patients and BD in one patient. In 29 proximal AVFs, central and peripheral venous stenosis was present in 16 and 13 patients, respectively. SL, AE, BD, and BD with EVS were done in 18, 5, 4, and 3 patients, respectively. All patients had a successful outcome. SL was associated with wound-related complications in 11 (29.73%) patients. A total of 7 AVFs were salvaged. One had restenosis after BD and was managed with AE. BD, EVS, and AE had no associated morbidity. Conclusions: Management of central and peripheral venous stenosis with VH should be individualized; in selected cases it seems preferable to secure a new access in another limb and close the native AVF in the edematous limb for a better overall outcome.

  3. Translation from the collaborative OSM database to cartography

    Science.gov (United States)

    Hayat, Flora

    2018-05-01

The OpenStreetMap (OSM) database includes original items that are very useful for geographical analysis and for creating thematic maps. Contributors record in the open database various themes regarding amenities, leisure, transport, buildings and boundaries. The Michelin mapping department develops map prototypes to test the feasibility of mapping based on OSM. A research project is in development to translate the OSM database structure into a database structure fitted to Michelin graphic guidelines; it aims at defining the right structure for Michelin's uses. The research project relies on the analysis of semantic and geometric heterogeneities in OSM data. To that end, Michelin implements methods to transform the input geographical database into a cartographic image dedicated to specific uses (routing and tourist maps). The paper focuses on the mapping tools available to produce a personalised spatial database. Based on the processed data, paper and Web maps can be displayed. Two prototypes are described in this article: a vector tile web map and a mapping method to produce paper maps at a regional scale. The vector tile mapping method offers easy navigation within the map and within graphic and thematic guidelines. Paper maps can be partly automatically drawn; the drawing automation and data management are part of the map creation, as is the final hand-drawing phase. Both prototypes have been set up using the OSM technical ecosystem.

  4. The IAEA's Net Enabled Waste Management Database: Overview and current status

    International Nuclear Information System (INIS)

    Csullog, G.W.; Bell, M.J.; Pozdniakov, I.; Petison, G.; Kostitsin, V.

    2002-01-01

The IAEA's Net Enabled Waste Management Database (NEWMDB) contains information on national radioactive waste management programmes and organizations, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. The NEWMDB, which was launched on the Internet on 6 July 2001, is the successor to the IAEA's Waste Management Database (WMDB), which was in use during the 1990s. The NEWMDB's first data collection cycle took place from July 2001 to March 2002. This paper provides an overview of the NEWMDB, describes the results of the first data collection cycle, and discusses the way forward for additional data collection cycles. Three companion papers describe (1) the role of the NEWMDB as an international source of information about radioactive waste management, (2) issues related to the variety of waste classification schemes used by IAEA Member States, and (3) the NEWMDB in the context of an indicator of sustainable development for radioactive waste management. (author)

  5. KNOWLEDGE SCIENCES AND NANATSUDAKI: A NEW MODEL OF KNOWLEDGE CREATION PROCESSES

    Institute of Scientific and Technical Information of China (English)

    Andrzej P.WIERZBICKI; Yoshiteru NAKAMORI

    2007-01-01

The paper starts from a discussion of the concepts of knowledge management versus technology management, and the emergence of the knowledge sciences. This is followed by a summary of recent results in the theory of knowledge creation. Most of them concern diverse spirals of creative interplay between rational (explicit) and intuitive or emotional (tacit) aspects of knowledge. Some of them concentrate on organizational (market- or purpose-oriented) knowledge creation; others describe academic (research-oriented) knowledge creation. The problem addressed in this paper is how to integrate diverse spirals of knowledge creation into a prescriptive or exemplar model that would help to overcome the differences between organizational (market-oriented) and normal academic knowledge creation. As such a prescriptive approach, the JAIST Nanatsudaki Model of knowledge creation is proposed. It consists of seven spirals, known from other studies, but integrated in a sequence resulting from the authors' experience in the practical management of research activities. Not all of these spirals have to be fully utilized, depending on a particular application, but all of them relate to some essential aspects of either academic or organizational knowledge creation. The paper presents the Nanatsudaki Model in detail with comments on consecutive spirals. The results of a survey of opinions about creativity conditions at JAIST indicate the importance of many spirals constituting the Nanatsudaki Model. Directions for further testing of the Nanatsudaki Model are indicated.

  6. Text-Mining Applications for Creation of Biofilm Literature Database

    Directory of Open Access Journals (Sweden)

    Kanika Gupta

    2017-10-01

In the present research, a published corpus of 34,306 documents on biofilm was collected from the PubMed database, along with non-indexed resources such as books, conference proceedings, newspaper articles, etc., and these were divided into five categories: classification, growth and development, physiology, drug effects and radiation effects. Each of these five categories was further divided into three parts, i.e. Journal Title, Abstract Title, and Abstract Text, to make indexing highly specific. Text processing was done using the software Rapid Miner_v5.3, which tokenizes the entire text into words and provides the frequency of each word within the document. The obtained words were normalized using the Remove Stop Word and Stem Word operations of Rapid Miner_v5.3, which remove stop words and reduce words to their stems. The obtained words were stored in MS-Excel 2007 and were sorted in decreasing order of frequency using the Sort & Filter command of MS-Excel 2007. The words were visualized through networks obtained with Cytoscape_v2.7.0. The words obtained were highly specific to biofilms, generating a controlled biofilm vocabulary, and this vocabulary could be used for indexing articles on biofilm (similar to the MeSH database, which indexes articles for PubMed). The obtained keyword information was stored in a relational database which is locally hosted using the WAMP_v2.4 (Windows, Apache, MySQL, PHP) server. The available biofilm vocabulary will be significant for researchers studying the biofilm literature, making their searches easy and efficient.
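
The tokenize / remove-stop-words / stem / frequency-sort pipeline can be reproduced in miniature; the stop-word list and the crude suffix stemmer below are illustrative stand-ins for the RapidMiner operators, not their actual behavior:

```python
# Toy reproduction of the text-processing pipeline: tokenize, drop stop
# words, apply a crude suffix stemmer, and frequency-sort the result.
import re
from collections import Counter

STOP_WORDS = {"the", "of", "and", "in", "a", "is", "for", "to"}

def stem(word):
    """Crude stand-in for a real stemmer: strip a few common suffixes."""
    for suffix in ("ing", "ation", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def vocabulary(text):
    """Return (stem, frequency) pairs in decreasing order of frequency."""
    tokens = re.findall(r"[a-z]+", text.lower())
    stems = [stem(t) for t in tokens if t not in STOP_WORDS]
    return Counter(stems).most_common()

vocab = vocabulary("Biofilms form in biofilm communities; biofilm formation resists drugs.")
```

The sorted pairs play the role of the Excel frequency table, and the top stems are the candidate entries for the controlled vocabulary.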

  7. Value Encounters - Modeling and Analyzing Co-creation of Value

    Science.gov (United States)

    Weigand, Hans

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value encounters are defined as interaction spaces where a group of actors meet and derive value by each one bringing in some of its own resources. They can be analyzed from multiple strategic perspectives, including knowledge management, social network management and operational management. Value encounter modeling can be instrumental in the context of service analysis and design.

  8. A Framework for Mapping User-Designed Forms to Relational Databases

    Science.gov (United States)

    Khare, Ritu

    2011-01-01

    In the quest for database usability, several applications enable users to design custom forms using a graphical interface, and forward engineer the forms into new databases. The path-breaking aspect of such applications is that users are completely shielded from the technicalities of database creation. Despite this innovation, the process of…

  9. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    Science.gov (United States)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of these massive amounts of data generated by the MUSER, in this paper, a novel data management technique called the negative database (ND) is proposed and used to implement a data management system for the MUSER. Based on the key-value database, the ND technique makes complete utilization of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of a relational database management system (RDBMS). The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required in next-generation telescopes.
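
The core idea of a negative database, storing the complement of the data within a known finite key space and deriving the present records on demand, can be illustrated with a toy sketch (the key space and record names here are invented, not the MUSER schema):

```python
# Toy negative-database sketch: within a known finite key space, store only
# the *absent* keys and derive the existing records by set complement.

KEY_SPACE = {f"frame-{i:04d}" for i in range(10)}   # all possible record keys

absent = {"frame-0003", "frame-0007"}               # what the ND actually stores

def present_keys():
    """Derive the existing records as the complement of the stored set."""
    return KEY_SPACE - absent

def exists(key):
    """Membership test: in the key space and not recorded as absent."""
    return key in KEY_SPACE and key not in absent
```

When, as with regularly sampled observations, most keys are present, the absent set is far smaller than the data itself, which is the storage saving the abstract reports.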

  10. Zebrafish Database: Customizable, Free, and Open-Source Solution for Facility Management.

    Science.gov (United States)

    Yakulov, Toma Antonov; Walz, Gerd

    2015-12-01

    Zebrafish Database is a web-based customizable database solution, which can be easily adapted to serve both single laboratories and facilities housing thousands of zebrafish lines. The database allows the users to keep track of details regarding the various genomic features, zebrafish lines, zebrafish batches, and their respective locations. Advanced search and reporting options are available. Unique features are the ability to upload files and images that are associated with the respective records and an integrated calendar component that supports multiple calendars and categories. Built on the basis of the Joomla content management system, the Zebrafish Database is easily extendable without the need for advanced programming skills.

  11. CPU and cache efficient management of memory-resident databases

    NARCIS (Netherlands)

    Pirk, H.; Funke, F.; Grund, M.; Neumann, T.; Leser, U.; Manegold, S.; Kemper, A.; Kersten, M.L.

    2013-01-01

    Memory-Resident Database Management Systems (MRDBMS) have to be optimized for two resources: CPU cycles and memory bandwidth. To optimize for bandwidth in mixed OLTP/OLAP scenarios, the hybrid or Partially Decomposed Storage Model (PDSM) has been proposed. However, in current implementations,

  12. CPU and Cache Efficient Management of Memory-Resident Databases

    NARCIS (Netherlands)

    H. Pirk (Holger); F. Funke; M. Grund; T. Neumann (Thomas); U. Leser; S. Manegold (Stefan); A. Kemper (Alfons); M.L. Kersten (Martin)

    2013-01-01

Memory-Resident Database Management Systems (MRDBMS) have to be optimized for two resources: CPU cycles and memory bandwidth. To optimize for bandwidth in mixed OLTP/OLAP scenarios, the hybrid or Partially Decomposed Storage Model (PDSM) has been proposed. However, in current

  13. Analysis of functionality free CASE-tools databases design

    Directory of Open Access Journals (Sweden)

    A. V. Gavrilov

    2016-01-01

Full Text Available The introduction of CASE technologies for database design into the educational process requires an institution to bear significant costs for the purchase of software. A possible solution could be the use of free software analogues. At the same time, this kind of substitution should be based on an even-handed representation of the functional characteristics and operating features of these programs. The purpose of the article is a review of free and non-profit CASE tools for database design, as well as their classification on the basis of an analysis of their functionality. In writing this article, materials from the official websites of the tool developers were used. Evaluation of the functional characteristics of CASE tools for database design was made exclusively empirically, through direct work with the software products. The analysis of the tools' functionality allows two categories of CASE tools for database design to be distinguished. The first category includes systems with a basic set of features and tools. The most important basic functions of these systems are: management of connections to database servers, visual tools to create and modify database objects (tables, views, triggers, procedures), the ability to enter and edit data in table mode, user and privilege management tools, an SQL-code editor, and means of exporting/importing data. CASE systems of the first category can be used to design and develop simple databases and to manage data, as well as a means of administering a database server. A distinctive feature of the second category of CASE tools for database design (full-featured systems) is the presence of a visual designer, which allows the construction of the database model and the automatic creation of the database on the server based on this model. CASE systems of this category can be used for the design and development of databases of any structural complexity, as well as a database server administration tool. The article concluded that the

  14. Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties

    DEFF Research Database (Denmark)

    Frank, Lars; Ulslev Pedersen, Rasmus

    2014-01-01

In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily is inconsistent. It is also important that disconnected locations can operate in a meaningful way in so-called disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has...

  15. Storage and Database Management for Big Data

    Science.gov (United States)

    2015-07-27

cloud models that satisfy different problem ... Data loss can only occur if three drives fail prior to any one of the failures being corrected. Hadoop is written in Java and is installed in a ... visible view into a dataset. There are many popular database management systems such as MySQL [4], PostgreSQL [63], and Oracle [5]. Most commonly

  16. Ageing Management Program Database

    International Nuclear Information System (INIS)

    Basic, I.; Vrbanic, I.; Zabric, I.; Savli, S.

    2008-01-01

The aspects of plant ageing management (AM) have gained increasing attention over the last ten years. Numerous technical studies have been performed to study the impact of ageing mechanisms on the safe and reliable operation of nuclear power plants. National research activities have been initiated or are in progress to provide the technical basis for decision-making processes. The long-term operation of nuclear power plants is influenced by economic considerations, the socio-economic environment including public acceptance, developments in research and the regulatory framework, the availability of technical infrastructure to maintain and service the systems, structures and components, as well as qualified personnel. Besides national activities there are a number of international activities, in particular under the umbrella of the IAEA, the OECD and the EU. The paper discusses the process, procedure and database developed for the Slovenian Nuclear Safety Administration's (SNSA) surveillance of the ageing process of the Krsko Nuclear Power Plant. (author)

  17. Knowledge Value Creation Characteristics of Virtual Teams: A Case Study in the Construction Sector

    Science.gov (United States)

    Vorakulpipat, Chalee; Rezgui, Yacine

    Any knowledge environment aimed at virtual teams should promote identification, access, capture and retrieval of relevant knowledge anytime / anywhere, while nurturing the social activities that underpin the knowledge sharing and creation process. In fact, socio-cultural issues play a critical role in the successful implementation of Knowledge Management (KM), and constitute a milestone towards value creation. The findings indicate that Knowledge Management Systems (KMS) promote value creation when they embed and nurture the social conditions that bind and bond team members together. Furthermore, technology assets, human networks, social capital, intellectual capital, and change management are identified as essential ingredients that have the potential to ensure effective knowledge value creation.

  18. Waves of Knowledge Management: The Flow between Explicit and Tacit Knowledge

    OpenAIRE

    Roxanne H. Stevens; Joshua Millage; Sondra Clark

    2010-01-01

Problem statement: Knowledge Management (KM) is often equated with content management. Indeed, robust knowledge management processes include a database; but information becomes knowledge when it is understood, manipulated and tied to a purpose or idea. By equating KM with content management and by equating the purpose of KM with predictability and control, companies may inadvertently de-emphasize knowledge creation and transfer. To keep pace with global market dynamics, an explici...

  19. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    Science.gov (United States)

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
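
The general pattern the interface relies on, SQL assembled as plain text by the host language, executed by the external DBMS, and results read back, can be sketched with sqlite3 standing in for the SYBASE server (the schema and values are invented):

```python
# Generic sketch of the host-language-to-RDBMS pattern: build an SQL
# statement as an ASCII string, ship it to the DBMS, read textual rows back.
# sqlite3 stands in here for the external SYBASE server.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE patient (id INTEGER, name TEXT)")
con.execute("INSERT INTO patient VALUES (1, 'DOE,JOHN')")

# The "MUMPS side": an SQL command assembled as an ASCII string...
sql_text = "SELECT name FROM patient WHERE id = 1"

# ...executed by the DBMS, with the result set returned as text values.
result = [row[0] for row in con.execute(sql_text)]
```

The separation is the same as in the paper: the host language never touches the database files directly, so other languages and vendor tools can share the same data.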

  20. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective of such systems is to provide users with capabilities to conveniently search, access and play back distributed stored video data in the same way as they do for traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the content of video data is very hard to extract automatically and needs to be annotated manually. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad-hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structure (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news videos and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  1. Development of subsurface drainage database system for use in environmental management issues

    International Nuclear Information System (INIS)

    Azhar, A.H.; Rafiq, M.; Alam, M.M.

    2007-01-01

    A simple, user-friendly, menu-driven system for database management pertinent to the Impact of Subsurface Drainage Systems on Land and Water Conditions (ISLaW) has been developed for use in environment-management issues of the drainage areas. This database has been developed by integrating four software packages, viz. Microsoft Excel, MS Word, Acrobat and MS Access. The information, in the form of tables and figures, with respect to various drainage projects has been presented in MS Word files. The major data-sets of the various subsurface drainage projects included in the ISLaW database are: i) technical aspects, ii) groundwater and soil-salinity aspects, iii) socio-technical aspects, iv) agro-economic aspects, and v) operation and maintenance aspects. The various ISLaW files can be accessed simply by clicking the menu buttons of the database system. This database not only gives feedback on the functioning of different subsurface drainage projects with respect to the above-mentioned aspects, but also serves as a resource document for these data for future studies on other drainage projects. The developed database system is useful for planners, designers and farmers' organisations for improved operation of existing drainage projects as well as development of future ones. (author)

  2. NM WAIDS: A PRODUCED WATER QUALITY AND INFRASTRUCTURE GIS DATABASE FOR NEW MEXICO OIL PRODUCERS

    Energy Technology Data Exchange (ETDEWEB)

    Martha Cather; Robert Lee; Ibrahim Gundiler; Andrew Sung

    2003-09-24

    The New Mexico Water and Infrastructure Data System (NM WAIDS) seeks to alleviate a number of produced water-related issues in southeast New Mexico. The project calls for the design and implementation of a Geographical Information System (GIS) and integral tools that will provide operators and regulators with necessary data and useful information to help them make management and regulatory decisions. The major components of this system are: (1) Databases on produced water quality, cultural and groundwater data, oil pipeline and infrastructure data, and corrosion information. (2) A web site capable of displaying produced water and infrastructure data in a GIS or accessing some of the data by text-based queries. (3) A fuzzy logic-based, site risk assessment tool that can be used to assess the seriousness of a spill of produced water. (4) A corrosion management toolkit that will provide operators with data and information on produced waters that will aid them in deciding how to address corrosion issues. The various parts of NM WAIDS will be integrated into a website with a user-friendly interface that will provide access to previously difficult-to-obtain data and information. Primary attention during the first six months of this project was focused on creating the water quality databases for produced water and surface water, along with collecting of corrosion information and building parts of the corrosion toolkit. Work on the project to date includes: (1) Creation of a water quality database for produced water analyses. The database was compiled from a variety of sources and currently has over 7000 entries for New Mexico. (2) Creation of a web-based data entry system for the water quality database. This system allows a user to view, enter, or edit data from a web page rather than having to directly access the database. (3) Creation of a semi-automated data capturing system for use with standard water quality analysis forms. This system improves the accuracy and speed
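
    The fuzzy-logic risk assessment component mentioned above can be illustrated with a minimal sketch. The membership functions, variable names and rule combination below are illustrative assumptions, not the actual NM WAIDS model.

```python
# Triangular membership functions are a common building block of
# fuzzy-logic scoring systems like the spill risk tool described above.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def spill_risk(volume_bbl, distance_to_water_m):
    """Crude two-input risk score in [0, 1] (hypothetical rule base)."""
    large = tri(volume_bbl, 10, 500, 1000)          # degree of "large spill"
    close = tri(distance_to_water_m, -1, 0, 500)    # degree of "close to water"
    return max(min(large, close), 0.2 * large)      # AND rule plus a floor term

print(round(spill_risk(500, 100), 2))  # 0.8
```

    A real tool would defuzzify the combined rule outputs into a ranked severity class, but the membership/rule structure is the same.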

  3. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    Science.gov (United States)

    Owens, John

    2009-01-01

    Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, interactive HTML pages, or exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
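
    The relational layout the paper advocates can be sketched in a few lines: sequence records in one table, free-text annotation in another, unified on demand by the database manager. `sqlite3` stands in for the commercial managers named in the abstract, and the table, column and sample values are illustrative, not from the paper.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sequence (
        seq_id INTEGER PRIMARY KEY,
        clone  TEXT NOT NULL,
        dna    TEXT NOT NULL
    );
    CREATE TABLE annotation (
        seq_id INTEGER REFERENCES sequence(seq_id),
        note   TEXT
    );
""")
conn.execute("INSERT INTO sequence VALUES (1, 'mAb-17', 'GAAGTGCAGCTG')")
conn.execute("INSERT INTO annotation VALUES (1, 'heavy chain, V-region')")

# The join is where independent project tables are "selectively unified".
row = conn.execute("""
    SELECT s.clone, s.dna, a.note
    FROM sequence s JOIN annotation a USING (seq_id)
""").fetchone()
print(row)  # ('mAb-17', 'GAAGTGCAGCTG', 'heavy chain, V-region')
```

    Keeping annotation in its own table means new kinds of notes can be attached without restructuring the sequence data, which is the flexibility argument the abstract makes for relational managers.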

  4. The development of technical database of advanced spent fuel management process

    Energy Technology Data Exchange (ETDEWEB)

    Ro, Seung Gy; Byeon, Kee Hoh; Song, Dae Yong; Park, Seong Won; Shin, Young Jun

    1999-03-01

    The purpose of this study is to develop a technical database system that provides useful information to researchers studying the back end of the nuclear fuel cycle. A technical database of the advanced spent fuel management process was developed as a prototype system in 1997. In 1998, this database system was improved into a multi-user system, and a special database composed of thermochemical formation data and reaction data was appended. In this report, the detailed specification of our system design is described and the operating methods are illustrated as a user's manual. This report is also useful as a reference when expanding the current system or interfacing it with other systems. (Author). 10 refs., 18 tabs., 46 fig.

  5. The development of technical database of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, Seung Gy; Byeon, Kee Hoh; Song, Dae Yong; Park, Seong Won; Shin, Young Jun

    1999-03-01

    The purpose of this study is to develop a technical database system that provides useful information to researchers studying the back end of the nuclear fuel cycle. A technical database of the advanced spent fuel management process was developed as a prototype system in 1997. In 1998, this database system was improved into a multi-user system, and a special database composed of thermochemical formation data and reaction data was appended. In this report, the detailed specification of our system design is described and the operating methods are illustrated as a user's manual. This report is also useful as a reference when expanding the current system or interfacing it with other systems. (Author). 10 refs., 18 tabs., 46 fig

  6. Reldata - a tool for reliability database management

    International Nuclear Information System (INIS)

    Vinod, Gopika; Saraf, R.K.; Babar, A.K.; Sanyasi Rao, V.V.S.; Tharani, Rajiv

    2000-01-01

    Component failure, repair and maintenance data are a very important element of any Probabilistic Safety Assessment study. The credibility of the results of such a study is enhanced if the data used are generated from the operating experience of similar power plants. Towards this objective, a computerised database has been designed, with fields such as date and time of failure, component name, failure mode, failure cause, means of failure detection, reactor operating power status, repair times, down time, etc. This leads to the evaluation of plant-specific failure rates and on-demand failure probability/unavailability for all components. Systematic data updating can provide real-time component reliability parameter statistics and trend analysis, which helps in planning maintenance strategies. A software package, RELDATA, has been developed which incorporates the database management and data analysis methods. This report describes the software features and underlying methodology in detail. (author)
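
    The two point estimates such a database yields, failure rate for running components and on-demand failure probability for standby ones, can be sketched directly from failure records. The record layout and numbers below are illustrative assumptions, not RELDATA's actual schema.

```python
# Minimal failure-record layout: running components accumulate operating
# hours, standby components accumulate demands.
records = [
    {"component": "pump-A", "failures": 3, "op_hours": 26_000},
    {"component": "valve-B", "failures": 2, "demands": 400},
]

def failure_rate(rec):
    """Point estimate: failures per hour of operating experience."""
    return rec["failures"] / rec["op_hours"]

def on_demand_prob(rec):
    """Point estimate: failures per demand for standby components."""
    return rec["failures"] / rec["demands"]

print(f"{failure_rate(records[0]):.2e}")   # 1.15e-04 failures per hour
print(on_demand_prob(records[1]))          # 0.005 per demand
```

    Re-running these estimators after each data update is what turns systematic data collection into the real-time parameter statistics and trend analysis the abstract describes.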

  7. Development of a computational database for application in Probabilistic Safety Analysis of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner dos Santos

    2016-01-01

    The objective of this work is to present the computational database that was developed to store technical information and process data on component operation, failure and maintenance for the nuclear research reactors located at the Nuclear and Energy Research Institute (Instituto de Pesquisas Energéticas e Nucleares, IPEN), in São Paulo, Brazil. Data extracted from this database may be applied in the Probabilistic Safety Analysis of these research reactors or in less complex quantitative assessments related to safety, reliability, availability and maintainability of these facilities. This database may be accessed by users of the corporate network, named IPEN intranet. Professionals who require access to the database must be duly registered by the system administrator, so that they will be able to consult and handle the information. The logical model adopted to represent the database structure is an entity-relationship model, which is in accordance with the protocols installed in IPEN intranet. The open-source relational database management system called MySQL, which is based on the Structured Query Language (SQL), was used in the development of this work. The PHP programming language was adopted to allow users to handle the database. Finally, the main result of this work was the creation of a web application for the component reliability database, named PSADB, specifically developed for the research reactors of IPEN; furthermore, the database management system provides relevant information efficiently. (author)

  8. Nuclear plant operations, maintenance, and configuration management using three-dimensional computer graphics and databases

    International Nuclear Information System (INIS)

    Tutos, N.C.; Reinschmidt, K.F.

    1987-01-01

    Stone and Webster Engineering Corporation has developed the Plant Digital Model concept as a new approach to Configuration Management of nuclear power plants. The Plant Digital Model development is a step-by-step process, based on existing manual procedures and computer applications, and is fully controllable by the plant managers and engineers. The Plant Digital Model is based on IBM computer graphics and relational database management systems, and therefore can be easily integrated with existing plant databases and corporate management-information systems

  9. A survey of the use of database management systems in accelerator projects

    OpenAIRE

    Poole, John; Strubin, Pierre M

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accele...

  10. Value encounters - Modeling and analyzing co-creation of value

    NARCIS (Netherlands)

    Weigand, H.; Godart, C.; Gronau, N.; Sharma, S.; Canals, G.

    2009-01-01

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value

  11. Value encounters : Modelling and analyzing co-creation of value

    NARCIS (Netherlands)

    Weigand, H.; Jayasinghe Arachchig, J.

    2009-01-01

    Recent marketing and management literature has introduced the concept of co-creation of value. Current value modeling approaches such as e3-value focus on the exchange of value rather than co-creation. In this paper, an extension to e3-value is proposed in the form of a “value encounter”. Value

  12. Taking Stock of Project Value Creation

    DEFF Research Database (Denmark)

    Laursen, Markus; Svejvig, Per

    2014-01-01

    This paper presents the outcome of a literature review through classifying and analyzing 59 publications in project value creation literature. The analysis led to five distinct categories: Benefit Realization Management (BRM) and techniques, broad value perspective, value time frame, engineering...... requirements and product development. These five categories cover a wide selection of value creation literature in project contexts. The project types reported in empirical studies are mainly IS/IT and construction and a variety of other types such as R&D and strategy implementation. The literature dates back...

  13. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Management

    Science.gov (United States)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapid searching of unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WEBDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
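
    The "context and content" keyword search can be illustrated with a small sketch: once documents are normalized to XML, each element is indexed by its tag path (context) and its text (content), and a keyword may match either. The document, functions and matching logic below are illustrative assumptions, not NETMARK's actual implementation.

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<report><author>Maluf</author><body>schema-less storage</body></report>"
)

def index(elem, path=""):
    """Walk the tree, yielding (tag path, element text) pairs."""
    p = f"{path}/{elem.tag}"
    yield p, (elem.text or "").strip()
    for child in elem:
        yield from index(child, p)

def search(elem, keyword):
    """Return paths whose context (tag path) or content (text) match."""
    return [p for p, text in index(elem)
            if keyword in p or keyword in text]

print(search(doc, "author"))    # matches by context: ['/report/author']
print(search(doc, "storage"))   # matches by content: ['/report/body']
```

    A production system would back this index with database storage rather than an in-memory walk, but the dual context/content match is the idea.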

  14. Contextual-Based Knowledge Creation for Agroindustrial Innovation

    Directory of Open Access Journals (Sweden)

    Elisa Anggraeni

    2017-08-01

    Full Text Available This paper discusses the knowledge creation process in one department, in a higher educational context, and the possible actions to take to improve the efficiency and effectiveness of the knowledge creation system in it. We conducted a case study at one department of a university that strives to improve its innovations, in terms of their quantity and quality. We used a soft systems methodology to investigate the knowledge creation system in the chosen department. From the study, we conclude that the department can be considered a learning organization, within which its staff continually create, acquire and transfer knowledge. This department has a conducive learning environment, concrete learning processes, and leadership that reinforces learning. In the context of producing agroindustry innovations, the knowledge creation system in this department is considered to be less effective, since it happens mostly at the individual or small-group level. To improve its effectiveness, the management may facilitate the institutionalization of knowledge creation processes at every phase of the interactions between tacit and explicit knowledge.

  15. Benefits of a relational database for computerized management

    International Nuclear Information System (INIS)

    Shepherd, W.W.

    1991-01-01

    This paper reports on a computerized relational database which is the basis for a hazardous materials information management system which is comprehensive, effective, flexible and efficient. The system includes product information for Material Safety Data Sheets (MSDSs), labels, shipping, and the environment and is used in Dowell Schlumberger (DS) operations worldwide for a number of programs including planning, training, emergency response and regulatory compliance

  16. Customer Experience Creation : Determinants, Dynamics and Management Strategies

    NARCIS (Netherlands)

    Verhoef, Peter C.; Lemon, Katherine N.; Parasuraman, A.; Roggeveen, Anne; Tsiros, Michael; Schlesinger, Leonard A.; Schlessinger, L.L.

    2009-01-01

    Retailers, such as Starbucks and Victoria's Secret, aim to provide customers a great experience across channels. In this paper we provide an overview of the existing literature on customer experience and expand on it to examine the creation of a customer experience from a holistic perspective. We

  17. Development of a Framework for Multimodal Research: Creation of a Bibliographic Database

    National Research Council Canada - National Science Library

    Coovert, Michael D; Gray, Ashley A; Elliott, Linda R; Redden, Elizabeth S

    2007-01-01

    .... The results of the overall effort, the multimodal framework and article tracking sheet, bibliographic database, and searchable multimodal database make substantial and valuable contributions to the accumulation and interpretation of multimodal research. References collected in this effort are listed in the appendix.

  18. The LHCb configuration database

    CERN Document Server

    Abadie, L; Van Herwijnen, Eric; Jacobsson, R; Jost, B; Neufeld, N

    2005-01-01

    The aim of the LHCb configuration database is to store information about all the controllable devices of the detector. The experiment's control system (which uses PVSS) will configure, start up and monitor the detector from the information in the configuration database. The database will contain devices with their properties, connectivity and hierarchy. The ability to store and rapidly retrieve huge amounts of data, and the navigability between devices, are important requirements. We have collected use cases to ensure the completeness of the design. Using the entity relationship modelling technique we describe the use cases as classes with attributes and links. We designed the schema for the tables using relational diagrams. This methodology has been applied to the TFC (switches) and DAQ system. Other parts of the detector will follow later. The database has been implemented using Oracle to benefit from central CERN database support. The project also foresees the creation of tools to populate, maintain, and co...
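
    The device hierarchy and navigability requirement maps naturally onto a parent-linked table walked with a recursive query. The sketch below uses `sqlite3` (the actual database is Oracle-based) and illustrative device names, not the real LHCb schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE device (
        name   TEXT PRIMARY KEY,
        parent TEXT REFERENCES device(name)   -- NULL at the root
    );
    INSERT INTO device VALUES ('DAQ', NULL),
                              ('switch-1', 'DAQ'),
                              ('port-1', 'switch-1');
""")

# Navigate from a leaf device up to the root with a recursive CTE.
rows = conn.execute("""
    WITH RECURSIVE tree(name) AS (
        SELECT 'port-1'
        UNION ALL
        SELECT d.parent FROM device d JOIN tree t ON d.name = t.name
                        WHERE d.parent IS NOT NULL
    )
    SELECT name FROM tree
""").fetchall()
print([r[0] for r in rows])  # ['port-1', 'switch-1', 'DAQ']
```

    The same table, queried downward instead of upward, yields the subtree a control-system operation must configure or monitor.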

  19. Data management in the TJ-II multi-layer database

    International Nuclear Information System (INIS)

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.; Fabregas, J.A.; Herrera, R.

    2000-01-01

    The handling of TJ-II experimental data is performed by means of several software modules. These modules provide the resources for data capture, data storage and management, data access as well as general-purpose data visualisation. Here we describe the module related to data storage and management. We begin by introducing the categories in which data can be classified. Then, we describe the TJ-II data flow through the several file systems involved, before discussing the architecture of the TJ-II database. We review the concept of the 'discharge file' and identify the drawbacks that would result from a direct application of this idea to the TJ-II data. In order to overcome these drawbacks, we propose alternatives based on our concepts of signal family, user work-group and data priority. Finally, we present a model for signal storage. This model is in accordance with the database architecture and provides a proper framework for managing the TJ-II experimental data. In the model, the information is organised in layers and is distributed according to the generality of the information, from the common fields of all signals (first layer), passing through the specific records of signal families (second layer) and reaching the particular information of individual signals (third layer)
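
    The three-layer lookup this model implies can be sketched as a merge of dictionaries, most specific last: common fields shared by all signals, then the family record, then the individual signal's own fields. The layer contents and names below are illustrative assumptions, not the actual TJ-II schema.

```python
# Layer 1: fields common to all signals.
common = {"facility": "TJ-II", "byte_order": "little-endian"}

# Layer 2: records specific to each signal family.
family = {"magnetics": {"units": "T", "sampling_khz": 100}}

# Layer 3: information particular to individual signals.
signal = {"MID5": {"family": "magnetics", "coil_position": "sector 5"}}

def describe(name):
    """Resolve a signal's full description by merging the three layers."""
    sig = signal[name]
    return {**common, **family[sig["family"]], **sig}

info = describe("MID5")
print(info["units"], info["facility"])  # T TJ-II
```

    Storing shared fields once per layer rather than once per signal is what keeps the scheme manageable when a family gains thousands of member signals.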

  20. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER database is an advanced database for integrated management of Liquid Metal Reactor design technology development using web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results produced during phase II of Liquid Metal Reactor design technology development under the mid- and long-term nuclear R&D program. IOC is a linkage control system between sub-projects for sharing and integrating the research results for KALIMER. The 3D CAD database provides a schematic design overview of KALIMER. The Team Cooperation system informs team members of research cooperation activities and meetings. Finally, KALIMER Reserved Documents was developed to manage data and documents collected since project accomplishment. This report describes the hardware and software features and the database design methodology for KALIMER

  1. Use of an INGRES database to implement the beam parameter management at GANIL

    International Nuclear Information System (INIS)

    Gillette, P.; Lecorche, E.; Lermine, P.; Maugeais, C.; Leboucher, Ch.; Moscatello, M.H.; Pain, P.

    1995-01-01

    Since the beginning of operation under the new GANIL control system in February 1993, the relational database management system (RDBMS) Ingres has been more and more widely used. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned. It has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including how the physical aspects of GANIL tuning have been integrated in such an environment. (author)

  2. Knowledge creation for practice in public sector management accounting by consultants and academics: Preliminary findings and directions for future research

    NARCIS (Netherlands)

    van Helden, G.J.; Aardema, H.; ter Bogt, H.J.; Groot, T.L.C.M.

    2010-01-01

    This study is about knowledge creation for practice in public sector management accounting by consultants and academics. It shows that researchers emphasize the importance of practice, but worry about the prospects of a successful cross-fertilization between practice and research, because of the

  3. Knowledge creation for practice in public sector management accounting by consultants and academics : Preliminary findings and directions for future research

    NARCIS (Netherlands)

    van Helden, G. Jan; Aardema, Harrie; ter Bogt, Henk J.; Groot, Tom L. C. M.

    This study is about knowledge creation for practice in public sector management accounting by consultants and academics. It shows that researchers emphasize the importance of practice, but worry about the prospects of a successful cross-fertilization between practice and research, because of the

  4. The Process of Creation and Consolidation Committees for Hydrographic Basin Management Water Resources

    Directory of Open Access Journals (Sweden)

    Mario Marcos Lopes Lopes

    2011-06-01

    Full Text Available Water is among the most precious goods in Earth's environmental heritage; however, economic activities have caused the contamination and degradation of surface and underground springs. Consequently, the need emerges to reconcile development with the management of natural resources. Several national and international conferences have taken place to spread this idea. In Brazil, this new model of water resources management is beginning to be implemented, culminating in the approval of the State Water Resources Policy and, later, the National Water Resources Policy. This legislation takes the river basin as the regional unit of water planning and management. The objective of this work is to present the evolution of the process of organization and creation of river basin committees. A literature search as well as documentary analysis (minutes, decisions) were used as the research methodology. The experience of basin committees is considered an innovation because it involves groups with effectively deliberative actions, incorporating guiding principles that favor shared management, supported by decentralization, integration and participation in the destiny of water resources in each region of the river basin. However, it is also necessary to intensify the involvement of users and other segments of society so that these groups can really work as a "Water Parliament".

  5. Blending Education and Polymer Science: Semiautomated Creation of a Thermodynamic Property Database

    Science.gov (United States)

    Tchoua, Roselyne B.; Qin, Jian; Audus, Debra J.; Chard, Kyle; Foster, Ian T.; de Pablo, Juan

    2016-01-01

    Structured databases of chemical and physical properties play a central role in the everyday research activities of scientists and engineers. In materials science, researchers and engineers turn to these databases to quickly query, compare, and aggregate various properties, thereby allowing for the development or application of new materials. The…

  6. Application of cloud database in the management of clinical data of patients with skin diseases.

    Science.gov (United States)

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of a cloud database in the daily practice of a dermatology department. A cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores of self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients were included and analyzed using the cloud database. Disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  7. Applying the archetype approach to the database of a biobank information management system.

    Science.gov (United States)

    Späth, Melanie Bettina; Grimson, Jane

    2011-03-01

    The purpose of this study is to investigate the feasibility of applying the openEHR archetype approach to modelling the data in the database of an existing proprietary biobank information management system. A biobank information management system stores the clinical/phenotypic data of the sample donor and sample related information. The clinical/phenotypic data is potentially sourced from the donor's electronic health record (EHR). The study evaluates the reuse of openEHR archetypes that have been developed for the creation of an interoperable EHR in the context of biobanking, and proposes a new set of archetypes specifically for biobanks. The ultimate goal of the research is the development of an interoperable electronic biomedical research record (eBMRR) to support biomedical knowledge discovery. The database of the prostate cancer biobank of the Irish Prostate Cancer Research Consortium (PCRC), which supports the identification of novel biomarkers for prostate cancer, was taken as the basis for the modelling effort. First the database schema of the biobank was analyzed and reorganized into archetype-friendly concepts. Then, archetype repositories were searched for matching archetypes. Some existing archetypes were reused without change, some were modified or specialized, and new archetypes were developed where needed. The fields of the biobank database schema were then mapped to the elements in the archetypes. Finally, the archetypes were arranged into templates specifically to meet the requirements of the PCRC biobank. A set of 47 archetypes was found to cover all the concepts used in the biobank. Of these, 29 (62%) were reused without change, 6 were modified and/or extended, 1 was specialized, and 11 were newly defined. These archetypes were arranged into 8 templates specifically required for this biobank. A number of issues were encountered in this research. Some arose from the immaturity of the archetype approach, such as immature modelling support tools

  8. Use of Knowledge Bases in Education of Database Management

    Science.gov (United States)

    Radványi, Tibor; Kovács, Emod

    2008-01-01

    In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that aid in teaching database management. You can follow the order of the course from the beginning, when some topics first appear in elementary school, through the topics covered in secondary…

  9. Information Technology: A challenge to the Creation and ...

    African Journals Online (AJOL)

    As organisations increasingly adopt the use of information and communication technologies, the corresponding increase in the creation of electronic records has brought about a number of records management challenges. These manifest themselves in a number of ways. Problems associated with the management of ...

  10. Distinctive Dynamic Capabilities for New Business Creation

    DEFF Research Database (Denmark)

    Rosenø, Axel; Enkel, Ellen; Mezger, Florian

    2013-01-01

    This study examines the distinctive dynamic capabilities for new business creation in established companies. We argue that these are very different from those for managing incremental innovation within a company's core business. We also propose that such capabilities are needed in both slow...... and fast-paced industries, and that similarities exist across industries. Hence, the study contributes to dynamic capabilities literature by: 1) identifying the distinctive dynamic capabilities for new business creation; 2) shifting focus away from dynamic capabilities in environments characterised by high...... clock-speed and uncertainty towards considering dynamic capabilities for the purpose of developing new businesses, which also implies a high degree of uncertainty. Based on interviews with 33 companies, we identify distinctive dynamic capabilities for new business creation, find that dynamic...

  11. [Conceptual foundations of creation of branch database of technology and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of healthcare sphere of Ukraine].

    Science.gov (United States)

    Horban', A Ie

    2013-09-01

    The implementation of state policy in the field of technology transfer in the medical branch, pursuant to the Law of Ukraine No 5407-VI of 02.10.2012 "On Amendments to the Law of Ukraine 'On State Regulation of Activity in the Field of Technology Transfer'", is considered, namely ensuring the formation of a branch database of technologies and intellectual property rights owned by scientific institutions, organizations, higher medical educational institutions and enterprises of the healthcare sphere of Ukraine and established from the budget. International and domestic experience in processing information about intellectual property rights and in implementing systems that support the transfer of new technologies is analyzed. The main conceptual principles for the creation of this branch technology transfer database and of the branch technology transfer network are defined.

  12. Implementation of a database for the management of radioactive sources

    International Nuclear Information System (INIS)

    MOHAMAD, M.

    2012-01-01

    In Madagascar, the application of nuclear technology continues to develop. In order to protect human health and the environment against the harmful effects of ionizing radiation, each user of radioactive sources has to implement a nuclear safety and security programme and to declare their sources to the Regulatory Authority. This Authority must have access to all the information relating to the sources and their uses. This work is based on the development of software using Python as the programming language and SQLite as the database, making it possible to computerize radioactive source management. The application unifies the various existing databases and centralizes radioactive source management activities. The objective is to follow the movement of each source within Malagasy territory in order to avoid the risks related to the use of radioactive sources and to illicit trafficking. [fr
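
The record above names Python and SQLite but gives no implementation detail. As a hedged illustration of the kind of source registry and movement tracking such a system needs, here is a minimal sketch using Python's stdlib sqlite3 module; the table layout, column names and sample values are assumptions, not the actual Malagasy schema:

```python
import sqlite3

# Hypothetical minimal schema; the actual registry schema is not published.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE source (
        serial_no   TEXT PRIMARY KEY,   -- manufacturer serial number
        isotope     TEXT NOT NULL,      -- e.g. 'Cs-137'
        activity_bq REAL NOT NULL,      -- activity at the reference date
        holder      TEXT NOT NULL,      -- licensed user
        location    TEXT NOT NULL       -- current site
    )""")
conn.execute("""
    CREATE TABLE movement (
        serial_no TEXT REFERENCES source(serial_no),
        moved_on  TEXT,                 -- ISO date of the transfer
        from_site TEXT,
        to_site   TEXT
    )""")

# Register a source, then record a transfer and update its location.
conn.execute("INSERT INTO source VALUES ('SRC-001', 'Cs-137', 3.7e9, 'Clinic A', 'Antananarivo')")
conn.execute("INSERT INTO movement VALUES ('SRC-001', '2012-05-10', 'Antananarivo', 'Toamasina')")
conn.execute("UPDATE source SET location = 'Toamasina' WHERE serial_no = 'SRC-001'")

# The regulator can now trace any source's whereabouts from its serial number.
row = conn.execute("SELECT isotope, location FROM source WHERE serial_no = 'SRC-001'").fetchone()
print(row)  # ('Cs-137', 'Toamasina')
```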

  13. The Creation and Operation of Internal High Performance Modern Enterprises Team

    Institute of Scientific and Technical Information of China (English)

    Shengyu WANG

    2015-01-01

    The future of an enterprise depends largely on product research and development, and for modern enterprises the high-performance project team is the most important vehicle for R &amp; D projects. Interviews and surveys of several enterprise R &amp; D teams show that whether an internal high-performance team succeeds or fails depends chiefly on whether its managers handle team creation and management properly; this is the most difficult part of the high-performance team management system, especially team leadership. On this basis, this paper discusses the creation and management of high-performance teams in modern enterprises, aiming to provide a valuable reference for enterprise team management.

  14. Operations management in automotive industries from industrial strategies to production resources management, through the industrialization process and supply chain to pursue value creation

    CERN Document Server

    Gobetto, Marco

    2014-01-01

    This book has proved its worth over the years as a text for courses in Production Management at the Faculty of Automotive Engineering in Turin, Italy, but deserves a wider audience as it presents a compendium of basics on Industrial Management, since it covers all major topics required. It treats all subjects from product development and “make or buy”-decision strategies to the manufacturing systems setting and management through analysis of the main resources needed in production and finally exploring the supply chain management and the procurement techniques. The very last chapter recapitulates the previous ones by analysing key management indicators to pursue the value creation that is the real purpose of every industrial enterprise. As an appendix, a specific chapter is dedicated to the basics of production management where all main relevant definitions, techniques and criteria are treated, including some numerical examples, in order to provide an adequate foundation for understanding the other chapte...

  15. Use of an INGRES database to implement the beam parameter management at GANIL

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, P.; Lecorche, E.; Lermine, P.; Maugeais, C.; Leboucher, Ch.; Moscatello, M.H.; Pain, P.

    1995-12-31

    Since the beginning of operation driven by the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been more and more widely used. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned and has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including how the physical aspects of Ganil tuning have been integrated in such an environment. (author). 2 refs.

  16. Use of an INGRES database to implement the beam parameter management at GANIL

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, P; Lecorche, E; Lermine, P; Maugeais, C; Leboucher, Ch; Moscatello, M H; Pain, P

    1996-12-31

    Since the beginning of operation driven by the new Ganil control system in February 1993, the relational database management system (RDBMS) Ingres has been more and more widely used. The most significant application relying on the RDBMS is the new beam parameter management, which has been entirely redesigned and has been operational since the end of the machine shutdown in July this year. After a short recall of the use of Ingres inside the control system, the organization of the parameter management is presented. Then the database implementation is shown, including how the physical aspects of Ganil tuning have been integrated in such an environment. (author). 2 refs.

  17. KALIMER database development

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database supporting integrated management of liquid metal reactor design technology development through Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D programmes. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. And the reserved documents database was developed to manage the documents and reports produced since the project's accomplishment.

  18. KALIMER database development

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Lee, Yong Bum; Jeong, Hae Yong; Ha, Kwi Seok

    2003-03-01

    The KALIMER database is an advanced database supporting integrated management of liquid metal reactor design technology development through Web applications. The KALIMER design database is composed of a results database, Inter-Office Communication (IOC), a 3D CAD database, and a reserved documents database. The results database holds the research results from all phases of liquid metal reactor design technology development under the mid- and long-term nuclear R and D programmes. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD database gives a schematic overview of the KALIMER design structure. And the reserved documents database was developed to manage the documents and reports produced since the project's accomplishment.

  19. SNPpy--database management for SNP data from genome wide association studies.

    Directory of Open Access Journals (Sweden)

    Faheem Mitha

    BACKGROUND: We describe SNPpy, a hybrid script/database system using the Python SQLAlchemy library coupled with the PostgreSQL database to manage genotype data from Genome-Wide Association Studies (GWAS). This system makes it possible to merge study data with HapMap data and to merge across studies for meta-analyses, including data filtering based on the values of phenotype and Single-Nucleotide Polymorphism (SNP) data. SNPpy and its dependencies are open source software. RESULTS: The current version of SNPpy offers utility functions to import genotype and annotation data from two commercial platforms. We use these to import data from two GWAS studies and the HapMap Project. We then export these individual datasets to standard data format files that can be imported into statistical software for downstream analyses. CONCLUSIONS: By leveraging the power of relational databases, SNPpy offers integrated management and manipulation of genotype and phenotype data from GWAS studies. The analysis of these studies requires merging across GWAS datasets as well as patient and marker selection. To this end, SNPpy enables the user to filter the data and output the results in standardized GWAS file formats. It performs flexible, low-level data validation, including validation of patient data. SNPpy is a practical and extensible solution for investigators who seek to deploy central management of their GWAS data.
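
SNPpy itself couples SQLAlchemy with PostgreSQL; as a self-contained illustration of the relational merging and phenotype-based filtering the abstract describes, here is a minimal sketch using the stdlib sqlite3 module instead. The table and column names are illustrative, not SNPpy's real schema:

```python
import sqlite3

# Toy stand-in for a GWAS store: one table of patients with phenotypes,
# one table of genotype calls keyed by patient and SNP identifier (rsid).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE patient  (id INTEGER PRIMARY KEY, phenotype TEXT);
    CREATE TABLE genotype (patient_id INTEGER, rsid TEXT, call TEXT);
""")
db.executemany("INSERT INTO patient VALUES (?, ?)",
               [(1, "case"), (2, "control")])
db.executemany("INSERT INTO genotype VALUES (?, ?, ?)",
               [(1, "rs123", "AA"), (2, "rs123", "AG")])

# Merge genotype with phenotype and keep only cases -- the kind of
# patient/marker selection done before exporting to a standard GWAS format.
rows = db.execute("""
    SELECT g.rsid, g.call
    FROM genotype g JOIN patient p ON g.patient_id = p.id
    WHERE p.phenotype = 'case'
""").fetchall()
print(rows)  # [('rs123', 'AA')]
```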

  20. The MANAGE database: nutrient load and site characteristic updates and runoff concentration data.

    Science.gov (United States)

    Harmel, Daren; Qian, Song; Reckhow, Ken; Casebolt, Pamela

    2008-01-01

    The "Measured Annual Nutrient loads from AGricultural Environments" (MANAGE) database was developed to be a readily accessible, easily queried database of site characteristic and field-scale nutrient export data. The original version of MANAGE, which drew heavily from an early 1980s compilation of nutrient export data, created an electronic database with nutrient load data and corresponding site characteristics from 40 studies on agricultural (cultivated and pasture/range) land uses. In the current update, N and P load data from 15 additional studies of agricultural runoff were included along with N and P concentration data for all 55 studies. The database now contains 1677 watershed years of data for various agricultural land uses (703 for pasture/rangeland; 333 for corn; 291 for various crop rotations; 177 for wheat/oats; and 4-33 yr for barley, citrus, vegetables, sorghum, soybeans, cotton, fallow, and peanuts). Across all land uses, annual runoff loads averaged 14.2 kg ha(-1) for total N and 2.2 kg ha(-1) for total P. On average, these losses represented 10 to 25% of applied fertilizer N and 4 to 9% of applied fertilizer P. Although such statistics produce interesting generalities across a wide range of land use, management, and climatic conditions, regional crop-specific analyses should be conducted to guide regulatory and programmatic decisions. With this update, MANAGE contains data from a vast majority of published peer-reviewed N and P export studies on homogeneous agricultural land uses in the USA under natural rainfall-runoff conditions and thus provides necessary data for modeling and decision-making related to agricultural runoff. The current version can be downloaded at http://www.ars.usda.gov/spa/manage-nutrient.

  1. Health technology management: a database analysis as support of technology managers in hospitals.

    Science.gov (United States)

    Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi

    2011-01-01

    Technology management in healthcare must continually respond and adapt to improvements in medical equipment. Multidisciplinary approaches that consider the interaction of different technologies, their use and user skills are necessary in order to improve safety and quality. An easy and sustainable methodology is vital for Clinical Engineering (CE) services in healthcare organizations in order to define criteria for technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services, referring exclusively to the maintenance database of the CE department at the Careggi Hospital in Florence, Italy.

  2. Cloud database development and management

    CERN Document Server

    Chao, Lee

    2013-01-01

    Nowadays, cloud computing is almost everywhere. However, one can hardly find a textbook that utilizes cloud computing for teaching database and application development. This cloud-based database development book teaches both the theory and practice with step-by-step instructions and examples. This book helps readers to set up a cloud computing environment for teaching and learning database systems. The book will cover adequate conceptual content for students and IT professionals to gain necessary knowledge and hands-on skills to set up cloud based database systems.

  3. Pattern-based information portal for business plan co-creation

    Science.gov (United States)

    Bontchev, Boyan; Ruskov, Petko; Tanev, Stoyan

    2011-03-01

    Creating a business plan helps entrepreneurs identify business opportunities and commit the necessary resources as the venture evolves. Applying patterns in business plan creation facilitates the identification of effective solutions that were adopted in the past and may provide a basis for adopting similar solutions in the future within a given business context. The article presents the system design of an information portal for pattern-based business plan co-creation. The portal will provide start-ups and entrepreneurs with ready-to-modify business plan patterns to help them develop effective and efficient business plans, and will enable entrepreneurs to co-experiment and co-learn more frequently and faster. Moreover, the paper focuses on the software architecture of the pattern-based portal and explains the functionality of its modules, namely the pattern designer, the pattern repository services and the agent-based pattern implementers. It explains their role in business process co-creation, in storing and managing formally described patterns, and in selecting the patterns best suited to a specific business case. Thus, innovative entrepreneurs will be guided by the portal in co-writing winning business plans and staying competitive in the present-day dynamic globalized environment.

  4. Brasilia’s Database Administrators

    Directory of Open Access Journals (Sweden)

    Jane Adriana

    2016-06-01

    Database administration has gained an essential role in the management of new database technologies. Beyond the traditional relational database, different data models are being created to support enormous data volumes; these new models are called NoSQL (Not only SQL) databases. The adoption of best practices and procedures has become essential for the operation of database management systems. Thus, this paper investigates some of the techniques and tools used by database administrators. The study highlights features and particularities of databases within the area of Brasilia, the capital of Brazil. The results point to which new database management technologies are currently the most relevant, as well as to the central issues in this area.

  5. Magnetic Pair Creation Transparency in Pulsars

    Science.gov (United States)

    Story, Sarah; Baring, M. G.

    2013-04-01

    The Fermi gamma-ray pulsar database now exceeds 115 sources and has defined an important part of Fermi's science legacy, providing rich information for the interpretation of young energetic pulsars and old millisecond pulsars. Among the well established population characteristics is the common occurrence of exponential turnovers in the 1-10 GeV range. These turnovers are too gradual to arise from magnetic pair creation in the strong magnetic fields of pulsar inner magnetospheres, so their energy can be used to provide lower bounds to the typical altitude of GeV band emission. We explore such constraints due to single-photon pair creation transparency below the turnover energy. We adopt a semi-analytic approach, spanning both domains when general relativistic influences are important and locales where flat spacetime photon propagation is modified by rotational aberration effects. Our work clearly demonstrates that including near-threshold physics in the pair creation rate is essential to deriving accurate attenuation lengths. The altitude bounds, typically in the range of 2-6 neutron star radii, provide key information on the emission altitude in radio quiet pulsars that do not possess double-peaked pulse profiles. For the Crab pulsar, which emits pulsed radiation up to energies of 120 GeV, we obtain a lower bound of around 15 neutron star radii to its emission altitude.

  6. Database Optimizing Services

    Directory of Open Access Journals (Sweden)

    Adrian GHENCEA

    2010-12-01

    Almost every organization has a database at its centre. The database supports activities ranging from production, sales and marketing to internal operations, and is consulted every day to inform strategic decisions. Meeting such needs therefore demands high-quality security and availability, which can be achieved using a DBMS (Database Management System), i.e. the software that runs a database. Technically speaking, it is software that uses standard methods for cataloguing data, recovering it, and running queries against it. A DBMS manages the input data, organizes it, and provides ways for its users or other programs to modify or extract the data. Managing a database is an operation that requires periodic updating, optimizing and monitoring.

  7. Hazardous waste database: Waste management policy implications for the US Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    International Nuclear Information System (INIS)

    Lazaro, M.A.; Policastro, A.J.; Antonopoulos, A.A.; Hartmann, H.M.; Koebnick, B.; Dovel, M.; Stoll, P.W.

    1994-01-01

    The hazardous waste risk assessment modeling (HaWRAM) database is being developed to analyze the risk from treatment technology operations and potential transportation accidents associated with the hazardous waste management alternatives. These alternatives are being assessed in the Department of Energy's Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). To support the risk analysis, the current database contains complexwide detailed information on hazardous waste shipments from 45 Department of Energy installations during FY 1992. The database is currently being supplemented with newly acquired data. This enhancement will improve database information on operational hazardous waste generation rates, and the level and type of current on-site treatment at Department of Energy installations

  8. Role of Database Management Systems in Selected Engineering Institutions of Andhra Pradesh: An Analytical Survey

    Directory of Open Access Journals (Sweden)

    Kutty Kumar

    2016-06-01

    This paper aims to analyze the function of database management systems from the perspective of librarians working in engineering institutions in Andhra Pradesh. Ninety-eight librarians from one hundred thirty engineering institutions participated in the study. The paper reveals that training by computer suppliers and software packages are the main modes by which librarians acquire DBMS skills; three-fourths of the librarians are postgraduate degree holders. Most colleges use database applications for automation purposes and content value. Electrical problems and untrained staff appear to be the major constraints respondents face in managing library databases.

  9. Database to manage personal dosimetry Hospital Universitario de La Ribera

    International Nuclear Information System (INIS)

    Melchor, M.; Martinez, D.; Asensio, M.; Candela, F.; Camara, A.

    2011-01-01

    For the management of the dosimetry of occupationally exposed personnel, records of the issue and return of dosimeters are required. The Department of Radiophysics and Radiation Protection has designed and implemented a database for managing personnel dosimetry at the Hospital and the Area Health Centres. The specific objectives were to easily import dosimetry data from the National Dosimetry Centre, to allow simple consultation of dosimetry records, to handle rotating dosimeters, and to obtain reports over different periods of time covering return data by user, service, etc.

  10. Development of an integrated database management system to evaluate integrity of flawed components of nuclear power plant

    International Nuclear Information System (INIS)

    Mun, H. L.; Choi, S. N.; Jang, K. S.; Hong, S. Y.; Choi, J. B.; Kim, Y. J.

    2001-01-01

    The object of this paper is to develop an NPP-IDBMS (Integrated DataBase Management System for Nuclear Power Plants) for evaluating the integrity of nuclear power plant components using a relational data model. This paper describes the relational data model, structure and development strategy for the proposed NPP-IDBMS, which consists of a database, a database management system and an interface part. The database part consists of plant, shape, operating condition, material properties and stress databases, which are required for the integrity evaluation of each component in nuclear power plants. For the development of the stress database, extensive finite element analyses were performed for various components considering operational transients. The developed NPP-IDBMS will provide an efficient and accurate way to evaluate the integrity of flawed components.

  11. The Net Enabled Waste Management Database in the context of an indicator of sustainable development for radioactive waste management

    International Nuclear Information System (INIS)

    Csullog, G.W.; Selling, H.; Holmes, R.; Benitez, J.C.

    2002-01-01

    The IAEA was selected by the UN to be the lead agency for the development and implementation of indicators of sustainable development for radioactive waste management (ISD-RW). Starting in late 1999, the UN initiated a program to consolidate a large number of indicators into a smaller set and advised the IAEA that a single ISD-RW was needed. In September 2001, a single indicator was developed by the IAEA and subsequently revised in February 2002. In parallel with its work on the ISD-RW, the IAEA developed and implemented the Net Enabled Waste Management Database (NEWMDB). The NEWMDB is an international database to collect, compile and disseminate information about nationally-based radioactive waste management programmes and waste inventories. The first data collection cycle with the NEWMDB (July 2001 to March 2002) demonstrated that much of the information needed to calculate the ISD-RW could be collected by the IAEA for its international database. However, the first data collection cycle indicated that capacity building, in the area of identifying waste classification schemes used in countries, is required. (author)

  12. Development of database management system for monitoring of radiation workers for actinides

    International Nuclear Information System (INIS)

    Kalyane, G.N.; Mishra, L.; Nadar, M.Y.; Singh, I.S.; Rao, D.D.

    2012-01-01

    Annually, around 500 radiation workers from various divisions of Bhabha Atomic Research Centre (Trombay) and from the PREFRE and A3F facilities (Tarapur) are monitored for estimation of lung activities and internal dose due to Pu/Am and U in the lung counting laboratory located at the Bhabha Atomic Research Centre hospital, under routine and special monitoring programmes. A 20 cm diameter phoswich detector and an array of HPGe detectors are used for this purpose. In cases of positive contamination, workers are followed up and monitored using both detection systems in different geometries. Managing this volume of data is difficult, and therefore an easily retrievable database system containing all the relevant data of the monitored radiation workers was developed. Materials and methods: The database management system comprises three main modules integrated together: 1) an Apache server installed on a Windows (XP) platform (Apache version 2.2.17); 2) the MySQL database management system (MySQL version 5.5.8); 3) the PHP (PHP: Hypertext Preprocessor) programming language (PHP version 5.3.5). All three modules work together seamlessly as a single software program. Front-end user interaction is through a user-friendly and interactive local web page for which no internet connection is required. This front page has hyperlinks to many other pages offering different utilities to the user. The user has to log in using a username and password. Results and Conclusions: The database management system is used for entering, updating and managing the lung monitoring data of radiation workers. The program has the following utilities: bio-data entry for new subjects, editing of bio-data of old subjects (one subject at a time), entry of the day's lung monitoring counting data, retrieval of old records based on a number of parameters and filters such as date of counting, employee number, division, counts fulfilling a given criterion, etc., and calculation of MEQ CWT (Muscle Equivalent Chest Wall Thickness), energy
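
The record describes an Apache/MySQL/PHP stack; to illustrate the retrieval-with-filters utility it mentions, here is a minimal sketch in Python with the stdlib sqlite3 module standing in for MySQL. The table layout and field names are assumptions, not the laboratory's actual schema:

```python
import sqlite3

# Hypothetical lung-count log; real records would carry many more fields.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE lung_count (
    emp_no     TEXT,   -- employee number
    division   TEXT,
    counted_on TEXT,   -- ISO date of the counting session
    counts     REAL    -- net counts in the region of interest
)""")
db.executemany("INSERT INTO lung_count VALUES (?, ?, ?, ?)", [
    ("E101", "Radiochemistry", "2012-01-15", 120.0),
    ("E101", "Radiochemistry", "2012-06-20", 95.0),
    ("E202", "Fuel Cycle",     "2012-03-02", 310.0),
])

# Retrieve old records filtered by division and a counts criterion,
# as the abstract describes.
rows = db.execute("""
    SELECT emp_no, counted_on, counts FROM lung_count
    WHERE division = ? AND counts > ?
    ORDER BY counted_on
""", ("Radiochemistry", 100.0)).fetchall()
print(rows)  # [('E101', '2012-01-15', 120.0)]
```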

  13. The Net Enabled Waste Management Database as an international source of radioactive waste management information

    International Nuclear Information System (INIS)

    Csullog, G.W.; Friedrich, V.; Miaw, S.T.W.; Tonkay, D.; Petoe, A.

    2002-01-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an integral part of the IAEA's policies and strategy related to the collection and dissemination of information, both internal to the IAEA in support of its activities and external to the IAEA (publicly available). The paper highlights the NEWMDB's role in relation to the routine reporting of status and trends in radioactive waste management, in assessing the development and implementation of national systems for radioactive waste management, in support of a newly developed indicator of sustainable development for radioactive waste management, in support of reporting requirements for the Joint Convention on the Safety of Spent Fuel Management and on the Safety of Radioactive Waste Management, in support of IAEA activities related to the harmonization of waste management information at the national and international levels and in relation to the management of spent/disused sealed radioactive sources. (author)

  14. The Framework of Knowledge Creation for Online Learning Environments

    Science.gov (United States)

    Huang, Hsiu­-Mei; Liaw, Shu­-Sheng

    2004-01-01

    In today's competitive global economy characterized by knowledge acquisition, the concept of knowledge management has become increasingly prevalent in academic and business practices. Knowledge creation is an important factor and remains a source of competitive advantage over knowledge management. Information technology facilitates knowledge…

  15. Selecting a Relational Database Management System for Library Automation Systems.

    Science.gov (United States)

    Shekhel, Alex; O'Brien, Mike

    1989-01-01

    Describes the evaluation of four relational database management systems (RDBMSs) (Informix Turbo, Oracle 6.0 TPS, Unify 2000 and Relational Technology's Ingres 5.0) to determine which is best suited for library automation. The evaluation criteria used to develop a benchmark specifically designed to test RDBMSs for libraries are discussed. (CLB)

  16. A Model for Sustainable Value Creation in Supply Chain

    OpenAIRE

    KORDİTABAR, Seyed Behzad

    2015-01-01

    In order to survive, every company needs to achieve sustainable profitability, which is impossible without sustainable value creation. Given that sustainability is closely related to concepts of supply chain management, the present paper proposes, through a conceptual theorization approach, a new comprehensive model drawing on concepts of value creation and sustainability from the perspective of the supply chain, specifying the dimensions contributing to s...

  17. Management of radiological related equipments. Creating the equipment management database and analysis of the repair and maintenance records

    International Nuclear Information System (INIS)

    Eguchi, Megumu; Taguchi, Keiichi; Oota, Takashi; Kajiwara, Hiroki; Ono, Kiyotune; Hagio, Kiyofumi; Uesugi, Ekizo; Kajishima, Tetuo; Ueda, Kenji

    2002-01-01

    In 1997, we established a committee for equipment maintenance and management in our department. We designed a database to classify and register all the radiology-related equipment using Microsoft Access. Managing the condition and cost of each piece of equipment has become easier by keeping the database in the equipment management ledger and by filing the history of repairs and maintenance for each modality. We then tallied the number of repairs, repair costs and downtimes from four years of repair and maintenance records, and re-examined the causal analysis of failures and the content of the regular maintenance for the CT and MRI equipment, which had shown the highest numbers of repairs. Consequently, we have found ways to improve the data registration method and to use the repair budget more economically. (author)
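
The abstract reports tallying repair counts, costs and downtimes per equipment item from the maintenance records. A minimal sketch of that aggregation in Python, with entirely hypothetical log entries:

```python
from collections import defaultdict

# Hypothetical repair log; the actual Access database fields are not given,
# but the abstract says repairs were tallied by count, cost and downtime.
repairs = [
    {"equipment": "CT-1",  "cost": 5200, "downtime_h": 16},
    {"equipment": "MRI-1", "cost": 8900, "downtime_h": 40},
    {"equipment": "CT-1",  "cost": 1300, "downtime_h": 4},
]

# Aggregate per equipment item.
summary = defaultdict(lambda: {"repairs": 0, "cost": 0, "downtime_h": 0})
for r in repairs:
    s = summary[r["equipment"]]
    s["repairs"] += 1
    s["cost"] += r["cost"]
    s["downtime_h"] += r["downtime_h"]

# The item with the most repairs gets priority in the maintenance review.
worst = max(summary, key=lambda k: summary[k]["repairs"])
print(worst, summary[worst])  # CT-1 {'repairs': 2, 'cost': 6500, 'downtime_h': 20}
```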

  18. Information flow in the DAMA project beyond database managers: information flow managers

    Science.gov (United States)

    Russell, Lucian; Wolfson, Ouri; Yu, Clement

    1996-12-01

    To meet the demands of commercial data traffic on the information highway, a new look at managing data is necessary. One projected activity, sharing of point of sale information, is being considered in the Demand Activated Manufacturing Project (DAMA) of the American Textile Partnership (AMTEX) project. A scenario is examined in which 100 000 retail outlets communicate over a period of days. They provide the latest estimate of demand for sewn products across a chain of 26 000 suppliers through the use of bill of materials explosions at four levels of detail. Enabling this communication requires an approach that shares common features with both workflows and database management. A new paradigm, the information flow manager, is developed to handle this situation, including the case where members of the supply chain fail to communicate and go out of business. Techniques for approximation are introduced so as to keep estimates of demand as current as possible.
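The bill-of-materials explosion at the heart of the DAMA scenario can be sketched briefly; the items, quantities and chain below are invented for illustration, not taken from the project:

```python
# Point-of-sale demand for a finished sewn product is propagated down
# the supply chain, multiplying quantities at each BOM level.
from collections import defaultdict

# bom[item] -> list of (component, quantity per unit of item)
bom = {
    "shirt":    [("fabric_m", 1.8), ("buttons", 8)],
    "fabric_m": [("yarn_kg", 0.2)],
    "yarn_kg":  [("fiber_kg", 1.05)],
}

def explode(demand):
    """Propagate top-level demand through every BOM level."""
    totals = defaultdict(float)
    stack = list(demand.items())
    while stack:
        item, qty = stack.pop()
        totals[item] += qty
        for comp, per_unit in bom.get(item, []):
            stack.append((comp, qty * per_unit))
    return dict(totals)

needs = explode({"shirt": 1000})   # latest retail demand estimate
print(needs)
```

In the DAMA setting this computation runs at four levels of detail across 26 000 suppliers, which is why an information flow manager, rather than a plain database query, is needed to keep the estimates current.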

  19. SOFTWARE COMPLEX FOR CREATION AND ACCUMULATION OF MODERN LEARNING MATERIALS

    Directory of Open Access Journals (Sweden)

    V. V. Polinovskyi

    2010-08-01

    Full Text Available The article analyzes the weaknesses of existing tools for creating lecture materials and proposes a new modular complex that supports different types of lecture materials, templates and interactive elements, includes a lecture-material database with search, sorting and grouping capabilities, and can be used to create lecture courses for distance learning as well as interactive lectures for full-time courses.

  20. Value creation in industrial heritage management. Evidence from the City of Paper (Fabriano, Italy

    Directory of Open Access Journals (Sweden)

    Mara Cerquetti

    2017-12-01

    Full Text Available The paper discusses the open, inclusive, dynamic, proactive notion of cultural heritage that is emerging in the international scientific debate. Some significant innovations are examined first: the overcoming of the dualism between tangible and intangible cultural heritage, the increasing role of local communities in the processes of heritage recognition, safeguarding and enhancement and the need for valorisation as a democratic mandate. Aiming at developing this approach, the second step of the research focuses on industrial heritage, investigating its specific features and values. A case study is provided in order to understand some crucial issues concerning industrial heritage management and value creation. Focusing on the City of Paper (Fabriano, Italy, the activities carried out by the Museum of Paper and Watermark and by the Institute of Paper History Gianfranco Fedrigoni (ISTOCARTA are analysed in-depth, highlighting the role of collaboration among the different actors involved in industrial heritage management in order to promote sustainable local development.

  1. Documentation of databases in the Wilmar Planning tool

    International Nuclear Information System (INIS)

    Kiviluioma, J.; Meimbom, P.

    2006-01-01

    The Wilmar Planning tool consists of a number of databases and models as shown in Figure 1. This report documents the design of the following subparts of the Wilmar Planning tool: 1. The Scenario database holding the scenario trees generated from the Scenario Tree Creation model. 2. The Input database holding input data to the Joint Market model and the Long-term model apart from the scenario trees. 3. The output database containing the results of a Joint Market model run. The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (contract ENK5-CT-2002-00663). (LN)

  2. Report on the first Twente Data Management Workshop on XML Databases and Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Mihajlovic, V.

    2004-01-01

    The Database Group of the University of Twente initiated a new series of workshops called Twente Data Management workshops (TDM), starting with one on XML Databases and Information Retrieval which took place on 21 June 2004 at the University of Twente. We have set ourselves two goals for the

  3. Database Foundation For The Configuration Management Of The CERN Accelerator Controls Systems

    CERN Document Server

    Zaharieva, Z; Peryt, M

    2011-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years in order to become nowadays the basis for the Configuration Management of the Controls System for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the Controls System. The configuration items are quite heterogeneous, depicting different areas of the Controls System – ranging from 3000 Front-End Computers, 75 000 software devices allowing remote control of the accelerators, to valid states of the Accelerators Timing System. The article will describe the different areas of the CCDB, their interdependencies and the challenges to establish the data model for such a diverse configuration management database, serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering of change management processes as well as providing status accounting and aud...

  4. Carbon information disclosure of enterprises and their value creation through market liquidity and cost of equity capital

    Directory of Open Access Journals (Sweden)

    Li Li

    2015-01-01

    Full Text Available Purpose: Drawing on asymmetric information and stakeholder theories, this paper investigates two mechanisms, namely market liquidity and cost of equity capital, by which the carbon information disclosure of enterprises can benefit their value creation. Design/methodology/approach: In this research, web crawler technology is employed to study the link between carbon information disclosure and enterprise value creation; the carbon information data are provided by all companies listed on the Chinese A-share market. Findings: The results show that carbon information disclosure has a significant positive influence on enterprise value creation, embodied in the relationship between the quantity and depth of carbon information disclosure and enterprise value creation; market liquidity and cost of equity capital play a partially mediating role, while the influence of the quality and concentration of carbon information disclosure on enterprise value creation is not statistically significant. Research limitations/implications: This paper explains in depth the path and mechanism by which carbon information disclosure influences enterprise value creation, answering the question of whether carbon information disclosure affects enterprise value creation in China. Practical implications: The finding that carbon information disclosure contributes positively to enterprise value creation suggests that managers can reap more financial benefits by disclosing more carbon information and investing in carbon emissions management, so managers should strengthen the management of carbon information disclosure behavior. Originality/value: The paper gives a different perspective on the influence of carbon information disclosure on enterprise value creation, and suggests a new direction for understanding carbon information disclosure behavior.

  5. Advanced evacuation model managed through fuzzy logic during an accident in LNG terminal

    Energy Technology Data Exchange (ETDEWEB)

    Stankovicj, Goran; Petelin, Stojan [Faculty for Maritime Studies and Transport, University of Ljubljana, Portoroz (Slovenia)]; and others

    2014-07-01

    Evacuation of people located inside the enclosed area of an LNG terminal is a complex problem, especially considering that accidents involving LNG are potentially very hazardous. To create an evacuation model managed through fuzzy logic, extensive input must be drawn from safety analyses. A key step in the optimal functioning of the evacuation model is the creation of a database incorporating all input indicators. The output result is a safe evacuation route that is activated at the moment of the accident. (Author)
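The authors' fuzzy rule base is not published; a minimal sketch of the general idea, with invented indicators (gas concentration near a route and crowd density on it) and an invented two-rule system, could look like this:

```python
# Each candidate evacuation route is scored by fuzzifying two database
# indicators and combining them with simple Mamdani-style rules; the
# route with the highest defuzzified safety score is activated.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def safety_score(gas_ppm, density):
    low_gas   = tri(gas_ppm, -1, 0, 50)
    high_gas  = tri(gas_ppm, 30, 100, 101)
    low_dens  = tri(density, -1, 0, 4)
    high_dens = tri(density, 2, 6, 7)
    # Rule 1: IF gas low AND density low  THEN route safe
    # Rule 2: IF gas high OR density high THEN route unsafe
    w_safe   = min(low_gas, low_dens)
    w_unsafe = max(high_gas, high_dens)
    if w_safe + w_unsafe == 0:
        return 0.5                     # no rule fires: indifferent
    return w_safe / (w_safe + w_unsafe)

routes = {"A": (10, 1), "B": (80, 1), "C": (10, 5)}
best = max(routes, key=lambda r: safety_score(*routes[r]))
print(best)  # route A: low gas and low crowding
```

The membership breakpoints here are arbitrary; in a real model they would come from the safety analyses feeding the indicator database.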

  6. A framework for cross-observatory volcanological database management

    Science.gov (United States)

    Aliotta, Marco Antonio; Amore, Mauro; Cannavò, Flavio; Cassisi, Carmelo; D'Agostino, Marcello; Dolce, Mario; Mastrolia, Andrea; Mangiagli, Salvatore; Messina, Giuseppe; Montalto, Placido; Fabio Pisciotta, Antonino; Prestifilippo, Michele; Rossi, Massimo; Scarpato, Giovanni; Torrisi, Orazio

    2017-04-01

    In recent years, it has been clearly shown that the multiparametric approach is the winning strategy for investigating the complex dynamics of volcanic systems. This involves the use of different sensor networks, each dedicated to the acquisition of particular data useful for research and monitoring. The increasing interest devoted to the study of volcanological phenomena has led to the constitution of different research organizations or observatories, sometimes covering the same volcanoes, which acquire large amounts of data from sensor networks for multiparametric monitoring. At INGV we developed a framework, hereinafter called TSDSystem (Time Series Database System), which acquires data streams from several geophysical and geochemical permanent sensor networks (also represented by different data sources such as ASCII, ODBC, URL, etc.) located in the main volcanic areas of Southern Italy, and relates them within a relational database management system. Furthermore, spatial data related to the different datasets are managed using a GIS module for sharing and visualization purposes. The standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them on a common space and time scale. In order to share data between INGV observatories, and also with the Civil Protection, whose activity concerns the same volcanic districts, we designed a "Master View" system that, starting from a number of instances of the TSDSystem framework (one for each observatory), makes possible the joint interrogation of data, both temporal and spatial, on instances located in different observatories, through the use of web services technology (RESTful, SOAP). Similarly, it provides metadata for equipment using standard schemas (such as FDSN StationXML). The "Master View" is also responsible for managing the data policy through a "who owns what" system, which allows you to associate viewing/download of
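The synchronization onto a common time scale that TSDSystem's standardization enables can be sketched as bucket resampling; the series, rates and bucket width below are illustrative, not the framework's actual algorithm:

```python
# Two series sampled at different, irregular rates are resampled onto a
# common time grid by averaging the samples that fall in each bucket.
def resample(series, t0, t1, step):
    """series: list of (timestamp_s, value) -> one mean value per bucket."""
    buckets = [[] for _ in range((t1 - t0) // step)]
    for t, v in series:
        if t0 <= t < t1:
            buckets[(t - t0) // step].append(v)
    return [sum(b) / len(b) if b else None for b in buckets]

seismic = [(0, 1.0), (2, 3.0), (5, 5.0), (9, 7.0)]   # irregular sampling
geochem = [(1, 10.0), (6, 14.0)]                      # sparser series
grid_a = resample(seismic, 0, 10, 5)
grid_b = resample(geochem, 0, 10, 5)
print(grid_a, grid_b)  # both now comparable on a common 5 s scale
```

Once every measure lives on the same grid, cross-network query and joint visualization reduce to index-aligned operations.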

  7. Database And Interface Modifications: Change Management Without Affecting The Clients

    CERN Document Server

    Peryt, M; Martin Marquez, M; Zaharieva, Z

    2011-01-01

    The first Oracle®-based Controls Configuration Database (CCDB) was developed in 1986, by which the controls system of CERN’s Proton Synchrotron became data-driven. Since then, this mission-critical system has evolved tremendously going through several generational changes in terms of the increasing complexity of the control system, software technologies and data models. Today, the CCDB covers the whole CERN accelerator complex and satisfies a much wider range of functional requirements. Despite its online usage, everyday operations of the machines must not be disrupted. This paper describes our approach with respect to dealing with change while ensuring continuity. How do we manage the database schema changes? How do we take advantage of the latest web deployed application development frameworks without alienating the users? How do we minimize impact on the dependent systems connected to databases through various APIs? In this paper we will provide our answers to these questions, and to many more.
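The paper does not spell out its schema-change technique, but one classic way to evolve a schema without touching clients is to put a stable view between them and the tables; a sketch in SQLite (table and device names invented):

```python
import sqlite3

# Clients query a stable view while the underlying table is restructured.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE devices_v1 (name TEXT, fec TEXT);            -- original schema
INSERT INTO devices_v1 VALUES ('BPM.10', 'cfv-101');
CREATE VIEW devices AS SELECT name, fec FROM devices_v1;  -- client contract
""")

def client_read():
    # Clients only ever see the 'devices' view.
    return db.execute("SELECT name, fec FROM devices").fetchall()

before = client_read()

# Schema evolution: normalize the FEC into its own table,
# then recreate the view so its contract is unchanged.
db.executescript("""
CREATE TABLE devices_v2 (name TEXT, fec_id INTEGER);
CREATE TABLE fecs (id INTEGER PRIMARY KEY, hostname TEXT);
INSERT INTO fecs VALUES (1, 'cfv-101');
INSERT INTO devices_v2 VALUES ('BPM.10', 1);
DROP VIEW devices;
CREATE VIEW devices AS
    SELECT d.name AS name, f.hostname AS fec
    FROM devices_v2 d JOIN fecs f ON f.id = d.fec_id;
""")

after = client_read()
print(before == after)  # client sees the same result either way
```

The view acts as the API: the tables beneath it can be split, renamed or reindexed in a maintenance window while every dependent system keeps issuing the same query.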

  8. Building a recruitment database for asthma trials: a conceptual framework for the creation of the UK Database of Asthma Research Volunteers.

    Science.gov (United States)

    Nwaru, Bright I; Soyiri, Ireneous N; Simpson, Colin R; Griffiths, Chris; Sheikh, Aziz

    2016-05-26

    Randomised clinical trials are the 'gold standard' for evaluating the effectiveness of healthcare interventions. However, successful recruitment of participants remains a key challenge for many trialists. In this paper, we present a conceptual framework for creating a digital, population-based database for the recruitment of asthma patients into future asthma trials in the UK. Having set up the database, the goal is to then make it available to support investigators planning asthma clinical trials. The UK Database of Asthma Research Volunteers will comprise a web-based front-end that interactively allows participant registration, and a back-end that houses the database containing participants' key relevant data. The database will be hosted and maintained at a secure server at the Asthma UK Centre for Applied Research based at The University of Edinburgh. Using a range of invitation strategies, key demographic and clinical data will be collected from those pre-consenting to consider participation in clinical trials. These data will, with consent, in due course, be linkable to other healthcare, social, economic, and genetic datasets. To use the database, asthma investigators will send their eligibility criteria for participant recruitment; eligible participants will then be informed about the new trial and asked if they wish to participate. A steering committee will oversee the running of the database, including approval of usage access. Novel communication strategies will be utilised to engage participants who are recruited into the database in order to avoid attrition as a result of waiting time to participation in a suitable trial, and to minimise the risk of their being approached when already enrolled in a trial. The value of this database will be whether it proves useful and usable to researchers in facilitating recruitment into clinical trials on asthma and whether patient privacy and data security are protected in meeting this aim. 
Successful recruitment is
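The core service of such a database, screening registered volunteers against a trial's eligibility criteria while skipping anyone already enrolled, might be sketched as follows; every field and criterion name here is invented:

```python
# Pre-consented volunteers, as captured by the web front-end.
volunteers = [
    {"id": 1, "age": 34, "severity": "moderate", "smoker": False, "in_trial": False},
    {"id": 2, "age": 61, "severity": "severe",   "smoker": True,  "in_trial": False},
    {"id": 3, "age": 29, "severity": "moderate", "smoker": False, "in_trial": True},
]

def eligible(criteria):
    """Return ids of volunteers matching every criterion and not already
    enrolled in a trial (the attrition safeguard described above)."""
    out = []
    for v in volunteers:
        if v["in_trial"]:
            continue                      # never approach enrolled people
        if not (criteria["min_age"] <= v["age"] <= criteria["max_age"]):
            continue
        if v["severity"] not in criteria["severity"]:
            continue
        if criteria["non_smokers_only"] and v["smoker"]:
            continue
        out.append(v["id"])
    return out

trial = {"min_age": 18, "max_age": 65,
         "severity": {"moderate", "severe"}, "non_smokers_only": True}
print(eligible(trial))  # only volunteer 1 is approachable and matches
```

Investigators would submit the `trial` criteria; the steering committee's approval and the subsequent invitation step sit outside this matching core.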

  9. Are Managed Futures Indices Telling Truth? Biases in CTA Databases and Proposals of Potential Enhancements

    Directory of Open Access Journals (Sweden)

    Adam Zaremba

    2011-07-01

    Full Text Available Managed futures are an alternative asset class that has recently become considerably popular in the investment industry. However, owing to its characteristics, access to historical performance statistics for managed futures is relatively confined. All available information originates from commercial and academic databases, reporting to which is entirely voluntary. This situation results in a series of biases that distort managed futures performance in the eyes of investors. The paper consists of two parts. First, the author reviews and describes the various biases that influence the reliability of managed futures indices and databases. The second section encompasses the author's proposals for potential enhancements, which aim to reduce the impact of the biases in order to derive a benchmark that better reflects the characteristics of a managed futures investment from the point of view of a potential investor.

  10. A Spatio-Temporal Building Exposure Database and Information Life-Cycle Management Solution

    Directory of Open Access Journals (Sweden)

    Marc Wieland

    2017-04-01

    Full Text Available With an ever-increasing volume and complexity of data collected from a variety of sources, the efficient management of geospatial information becomes a key topic in disaster risk management. For example, the representation of assets exposed to natural disasters is subjected to changes throughout the different phases of risk management reaching from pre-disaster mitigation to the response after an event and the long-term recovery of affected assets. Spatio-temporal changes need to be integrated into a sound conceptual and technological framework able to deal with data coming from different sources, at varying scales, and changing in space and time. Especially managing the information life-cycle, the integration of heterogeneous information and the distributed versioning and release of geospatial information are important topics that need to become essential parts of modern exposure modelling solutions. The main purpose of this study is to provide a conceptual and technological framework to tackle the requirements implied by disaster risk management for describing exposed assets in space and time. An information life-cycle management solution is proposed, based on a relational spatio-temporal database model coupled with Git and GeoGig repositories for distributed versioning. Two application scenarios focusing on the modelling of residential building stocks are presented to show the capabilities of the implemented solution. A prototype database model is shared on GitHub along with the necessary scenario data.
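The valid-time idea behind such a spatio-temporal exposure model can be sketched with a small relational table; the schema and the building history below are invented for illustration, not the study's shared prototype:

```python
import sqlite3

# Each row describes a building during one period of its life cycle, so
# the pre-disaster, response and recovery states can all be queried.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE building (
    bid INTEGER,
    state TEXT,              -- e.g. 'intact', 'damaged', 'retrofitted'
    valid_from TEXT,
    valid_to TEXT)           -- '9999-12-31' means still current
""")
db.executemany("INSERT INTO building VALUES (?, ?, ?, ?)", [
    (1, "intact",      "2000-01-01", "2016-08-24"),
    (1, "damaged",     "2016-08-24", "2018-05-01"),
    (1, "retrofitted", "2018-05-01", "9999-12-31"),
])

def state_as_of(bid, date):
    """State of a building as it was on a given date."""
    return db.execute(
        "SELECT state FROM building "
        "WHERE bid=? AND valid_from<=? AND ?<valid_to",
        (bid, date, date)).fetchone()[0]

print(state_as_of(1, "2016-09-01"))  # response phase: 'damaged'
print(state_as_of(1, "2020-01-01"))  # recovery phase: 'retrofitted'
```

Closing a row's validity interval instead of overwriting it is what makes the information life cycle queryable; the Git/GeoGig layer described above then versions whole releases of such tables.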

  11. Role of Waste Management in Wealth Creation in Nigeria ...

    African Journals Online (AJOL)

    The concept of entrepreneurship as it relates to waste-to-wealth through private sector participation (PSP) franchises is considered to have helped the government create jobs and new businesses for many in contemporary economies. This study essentially tries to assess whether PSP franchise operators aid in the creation of jobs ...

  12. Creationism in Europe

    DEFF Research Database (Denmark)

    For decades, the creationist movement was primarily situated in the United States. Then, in the 1970s, American creationists found their ideas welcomed abroad, first in Australia and New Zealand, then in Korea, India, South Africa, Brazil, and elsewhere—including Europe, where creationism plays an expanding role in public debates about science policy and school curricula. In this, the first comprehensive history of creationism in Europe, leading historians, philosophers, and scientists narrate the rise of—and response to—scientific creationism, creation science, intelligent design, and organized antievolutionism in countries and religions throughout Europe. Providing a unique map of creationism in Europe, the authors chart the surprising history of creationist activities and strategies there. Over the past forty years, creationism has spread swiftly among European Catholics, Protestants, Jews, Hindus...

  13. Optimized Database of Higher Education Management Using Data Warehouse

    Directory of Open Access Journals (Sweden)

    Spits Warnars

    2010-04-01

    Full Text Available The emergence of new higher education institutions has created competition in the higher education market, and a data warehouse can be used as an effective technology tool for increasing competitiveness in that market. A data warehouse produces reliable reports for the institution's high-level management in a short time, enabling faster and better decision making, not only for increasing the number of admitted students but also for finding extraordinary, unconventional funds for the institution. The efficiency comparison was based on the length and number of processed records, total processed bytes, number of processed tables, time to run a query, and records produced on the OLTP database and the data warehouse. Efficiency percentages were measured with the formula for percentage increase, and the average efficiency percentage of 461,801.04% shows that using a data warehouse is more powerful and efficient than using the OLTP database. The data warehouse was modelled as a hypercube derived from the limited set of high-demand reports usually used by high-level management. Fields representing the constructive-merge load are inserted into every fact and dimension table, and the ETL (Extraction, Transformation and Loading) process is run on the old and new files.
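A star schema of the kind underlying such a hypercube can be sketched in a few lines; the faculties, years and admission figures below are invented, and SQLite stands in for the institution's warehouse engine:

```python
import sqlite3

# Minimal star schema: one fact table surrounded by dimension tables,
# answering a typical high-demand report (admissions per faculty per year).
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_faculty (fid INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_year    (yid INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_admission (fid INTEGER, yid INTEGER, students INTEGER);
INSERT INTO dim_faculty VALUES (1,'Engineering'),(2,'Economics');
INSERT INTO dim_year    VALUES (1,2008),(2,2009);
INSERT INTO fact_admission VALUES (1,1,120),(1,2,150),(2,1,90),(2,2,80);
""")
report = db.execute("""
    SELECT f.name, y.year, SUM(a.students)
    FROM fact_admission a
    JOIN dim_faculty f ON f.fid = a.fid
    JOIN dim_year    y ON y.yid = a.yid
    GROUP BY f.name, y.year ORDER BY f.name, y.year
""").fetchall()
print(report)
```

Because the fact table is pre-joined to small dimensions rather than to the full OLTP schema, the report is one grouped scan, which is where the efficiency gain over the operational database comes from.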

  14. Development of Human Face Literature Database Using Text Mining Approach: Phase I.

    Science.gov (United States)

    Kaur, Paramjit; Krishan, Kewal; Sharma, Suresh K

    2018-06-01

    The face is an important part of the human body by which an individual communicates in society. Its importance is highlighted by the fact that a person deprived of a face cannot sustain life in the living world. The number of experiments being performed and research papers being published in the domain of the human face has surged in the past few decades. Several scientific disciplines conduct research on the human face, including Medical Science, Anthropology, Information Technology (Biometrics, Robotics, Artificial Intelligence, etc.), Psychology, Forensic Science, and Neuroscience. This highlights the need to collect and manage the data concerning the human face so that free public access to it can be provided to the scientific community. This can be attained by developing databases and tools on the human face using a bioinformatics approach. The current research emphasizes creating a database of the literature on the human face. The database can be accessed on the basis of specific keywords, journal name, date of publication, author's name, etc. The collected research papers are stored in the form of a database, which will benefit the research community, as comprehensive information dedicated to the human face can be found in one place. Information related to facial morphologic features, facial disorders, facial asymmetry, facial abnormalities, and many other parameters can be extracted from this database. The front end has been developed using Hyper Text Markup Language and Cascading Style Sheets, and the back end using the hypertext preprocessor (PHP), with JavaScript as the scripting language. MySQL (Structured Query Language) is used for database development, as it is the most widely used Relational Database Management System. 
    XAMPP (X (cross platform), Apache, MySQL, PHP, Perl) open-source web application software has been used as the server. The database is still under the
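The retrieval the abstract describes, lookup by keyword, journal, author or year, reduces to filtered queries over a papers table; the schema and sample records below are invented (the authors used MySQL, SQLite stands in here):

```python
import sqlite3

# A literature table and a small composable search over it.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE paper (
    title TEXT, authors TEXT, journal TEXT, year INTEGER, keywords TEXT)""")
db.executemany("INSERT INTO paper VALUES (?,?,?,?,?)", [
    ("Facial asymmetry in adults", "Kaur P", "J Anat", 2016,
     "facial asymmetry;morphology"),
    ("Face recognition survey", "Sharma S", "IEEE TPAMI", 2017,
     "biometrics;recognition"),
])

def search(keyword=None, journal=None):
    """Combine whichever filters the user supplied into one query."""
    q, args = "SELECT title FROM paper WHERE 1=1", []
    if keyword:
        q += " AND keywords LIKE ?"
        args.append(f"%{keyword}%")
    if journal:
        q += " AND journal = ?"
        args.append(journal)
    return [r[0] for r in db.execute(q, args)]

print(search(keyword="asymmetry"))   # papers tagged with facial asymmetry
```

In the deployed PHP/MySQL stack the same pattern would run server-side, with the HTML front end supplying the filter values.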

  15. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    Science.gov (United States)

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  16. FUNCTIONAL MODEL OF THE MATERIAL RESOURCES MANAGEMENT FOR PROJECTS OF THE CREATION OF NEW TECHNIQUES

    Directory of Open Access Journals (Sweden)

    S. Yu. Danshyna

    2016-01-01

    Full Text Available The article is devoted to the problem of materials management arising in the implementation of projects for the development and creation (modernization) of new techniques. The uniqueness of such projects and their cost and time limits do not allow traditional approaches to resource management to be used. Such projects are often implemented in developing companies, where it is not possible to abandon the traditional operational methods of management. The aim of the article is to formalize the materials management process of projects and to describe its information flows, in order to integrate it into project management practice and to improve the efficiency of materials management. To systematize the information arising in materials management, a set-theoretic representation of the management process is proposed. In accordance with the requirements of project management standards, the sets were described and the rules for their transformation defined. Specifying the set-theoretic representation helped to establish the scope and limits of the modelled process. Further decomposition of the process became the basis of a functional model constructed in accordance with the IDEF0 methodology. A graphical representation of the model allows the process to be visualized at different levels of detail. To specify issues related to the organization and promotion of material flow, functional models of the sub-processes were developed and the identified data flows described. To harmonize the process and project approaches, conditions for evaluating the efficiency of materials management are formulated. The developed models can be the basis for designing company structures, for regulating their project activities, and for establishing an information system for managing project resources.

  17. The role of nature-conformity presentation of data in the creation of an information system for the management of science and education

    Directory of Open Access Journals (Sweden)

    Sergey A. Saltykov

    2017-01-01

    Full Text Available The article determines the role of nature-conformity, broadly interpreted, in the presentation of open data when creating an information system for the management of science and education. The principle of nature-conformity is defined here as the genesis and development of systems according to their own internal (immanent, natural and/or cultural) nature and their external, surrounding socio-cultural and natural-biological nature. In this context, the novelty of the research lies in developing the presentation of open data, a parameter of great importance for the modern era, in the creation of information systems. The paper is also unique in developing technical requirements and in exploring the possibility of populating the developed information management system for science and education with open data. The article outlines the prospects for the practical use of such a system. It is emphasized that the use of open data will make it possible to integrate all the developed models, tools and principles and to create a modern Russian information management system for science and education in accordance with the principle of prudence of forming systems. The research covered the following: a structural and semantic analysis of the concept of "open data"; examples of successful work with open data by various organizations (state, commercial, banking, etc.); an analysis of some provisions of the state strategy for the scientific and technical development of Russia; and requirements for an information system of expert-textual analysis of scientific and educational research. The research showed that the role of nature-conformity in the presentation of open data in the creation of such an information management system is great and continues to grow in connection with the development

  18. Microcomputer Database Management Systems that Interface with Online Public Access Catalogs.

    Science.gov (United States)

    Rice, James

    1988-01-01

    Describes a study that assessed the availability and use of microcomputer database management interfaces to online public access catalogs. The software capabilities needed to effect such an interface are identified, and available software packages are evaluated by these criteria. A directory of software vendors is provided. (4 notes with…

  19. A survey of the use of database management systems in accelerator projects

    CERN Document Server

    Poole, John

    1995-01-01

    The International Accelerator Database Group (IADBG) was set up in 1994 to bring together the people who are working with databases in accelerator laboratories so that they can exchange information and experience. The group now has members from more than 20 institutes from all around the world, representing nearly double this number of projects. This paper is based on the information gathered by the IADBG and describes why commercial DataBase Management Systems (DBMS) are being used in accelerator projects and what they are being used for. Initially introduced to handle equipment builders' data, commercial DBMS are now being used in almost all areas of accelerators from on-line control to personnel data. A variety of commercial systems are being used in conjunction with a diverse selection of application software for data maintenance/manipulation and controls. This paper reviews the database activities known to IADBG.

  20. Modelling a critical infrastructure-driven spatial database for proactive disaster management: A developing country context

    Directory of Open Access Journals (Sweden)

    David O. Baloye

    2016-04-01

    Full Text Available The understanding and institutionalisation of the seamless link between urban critical infrastructure and disaster management has greatly helped the developed world to establish effective disaster management processes. However, this link is conspicuously missing in developing countries, where disaster management has been more reactive than proactive. The consequence of this is typified in poor response time and uncoordinated ways in which disasters and emergency situations are handled. As is the case with many Nigerian cities, the challenges of urban development in the city of Abeokuta have limited the effectiveness of disaster and emergency first responders and managers. Using geospatial techniques, the study attempted to design and deploy a spatial database running a web-based information system to track the characteristics and distribution of critical infrastructure for effective use during disaster and emergencies, with the purpose of proactively improving disaster and emergency management processes in Abeokuta. Keywords: Disaster Management; Emergency; Critical Infrastructure; Geospatial Database; Developing Countries; Nigeria
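One query a critical-infrastructure spatial database must answer during a response is "which facility is nearest the incident?"; a sketch using the haversine great-circle distance, with invented facility names and coordinates (not the study's actual data):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical hospitals around a city, as (lat, lon) pairs.
facilities = {
    "Hospital A": (7.161, 3.348),
    "Hospital B": (7.139, 3.362),
    "Hospital C": (7.116, 3.353),
}
incident = (7.150, 3.345)
nearest = min(facilities,
              key=lambda f: haversine_km(*incident, *facilities[f]))
print(nearest)
```

A production geospatial database would answer this with a spatial index rather than a linear scan, but the distance computation is the same.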

  1. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    Science.gov (United States)

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  2. Toward Value Co-Creation: Increasing Women’s Presence in Management Positions through Competition against a Set Target

    Directory of Open Access Journals (Sweden)

    Irene Comeig

    2017-10-01

    Full Text Available Despite empirical evidence that women’s presence in management positions is a source of value co-creation for firms, these positions are still male-dominated. Some evidence from experimental economics suggests that one reason for this imbalance is that women shy away from competition. However, most of these studies have focused on competition systems that pit individuals against each other. We present an economic laboratory experiment that compares competition against others with competition against a set target. The crucial difference is that whereas the former involves competing against opponents, the latter does not. Our results show that significantly more women are willing to compete against a set target than against others. Furthermore, there is no reduction in men’s participation and no general efficiency reduction. Our findings suggest that firms that aim at value co-creation and sustainability through a gender-neutral promotion mechanism should introduce competition against a set target and reduce competition against others. This paper contributes to dispelling stereotypes about women’s reluctance to compete.

  3. New orthopaedic implant management tool for computer-assisted planning, navigation, and simulation: from implant CAD files to a standardized XML-based implant database.

    Science.gov (United States)

    Sagbo, S; Blochaou, F; Langlotz, F; Vangenot, C; Nolte, L-P; Zheng, G

    2005-01-01

    Computer-Assisted Orthopaedic Surgery (CAOS) has made much progress over the last 10 years. Navigation systems have been recognized as important tools that help surgeons, and various such systems have been developed. A disadvantage of these systems is that they use non-standard formalisms and techniques. As a result, there are no standard concepts for implant and tool management or data formats to store information for use in 3D planning and navigation. We addressed these limitations and developed a practical and generic solution that offers benefits for surgeons, implant manufacturers, and CAS application developers. We developed a virtual implant database containing geometrical as well as calibration information for orthopedic implants and instruments, with a focus on trauma. This database has been successfully tested for various applications in the client/server mode. The implant information is not static, however, because manufacturers periodically revise their implants, resulting in the deletion of some implants and the introduction of new ones. Tracking these continuous changes and keeping CAS systems up to date is a tedious task if done manually. This leads to additional costs for system development, and some errors are inevitably generated due to the huge amount of information that has to be processed. To ease management with respect to implant life cycle, we developed a tool to assist end-users (surgeons, hospitals, CAS system providers, and implant manufacturers) in managing their implants. Our system can be used for pre-operative planning and intra-operative navigation, and also for any surgical simulation involving orthopedic implants. Currently, this tool allows addition of new implants, modification of existing ones, deletion of obsolete implants, export of a given implant, and also creation of backups. Our implant management system has been successfully tested in the laboratory with very promising results. 
It makes it possible to fill the current gap in standardized implant and tool management for CAOS systems.
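The record above describes storing geometrical and calibration information for implants in a standardized XML format. A minimal sketch of what such a record and its parsing might look like follows; the element names, attributes, and helper function are illustrative assumptions, not the paper's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical implant entry; element and attribute names are
# illustrative, not the schema used by the system described above.
IMPLANT_XML = """
<implant id="plate-001" manufacturer="ExampleCorp">
  <name>Narrow Plate</name>
  <geometry unit="mm">
    <length>104</length>
    <holes>6</holes>
  </geometry>
  <revision date="2005-01-15" status="active"/>
</implant>
"""

def load_implant(xml_text):
    """Parse one implant record into a plain dict."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("id"),
        "manufacturer": root.get("manufacturer"),
        "name": root.findtext("name"),
        "length_mm": float(root.find("geometry/length").text),
        "status": root.find("revision").get("status"),
    }

implant = load_implant(IMPLANT_XML)
print(implant["id"], implant["status"])
```

A revision element like the one above is one way to support the life-cycle operations the abstract mentions (adding, modifying, and retiring implants) without deleting historical records.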

  4. ALARA database value in future outage work planning and dose management

    International Nuclear Information System (INIS)

    Miller, D.W.; Green, W.H.

    1995-01-01

    An ALARA database encompassing job-specific duration and man-rem plant-specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends that emerged from analysis of three refueling outages at Clinton Power Station. The conclusions, reached on the basis of hard data from a relational dose-tracking database, show the system to be a valuable tool for planning future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more comparative outage data become available. Trends over the three-outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of outage person-rem projections. The value of the database in projecting 1- and 5-year station person-rem estimates is discussed.
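The projected-vs-actual comparison described above is a natural fit for a relational query. The sketch below uses an in-memory SQLite table; the table layout, job names, and dose figures are assumptions for illustration, not the plant's actual dose-tracking system.

```python
import sqlite3

# Minimal sketch of a relational dose-tracking table; columns and
# values are illustrative, not Clinton Power Station's actual schema.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE rwp (
    outage INTEGER, job TEXT,
    projected_rem REAL, actual_rem REAL)""")
con.executemany("INSERT INTO rwp VALUES (?, ?, ?, ?)", [
    (1, "valve maintenance", 1.20, 1.45),
    (1, "scaffolding",       0.80, 0.60),
    (2, "valve maintenance", 1.30, 1.10),
])

# Compare projected vs. actual person-rem by job across outages.
rows = con.execute("""SELECT job,
                             SUM(projected_rem), SUM(actual_rem)
                      FROM rwp GROUP BY job ORDER BY job""").fetchall()
for job, proj, act in rows:
    print(f"{job}: projected {proj:.2f}, actual {act:.2f}")
```

Aggregating over past outages in this way is also how multi-year person-rem estimates can be projected from the accumulated data.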

  5. ALARA database value in future outage work planning and dose management

    Energy Technology Data Exchange (ETDEWEB)

    Miller, D.W.; Green, W.H. [Clinton Power Station Illinois Power Co., IL (United States)]

    1995-03-01

    An ALARA database encompassing job-specific duration and man-rem plant-specific information over three refueling outages represents an invaluable tool for the outage work planner and ALARA engineer. This paper describes dose-management trends that emerged from analysis of three refueling outages at Clinton Power Station. The conclusions, reached on the basis of hard data from a relational dose-tracking database, show the system to be a valuable tool for planning future outage work. The system's ability to identify key problem areas during a refueling outage is improving as more comparative outage data become available. Trends over the three-outage period are identified in this paper in the categories of number and type of radiation work permits implemented, duration of jobs, projected vs. actual dose rates in work areas, and accuracy of outage person-rem projections. The value of the database in projecting 1- and 5-year station person-rem estimates is discussed.

  6. Development of a functional, internet-accessible department of surgery outcomes database.

    Science.gov (United States)

    Newcomb, William L; Lincourt, Amy E; Gersin, Keith; Kercher, Kent; Iannitti, David; Kuwada, Tim; Lyons, Cynthia; Sing, Ronald F; Hadzikadic, Mirsad; Heniford, B Todd; Rucho, Susan

    2008-06-01

    The need for surgical outcomes data is increasing due to pressure from insurance companies, patients, and the need for surgeons to keep their own "report card". Current data management systems are limited by an inability to stratify outcomes based on patients, surgeons, and differences in surgical technique. Surgeons, along with research and informatics personnel from an academic, hospital-based Department of Surgery and a state university's Department of Information Technology, formed a partnership to develop a dynamic, internet-based, clinical data warehouse. A five-component model was used: data dictionary development, web application creation, participating center education and management, statistics applications, and data interpretation. A data dictionary was developed from a list of data elements to address the needs of research, quality assurance, industry, and centers of excellence. A user-friendly web interface was developed with menu-driven check boxes, multiple electronic data entry points, direct downloads from hospital billing information, and web-based patient portals. Data were collected on a Health Insurance Portability and Accountability Act-compliant server with a secure firewall. Protected health information was de-identified. Data management strategies included automated auditing, on-site training, a trouble-shooting hotline, and Institutional Review Board oversight. Real-time, daily, monthly, and quarterly data reports were generated. Fifty-eight publications and 109 abstracts have been generated from the database during its development and implementation. Seven national academic departments now use the database to track patient outcomes. The development of a robust surgical outcomes database requires a combination of clinical, informatics, and research expertise.
Benefits of surgeon involvement in outcomes research include tracking individual performance, patient safety, surgical research, legal defense, and the ability to provide accurate information.

  7. Population health management as a strategy for creation of optimal healing environments in worksite and corporate settings.

    Science.gov (United States)

    Chapman, Larry S; Pelletier, Kenneth R

    2004-01-01

    This paper provides an overview of a population health management (PHM) approach to the creation of optimal healing environments (OHEs) in worksite and corporate settings. It presents a framework, offered as context for potential research projects, to examine the health, well-being, and economic effects of a set of newer "virtual" prevention interventions operating in an integrated manner in worksite settings. The main topics discussed are the fundamentals of PHM, with basic terminology and core principles; a description of PHM core technology; and the implications of a PHM approach to creating OHEs.

  8. Software configuration management plan for the TWRS controlled baseline database system [TCBD]

    International Nuclear Information System (INIS)

    Spencer, S.G.

    1998-01-01

    LHMC, TWRS Business Management Organization (BMO) is designated as system owner, operator, and maintenance authority. The TWRS BMO identified the need for the TCBD. The TWRS BMO users have established all requirements for the database and are responsible for maintaining database integrity and control (after the interface data has been received). Initial interface data control and integrity is maintained through functional and administrative processes and is the responsibility of the database owners who are providing the data. The specific groups within the TWRS BMO affected by this plan are the Financial Management and TWRS Management Support Project, Master Planning, and the Financial Control Integration and Reporting. The interfaces between these organizations are through the normal line management chain of command. The Master Planning Group is assigned the responsibility to continue development and maintenance of the TCBD. This group maintains information that includes identification of requirements and changes to those requirements in a TCBD project file. They are responsible for the issuance, maintenance, and change authority of this SCMP. LHMC, TWRS TCBD users are designated as providing the project's requirement changes for implementation and also testing of the TCBD during development. The Master Planning Group coordinates and monitors the users' requests for system requirements (new/existing) as well as beta and acceptance testing. Users are those individuals and organizations needing data or information from the TCBD and having both a need-to-know and the proper training and authority to access the database. Each user or user organization is required to comply with the established requirements and procedures governing the TCBD. Lockheed Martin Services, Inc. (LMSI) is designated the TCBD developer, maintainer, and custodian until acceptance and process testing of the system has been completed via the TWRS BMO.
Once this occurs, the TCBD will be completed and…

  9. Software configuration management plan for the Hanford site technical database

    International Nuclear Information System (INIS)

    GRAVES, N.J.

    1999-01-01

    The Hanford Site Technical Database (HSTD) is used as the repository/source for the technical requirements baseline and programmatic data input via the Hanford Site and major Hanford Project Systems Engineering (SE) activities. The Hanford Site SE effort has created an integrated technical baseline for the Hanford Site that supports SE processes at the Site and project levels, which is captured in the HSTD. The HSTD has been implemented in an Ascent Logic Corporation (ALC) Commercial Off-The-Shelf (COTS) package referred to as the Requirements Driven Design (RDD) software. This Software Configuration Management Plan (SCMP) provides a process and means to control and manage software upgrades to the HSTD system.

  10. IT Tools and their Use in Strategy Creation in Respect of Economic Results of a Company

    Directory of Open Access Journals (Sweden)

    Ladislav Pálka

    2016-01-01

    Full Text Available Purpose of the article: The article analyzes the current state of information technology in terms of its use in the creation of a company strategy, in relation to monitoring the economic results of the company. It investigates, identifies and evaluates the overall situation of the concept and principles of these tools, their effectiveness in drawing up the strategy and strategic company goals, their ability to perform a variety of economic analyses without the need for complex operation and understanding, and their use for effective evaluation of data to support planning, management and decision-making, leading to the overall success of a company. The reason for this monitoring is the considerable difference between strategic company planning and its real results. Methodology/methods: In terms of methodology, a literature review of the current state of the issue has been used. Primary: interviews, observations, expert estimation. Secondary: evaluation of data from the IS database, documentation of seminars. Quantitative research: mapping the orientation of the issue, confrontation with the theory. Qualitative research: projective, structured interviews (with users and suppliers). Scientific aim: The main aim of the work is to solve the problems of management and evaluation of the economic process with respect to information technology tools, in connection with the formation of corporate strategy and the monitoring of the financial results of the company. The reason for selecting this issue is the fact that information technology resources are currently not used in the creation of corporate strategy, specifically in the area of economic goals. Findings: To describe the situation in the region and to clearly define the basic problems used as a basis for the use of IT support tools in the creation of corporate strategy, namely economic goals, and the use of feedback from information support tools for assessing…

  11. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    Science.gov (United States)

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
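The Image Content Group idea described above — an XML-managed set index whose structure is also mirrored for web crawlers — can be sketched in a few lines. The element names and URL pattern below are illustrative assumptions, not NOVA's actual schema.

```python
import xml.etree.ElementTree as ET

# Sketch of an Image Content Group index kept as XML; the element
# names and paths are illustrative, not NOVA's actual format.
icg = ET.Element("imageContentGroup", name="thoracic-CT")
for rec_id, modality in [("nova-0001", "CT"), ("nova-0002", "CT")]:
    rec = ET.SubElement(icg, "record", id=rec_id)
    ET.SubElement(rec, "modality").text = modality
    # Mirror each entry as an HTML page so that ordinary internet
    # search engines can link directly into the archive.
    ET.SubElement(rec, "href").text = f"/nova/{rec_id}.html"

index_xml = ET.tostring(icg, encoding="unicode")
print(index_xml)
```

Cataloging records by group membership rather than by keyword, as above, matches the archive-only workflow the abstract describes: entries accrue, but are rarely modified or deleted.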

  12. Supporting Telecom Business Processes by means of Workflow Management and Federated Databases

    NARCIS (Netherlands)

    Nijenhuis, Wim; Jonker, Willem; Grefen, P.W.P.J.

    This report addresses the issues related to the use of workflow management systems and federated databases to support business processes that operate on large and heterogeneous collections of autonomous information systems. We discuss how they can enhance the overall IT-architecture. Starting from…

  13. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    Science.gov (United States)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we have used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server, which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g. location, date, and instrument). Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database, where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.
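The logical associations the abstract describes (location, date, instrument, processing level) can be sketched as a small relational schema. The tables, columns, and sample values below are assumptions for illustration, not the study's actual geodatabase design; SQLite stands in for the SQL server.

```python
import sqlite3

# Sketch of the logical associations described above; table and
# column names are illustrative, not the study's actual schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE instrument (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE flight (id INTEGER PRIMARY KEY, flown_on TEXT,
                     site TEXT,
                     instrument_id INTEGER REFERENCES instrument(id));
CREATE TABLE product (id INTEGER PRIMARY KEY,
                      flight_id INTEGER REFERENCES flight(id),
                      level TEXT);
INSERT INTO instrument VALUES (1, 'hyperspectral imager');
INSERT INTO flight VALUES (1, '2016-06-12', 'pipeline-A', 1),
                          (2, '2017-06-10', 'pipeline-A', 1);
INSERT INTO product VALUES (1, 1, 'L1 radiance'),
                           (2, 1, 'L2 reflectance'),
                           (3, 2, 'L2 reflectance');
""")

# All geocorrected reflectance products over one site, by date —
# i.e. the inter-annual comparison the abstract mentions.
rows = con.execute("""SELECT f.flown_on, i.name, p.level
                      FROM product p
                      JOIN flight f ON p.flight_id = f.id
                      JOIN instrument i ON f.instrument_id = i.id
                      WHERE f.site = 'pipeline-A'
                        AND p.level = 'L2 reflectance'
                      ORDER BY f.flown_on""").fetchall()
print(rows)
```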

  14. Heterogeneous motives and the collective creation of value

    NARCIS (Netherlands)

    Bridoux, F.; Coeurderoy, R.; Durand, R.

    2011-01-01

    The collective creation of value has remained underexplored in management research. Drawing on social psychology and behavioral economics, we analyze the impact of the mix of employee motives to cooperate and compare the collective value generated by three motivational systems: individual monetary…

  15. Metabolonote: A wiki-based database for managing hierarchical metadata of metabolome analyses

    Directory of Open Access Journals (Sweden)

    Takeshi eAra

    2015-04-01

    Full Text Available Metabolomics—technology for comprehensive detection of small molecules in an organism—lags behind the other omics in terms of publication and dissemination of experimental data. Among the reasons for this are the difficulty of precisely recording information about complicated analytical experiments (metadata), the existence of various databases with their own metadata descriptions, and the low reusability of the published data, resulting in submitters (the researchers who generate the data) being insufficiently motivated. To tackle these issues, we developed Metabolonote, a Semantic MediaWiki-based database designed specifically for managing metabolomic metadata. We also defined a metadata and data description format, called TogoMD, with an ID system that is required for unique access to each level of the tree-structured metadata, such as study purpose, sample, analytical method, and data analysis. Separation of the management of metadata from that of data, and permission to attach related information to the metadata, provide advantages for submitters, readers, and database developers. The metadata are enriched with information such as links to comparable data, thereby functioning as a hub of related data resources. They also enhance not only readers' understanding and use of data, but also submitters' motivation to publish the data. The metadata are computationally shared among other systems via APIs, which facilitates the construction of novel databases by database developers. A permission system that allows publication of immature metadata and feedback from readers also helps submitters to improve their metadata. Hence, this aspect of Metabolonote, as a metadata preparation tool, is complementary to high-quality and persistent data repositories such as MetaboLights. A total of 808 metadata for analyzed data obtained from 35 biological species are published currently.
Metabolonote and related tools are available free of cost at http://metabolonote.kazusa.or.jp/.
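The tree-structured ID system described above — unique access to each level of the metadata, from study down to data file — can be sketched as follows. The ID format and field names are assumptions in the spirit of TogoMD, not Metabolonote's actual scheme.

```python
# Sketch of tree-structured metadata IDs: each level (study ->
# sample -> analysis -> data) extends its parent's ID. The format
# shown is an assumption, not Metabolonote's actual scheme.
metadata = {}

def register(parent_id, suffix, fields):
    """Attach a metadata node under its parent and return its full ID."""
    node_id = f"{parent_id}_{suffix}" if parent_id else suffix
    metadata[node_id] = fields
    return node_id

study = register(None, "SE1", {"purpose": "leaf metabolite survey"})
sample = register(study, "S01", {"species": "Arabidopsis thaliana"})
analysis = register(sample, "M01", {"method": "LC-MS"})
data = register(analysis, "D01", {"file": "peaks.tsv"})

# Any level of the hierarchy is addressable by its full ID:
print(data)
print(metadata[data]["file"])
```

Because every ID embeds its ancestry, a reader (or an API client) can navigate from a data file back to its analytical method, sample, and study purpose without a separate lookup table.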

  16. Planning the future of JPL's management and administrative support systems around an integrated database

    Science.gov (United States)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal, without consistency in design approach, over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, evolving around the development of an integrated management and administrative database, are discussed.

  17. Preparation of Database for Land use Management in North East of Cairo

    International Nuclear Information System (INIS)

    El-Ghawaby, A.M.

    2012-01-01

    Environmental management in urban areas is difficult due to the amount and variety of data needed for decision making. This amount of data is unmanageable without adequate database systems and modern methodologies. A geo-database for the East Cairo City Area (ECCA) was built for use in the process of urban land-use suitability assessment, to achieve better performance compared with the usual methods. This geo-database required the availability of detailed, accurate, updated and geographically referenced data on the terrain's physical characteristics and the environmental hazards that may occur. A smart environmental suitability model for ECCA was developed and implemented using ERDAS IMAGINE 9.2. This model is capable of suggesting the most appropriate urban land use, based on the existing spatial and non-spatial potentials and constraints.

  18. Databases

    Directory of Open Access Journals (Sweden)

    Nick Ryan

    2004-01-01

    Full Text Available Databases are deeply embedded in archaeology, underpinning and supporting many aspects of the subject. However, as well as providing a means for storing, retrieving and modifying data, databases themselves must be a result of a detailed analysis and design process. This article looks at this process, and shows how the characteristics of data models affect the process of database design and implementation. The impact of the Internet on the development of databases is examined, and the article concludes with a discussion of a range of issues associated with the recording and management of archaeological data.

  19. Database Administrator

    Science.gov (United States)

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  20. Information Management in Creative Engineering Design and Capabilities of Database Transactions

    DEFF Research Database (Denmark)

    Jacobsen, Kim; Eastman, C. A.; Jeng, T. S.

    1997-01-01

    This paper examines the information management requirements and sets forth the general criteria for collaboration and concurrency control in creative engineering design. Our work attempts to recognize the full range of concurrency, collaboration and complex transaction structures now practiced in manual and semi-automated design, and the range of capabilities needed as the demand for enhanced but flexible electronic information management unfolds. The objective of this paper is to identify new issues that may advance the use of databases to support creative engineering design. We start with a generalized description of the structure of design tasks and how information management in design is dealt with today. After this review, we identify extensions to current information management capabilities that have been realized and/or proposed to support/augment what designers can do now. Given…

  1. GSIMF: a web service based software and database management system for the next generation grids

    International Nuclear Information System (INIS)

    Wang, N; Ananthan, B; Gieraltowski, G; May, E; Vaniachine, A

    2008-01-01

    To process the vast amount of data from high energy physics experiments, physicists rely on Computational and Data Grids; yet, the distribution, installation, and updating of a myriad of different versions of different programs over the Grid environment is complicated, time-consuming, and error-prone. Our Grid Software Installation Management Framework (GSIMF) is a set of Grid Services that has been developed for managing versioned and interdependent software applications and file-based databases over the Grid infrastructure. This set of Grid services provides a mechanism to install software packages on distributed Grid computing elements, thus automating the software and database installation management process on behalf of the users. This enables users to remotely install programs and tap into the computing power provided by Grids.

  2. Coordinating Mobile Databases: A System Demonstration

    OpenAIRE

    Zaihrayeu, Ilya; Giunchiglia, Fausto

    2004-01-01

    In this paper we present the Peer Database Management System (PDBMS). This system runs on top of a standard database management system and allows it to connect its database with other (peer) databases on the network. A particularity of our solution is that PDBMS allows conventional database technology to be effectively operational in mobile settings. We think of database mobility as a database network, where databases appear and disappear spontaneously and their network access point...

  3. The image database management system of teaching file using personal computer

    International Nuclear Information System (INIS)

    Shin, M. J.; Kim, G. W.; Chun, T. J.; Ahn, W. H.; Baik, S. K.; Choi, H. Y.; Kim, B. G.

    1995-01-01

    For the systematic management and easy use of teaching files in a radiology department, the authors set up a database management system for teaching files using a personal computer. We used a personal computer (IBM PC compatible, 486DX2) including an image capture card (Window vision, Dooin Elect, Seoul, Korea) and a video camera recorder (8mm, CCD-TR105, Sony, Tokyo, Japan) for the acquisition and storage of images. We developed the database program using FoxPro for Windows 2.6 (Microsoft, Seattle, USA) executed under Windows 3.1 (Microsoft, Seattle, USA). Each record consisted of hospital number, name, sex, age, examination date, keyword, radiologic examination modalities, final diagnosis, radiologic findings, references and representative images. The images were acquired and stored in 8-bit bitmap format (540 X 390 ∼ 545 X 414, 256 gray scale) and displayed on a 17-inch flat monitor (1024 X 768, Samtron, Seoul, Korea). Without special devices, image acquisition and storage could be done simply on the reading viewbox. The image quality on the computer's monitor was lower than that of the original film on the viewbox, but in general the characteristics of each lesion could be differentiated. Easy retrieval of data was possible for the purposes of a teaching file system. Without high-cost appliances, we could implement an image database system for teaching files using a personal computer by a relatively inexpensive method.

  4. Knowledge creation and transfer among postgraduate students

    Directory of Open Access Journals (Sweden)

    Kreeson Naicker

    2014-08-01

    Objectives: This article reports on an exploratory study undertaken to ascertain how knowledge is created and transferred amongst postgraduate (PG) students, using the knowledge (socialisation, externalisation, combination, internalisation [SECI]) spiral model. Method: After reviewing relevant literature, a personally administered standardised questionnaire was used to collect data from a convenience sample of PG students in the School of Management, IT and Governance at the University of KwaZulu-Natal, South Africa. The data was analysed to determine if it fit the model based on the four modes of knowledge conversion. Results: Although the School of Management, IT and Governance has mechanisms in place to facilitate knowledge creation and transfer, it nevertheless tends to focus on the four modes of knowledge conversion to varying degrees. Conclusion: The study confirmed that PG students utilise the ‘socialisation’ and ‘externalisation’ modes of knowledge conversion comprehensively; ‘internalisation’ plays a significant role in their knowledge creation and transfer activities; and whilst ‘combination’ is utilised to a lesser extent, it still plays a role in PG students’ knowledge creation and transfer activities. PG students also have ‘space’ that allows them to bring hunches, thoughts, notions, intuition or tacit knowledge into reality. Trust and dedication are common amongst PG students. With socialisation and externalisation so high, PG students are aware of each other’s capabilities and competencies, and trust each other enough to share knowledge.

  5. RODOS database adapter

    International Nuclear Information System (INIS)

    Xie Gang

    1995-11-01

    Integrated data management is an essential aspect of many automated information systems, such as RODOS, a real-time on-line decision support system for nuclear emergency management. In particular, the application software must provide access management to different commercial database systems. This report presents the tools necessary for adapting embedded SQL applications to both HP-ALLBASE/SQL and CA-Ingres/SQL databases. The design of the database adapter and the concept of the RODOS embedded SQL syntax are discussed by considering some of the most important features of SQL functions and the identification of significant differences between SQL implementations. Finally, the software developed and the administrator's and installation guides are described. (orig.)
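The adapter idea described above — application code written once against a common interface, with backend-specific differences hidden behind it — can be sketched in miniature. SQLite stands in here for HP-ALLBASE/SQL or CA-Ingres/SQL, and the class names are illustrative, not RODOS code.

```python
import sqlite3

class DatabaseAdapter:
    """Common interface the application codes against; each backend
    (in the paper, HP-ALLBASE/SQL or CA-Ingres/SQL) supplies its own
    implementation that hides dialect and driver differences."""
    def query(self, sql, params=()):
        raise NotImplementedError

class SQLiteAdapter(DatabaseAdapter):
    """SQLite stand-in for a commercial backend, for illustration."""
    def __init__(self, dsn=":memory:"):
        self.con = sqlite3.connect(dsn)

    def query(self, sql, params=()):
        return self.con.execute(sql, params).fetchall()

# Application code never touches the backend directly:
db = SQLiteAdapter()
db.query("CREATE TABLE release (nuclide TEXT, bq REAL)")
db.query("INSERT INTO release VALUES (?, ?)", ("I-131", 3.7e9))
print(db.query("SELECT nuclide, bq FROM release"))
```

Swapping backends then means providing another `DatabaseAdapter` subclass, with no change to the application's embedded SQL calls.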

  6. Risk Management and Value Creation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul; Roggi, Oliviero

    Corporate failures, periodic recessions, regional debt crises and volatile financial markets have intensified the focus on risk management as the means to deal with turbulent conditions. The ability to respond effectively to abrupt environmental impacts is considered an important source of competitive advantage. Yet, surprisingly little research has analyzed whether the presumed advantages of effective risk management are associated with superior outcomes. Here we present a comprehensive study of risk management effectiveness and the relationship to corporate performance based on more than 33,500 observations in 3,400 firms over the turbulent 20-year period 1991-2010. Determining effective risk management as the ability to reduce earnings and cash flow volatility, we find that both have significant positive relationships to lagged performance measures after controlling for industry effects, company…

  7. ORACLE DATABASE SECURITY

    OpenAIRE

    Cristina-Maria Titrade

    2011-01-01

    This paper presents several security issues, namely database security at the system level, data-level security, user-level security, user management, resource management and password management. Security is a constant concern in database design and development. Usually, the question is not whether security should exist, but rather how extensive it should be. A typical DBMS has several levels of security, in addition to those offered by the operating system or network. Typically, a DBMS has user a...

  8. The new Scandinavian Donations and Transfusions database (SCANDAT2)

    DEFF Research Database (Denmark)

    Edgren, Gustaf; Rostgaard, Klaus; Vasan, Senthil K

    2015-01-01

    It is possible to create a binational, nationwide database with almost 50 years of follow-up of blood donors and transfused patients for a range of health outcomes. We aim to use this database for further studies of donor health, transfusion-associated risks, and transfusion-transmitted disease. We have previously created the anonymized Scandinavian Donations and Transfusions (SCANDAT) database, containing data on blood donors, blood transfusions, and transfused patients, with complete follow-up of donors and patients for a range of health outcomes. Here we describe the re-creation of SCANDAT with updated, identifiable data. We collected computerized data on blood donations and transfusions from blood banks covering all of Sweden and Denmark. After data cleaning, two structurally identical databases were created, and the entire database was linked with nationwide health outcomes…

  9. A Framework to Simplify the Creation of Remote Laboratories

    Directory of Open Access Journals (Sweden)

    Isidro Calvo

    2010-05-01

    Full Text Available Building remote laboratories is not a trivial issue, since they are complex systems in which a great number of factors (security, QoS, integration of information of different natures, etc.) are involved. This complexity requires the use of diverse technologies that complicate the creation of the laboratories. The present work presents a framework to ease the creation of remote laboratories (both real and virtual) from a set of reusable blocks that solve the most common issues (connection, student management, experiment assessment, etc.), so the designers of the experiments may concentrate on their functionality. The approach proposes the use of certain technologies widely used in the control engineering community, such as LabVIEW and EJS, so creating a new laboratory requires two separately created applications that are then integrated within the framework: (1) a LabVIEW application to acquire process information, and (2) a Java applet created with EJS used as the graphical interface. The proposed framework was used with an automatic water-level system to show how to add new experiments to the framework.

  10. Integration of the ATLAS tag database with data management and analysis components

    Energy Technology Data Exchange (ETDEWEB)

    Cranshaw, J; Malon, D [Argonne National Laboratory, Argonne, IL 60439 (United States); Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C [Department of Physics and Astronomy, University of Glasgow, Glasgow, G12 8QQ, Scotland (United Kingdom)], E-mail: c.nicholson@physics.gla.ac.uk

    2008-07-15

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted.

  11. Integration of the ATLAS tag database with data management and analysis components

    International Nuclear Information System (INIS)

    Cranshaw, J; Malon, D; Doyle, A T; Kenyon, M J; McGlone, H; Nicholson, C

    2008-01-01

    The ATLAS Tag Database is an event-level metadata system, designed to allow efficient identification and selection of interesting events for user analysis. By making first-level cuts using queries on a relational database, the size of an analysis input sample could be greatly reduced and thus the time taken for the analysis reduced. Deployment of such a Tag database is underway, but to be most useful it needs to be integrated with the distributed data management (DDM) and distributed analysis (DA) components. This means addressing the issue that the DDM system at ATLAS groups files into datasets for scalability and usability, whereas the Tag Database points to events in files. It also means setting up a system which could prepare a list of input events and use both the DDM and DA systems to run a set of jobs. The ATLAS Tag Navigator Tool (TNT) has been developed to address these issues in an integrated way and provide a tool that the average physicist can use. Here, the current status of this work is presented and areas of future work are highlighted.

  12. Security Research on Engineering Database System

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The engine engineering database system is a CAD-oriented applied database management system with the capability of managing distributed data. The paper discusses the security issue of the engine engineering database management system (EDBMS). Through study and analysis of database security, a series of security rules is drawn up that reach the B1 level security standard, including discretionary access control (DAC), mandatory access control (MAC) and audit. The EDBMS implements functions of DAC, ...

  13. Preliminary study for unified management of CANDU safety codes and construction of database system

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae

    2003-03-01

    It is necessary to develop a graphical user interface (GUI) for the unified management of CANDU safety codes and to construct a database system for the validation of these codes; the preliminary study for both is done in the first stage of the present work. The input and output structures and data flow of CATHENA and PRESCON2 are investigated, and the interactions of the variables between CATHENA and PRESCON2 are identified. Furthermore, PC versions of the CATHENA and PRESCON2 codes are developed for the interaction of these codes with the GUI. The PC versions are assessed by comparing their calculation results with those from an HP workstation or from the FSAR (Final Safety Analysis Report). A preliminary study of the GUI for the safety codes in the unified management system is done, and a sample of the GUI programming is demonstrated. Visual C++ is selected as the programming language for the development of the GUI system. Data for the Wolsong plants, the reactor core, and thermal-hydraulic experiments executed inside and outside the country are collected and classified following the structure of the database system, of which two types are considered for the final web-based database system. The preliminary GUI programming for the database system is demonstrated, which will be updated in future work.

  14. Drug residues in urban water: A database for ecotoxicological risk management.

    Science.gov (United States)

    Destrieux, Doriane; Laurent, François; Budzinski, Hélène; Pedelucq, Julie; Vervier, Philippe; Gerino, Magali

    2017-12-31

    Human-use drug residues (DR) are only partially eliminated by waste water treatment plants (WWTPs), so that residual amounts can reach natural waters and cause environmental hazards. In order to manage these hazards in the aquatic environment properly, a database is made available that integrates the concentration ranges of DR that cause adverse effects for aquatic organisms, together with the temporal variations of the ecotoxicological risks. To implement this database for ecotoxicological risk assessment (the ERA database), the required information for each DR is the predicted no-effect concentration (PNEC), along with the predicted environmental concentration (PEC). The risk assessment is based on the ratio between the PECs and the PNECs. Adverse-effect data or PNECs have been found in the publicly available literature for 45 substances. These ecotoxicity test data have been extracted from 125 different sources, and the ERA database contains 1157 adverse-effect data points and 287 PNECs. The efficiency of the ERA database was tested with a data set coming from a simultaneous survey of WWTPs and the natural environment, in which 26 DR were searched for in two WWTPs and in the river. On five sampling dates, concentrations measured in the river for 10 DR could pose environmental problems, of which 7 were measured only downstream of WWTP outlets. Gathering data from the scientific literature and from measurements in a single database, with unit homogenisation, facilitates the actual ecotoxicological risk assessment and may be useful for assessing further risks from future field surveys. Moreover, the accumulation of a large ecotoxicity data set in a single database should not only improve knowledge of higher-risk molecules but also supply an objective tool for rapid and efficient evaluation of the risk. Copyright © 2017 Elsevier B.V. All rights reserved.
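    The screening step described in this record rests on a simple ratio: conventionally, the risk quotient RQ = PEC/PNEC, with RQ ≥ 1 flagging a potential hazard. A minimal sketch, assuming made-up substance names, concentrations, and threshold handling rather than actual ERA database entries:

```python
# Illustrative sketch of PEC/PNEC screening; substance names and
# concentrations are invented examples, not ERA database data.

def risk_quotient(pec_ug_l: float, pnec_ug_l: float) -> float:
    """Risk quotient: predicted environmental concentration (PEC)
    divided by predicted no-effect concentration (PNEC)."""
    if pnec_ug_l <= 0:
        raise ValueError("PNEC must be positive")
    return pec_ug_l / pnec_ug_l

def screen(measurements: dict) -> list:
    """Return substances whose RQ >= 1 (a conventional concern threshold)."""
    return [name for name, (pec, pnec) in measurements.items()
            if risk_quotient(pec, pnec) >= 1.0]

samples = {
    "substance_A": (0.50, 0.10),   # RQ = 5.0  -> flagged
    "substance_B": (0.02, 0.40),   # RQ = 0.05 -> not flagged
}
print(screen(samples))  # ['substance_A']
```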

  15. Theories of opportunity creation and effective entrepreneurial actions in opportunity creation context

    Directory of Open Access Journals (Sweden)

    Behrooz Jamali

    2018-09-01

    Full Text Available Created opportunities are referred to as opportunities in which neither the supply side nor the demand side clearly and obviously exists, and one or both must be created. Therefore, several economic inventions must take place in marketing, franchising, etc., so that the opportunity can be created. This perception of opportunity deals with the creation of new markets. In the meantime, identifying entrepreneurial actions that influence the creation of entrepreneurial opportunities can provide the groundwork for forming and strengthening opportunity creation. In this paper, some basic ideas about the creation of entrepreneurial opportunities and the evolution of opportunity creation theories are examined. Then effective actions on opportunity creation are identified. Finally, the structure of the investigated actions is examined using the DEMATEL method. The results, based on the opinions of 15 entrepreneurship experts, showed that leadership, decision-making, and strategy actions influence the other entrepreneurial actions.

  16. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    Science.gov (United States)

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  17. Challenges of Knowledge Management and Creation in Communities of Practice Organisations of Deaf and Non-Deaf Members: Requirements for a Web Platform

    Science.gov (United States)

    de Freitas Guilhermino Trindade, Daniela; Guimaraes, Cayley; Antunes, Diego Roberto; Garcia, Laura Sanchez; Lopes da Silva, Rafaella Aline; Fernandes, Sueli

    2012-01-01

    This study analysed the role of knowledge management (KM) tools used to cultivate a community of practice (CP) in its knowledge creation (KC), transfer, and learning processes. The goal of these observations was to determine the requirements that KM tools should address for a specific CP formed by Deaf and non-Deaf members. The CP studied is a…

  18. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    Science.gov (United States)

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. This system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases considering the individual clinical context. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  19. PARPs database: a LIMS (laboratory information management system) for protein-protein interaction data mining

    Directory of Open Access Journals (Sweden)

    Picard-Cloutier Aude

    2007-12-01

    Full Text Available Abstract Background In the "post-genome" era, mass spectrometry (MS has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5.

  20. Creation of Norms for the Purpose of Global Talent Management

    Science.gov (United States)

    Hedricks, Cynthia A.; Robie, Chet; Harnisher, John V.

    2008-01-01

    Personality scores were used to construct three databases of global norms. The composition of the three databases varied according to percentage of cases by global region, occupational group, applicant status, and gender of the job candidate. Comparison of personality scores across the three norms databases revealed that the magnitude of the…

  1. The Kepler DB, a Database Management System for Arrays, Sparse Arrays and Binary Data

    Science.gov (United States)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-01-01

    The Kepler Science Operations Center stores pixel values for approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database (Kepler DB) management system was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.

  2. The Kepler DB: a database management system for arrays, sparse arrays, and binary data

    Science.gov (United States)

    McCauliff, Sean; Cote, Miles T.; Girouard, Forrest R.; Middour, Christopher; Klaus, Todd C.; Wohler, Bill

    2010-07-01

    The Kepler Science Operations Center stores pixel values on approximately six million pixels collected every 30 minutes, as well as data products that are generated as a result of running the Kepler science processing pipeline. The Kepler Database management system (Kepler DB)was created to act as the repository of this information. After one year of flight usage, Kepler DB is managing 3 TiB of data and is expected to grow to over 10 TiB over the course of the mission. Kepler DB is a non-relational, transactional database where data are represented as one-dimensional arrays, sparse arrays or binary large objects. We will discuss Kepler DB's APIs, implementation, usage and deployment at the Kepler Science Operations Center.
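    One way to picture the non-relational data model described above is to persist a one-dimensional sparse array as (index, value) rows and densify on read. The sqlite3 sketch below is an analogy for illustration only, not Kepler DB's actual storage engine or API; the table layout and series name are invented:

```python
import sqlite3

# Illustrative analogy: a 1-D sparse array stored as (index, value) rows,
# reconstructed into a dense list on read. Not Kepler DB's real format.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sparse (series TEXT, idx INTEGER, value REAL, "
             "PRIMARY KEY (series, idx))")

def write_sparse(series, data):
    """Persist only the non-default entries of a sparse array."""
    conn.executemany("INSERT INTO sparse VALUES (?, ?, ?)",
                     [(series, i, v) for i, v in data.items()])

def read_dense(series, length, fill=0.0):
    """Reconstruct the dense array, filling gaps with a default value."""
    dense = [fill] * length
    for i, v in conn.execute(
            "SELECT idx, value FROM sparse WHERE series = ?", (series,)):
        dense[i] = v
    return dense

write_sparse("pixel_42", {0: 101.5, 7: 99.0})
print(read_dense("pixel_42", 10))
```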

  3. Study on Mandatory Access Control in a Secure Database Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper proposes a security policy model for mandatory access control in a class B1 database management system whose labeling granularity is the tuple. The relation-hierarchical data model is extended to a multilevel relation-hierarchical data model. Based on this model, the concept of upper-lower layer relational integrity is presented after the covert channels caused by database integrity are analyzed and eliminated. Two SQL statements are extended to process polyinstantiation in the multilevel secure environment. The system is based on the multilevel relation-hierarchical data model and is capable of integratively storing and manipulating multilevel complicated objects (e.g., multilevel spatial data) and multilevel conventional data (e.g., integers, real numbers and character strings).
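    Mandatory access control of the kind sketched in this record turns on label dominance between a subject's clearance and a tuple's label. The following is a minimal illustration in the style of the classical Bell-LaPadula properties; the level lattice and rules are textbook simplifications, not the paper's exact model:

```python
# Textbook-style sketch of label-dominance checks for tuple-level MAC.
# The levels and the two properties are illustrative simplifications.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def dominates(a, b):
    """True if level a dominates (is at least as high as) level b."""
    return LEVELS[a] >= LEVELS[b]

def can_read(subject_level, tuple_level):
    # Simple-security property: a subject may read down only.
    return dominates(subject_level, tuple_level)

def can_write(subject_level, tuple_level):
    # *-property: a subject may write up only (no write down).
    return dominates(tuple_level, subject_level)

print(can_read("secret", "confidential"))   # True
print(can_write("secret", "confidential"))  # False
```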

  4. Ultra-Structure database design methodology for managing systems biology data and analyses

    Directory of Open Access Journals (Sweden)

    Hemminger Bradley M

    2009-08-01

    Full Text Available Abstract Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find...
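    The central Ultra-Structure idea, behaviour encoded as rows in rule tables so that end users change capabilities by editing data rather than code, can be caricatured in a few lines. The rule format and action names below are hypothetical simplifications, not the methodology's actual schema:

```python
# Caricature of a rule-driven system: a generic engine whose behaviour is
# defined entirely by rows in a "rule table". Adding a row changes what the
# system does; no code is modified. Rule format is a made-up example.
rules = [
    # (condition key, condition value, action)
    ("data_type", "spectrum",   "route_to_ms_pipeline"),
    ("data_type", "annotation", "route_to_genome_store"),
]

def apply_rules(record):
    """Return the actions whose conditions match the record."""
    return [action for key, value, action in rules
            if record.get(key) == value]

print(apply_rules({"data_type": "spectrum"}))  # ['route_to_ms_pipeline']

# An "end user" extends the system by adding a row, not by writing code:
rules.append(("data_type", "peptide_map", "route_to_mapping_store"))
print(apply_rules({"data_type": "peptide_map"}))  # ['route_to_mapping_store']
```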

  5. Survey < > Creation

    DEFF Research Database (Denmark)

    2017-01-01

    The project, Survey < > Creation, suggests that point cloud models from 3D scans of an existing space can be the source for explorative drawings. By probing into the procedure of 3D laser scanning, it became possible to make use of the available point clouds to access both the geometric representation... and the creation drawing (of the anticipated).

  6. Management Guidelines for Database Developers' Teams in Software Development Projects

    Science.gov (United States)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has grown continually over the last several years. In some companies, DBDs are organized as a special team (the DBD team) to support other projects and roles. As a new role, the DBD team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during the DBDs' work. Therefore, in this paper we have developed a set of management guidelines, which includes 8 fundamental tasks and 17 practices from the software development process, by using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBD team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could be very useful for other companies that use a DBD team and could contribute to an increase in the efficiency of these teams in their work on software development projects.

  7. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    Science.gov (United States)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  8. Database system for management of health physics and industrial hygiene records

    International Nuclear Information System (INIS)

    Murdoch, B. T.; Blomquist, J. A.; Cooke, R. H.; Davis, J. T.; Davis, T. M.; Dolecek, E. H.; Halka-Peel, L.; Johnson, D.; Keto, D. N.; Reyes, L. R.; Schlenker, R. A.; Woodring; J. L.

    1999-01-01

    This paper provides an overview of the Worker Protection System (WPS), a client/server, Windows-based database management system for essential radiological protection and industrial hygiene. Seven operational modules handle records for external dosimetry, bioassay/internal dosimetry, sealed sources, routine radiological surveys, lasers, workplace exposure, and respirators. WPS utilizes the latest hardware and software technologies to provide ready electronic access to a consolidated source of worker protection

  9. An XML-based system for synthesis of data from disparate databases.

    Science.gov (United States)

    Kurc, Tahsin; Janies, Daniel A; Johnson, Andrew D; Langella, Stephen; Oster, Scott; Hastings, Shannon; Habib, Farhat; Camerlengo, Terry; Ervin, David; Catalyurek, Umit V; Saltz, Joel H

    2006-01-01

    Diverse data sets have become key building blocks of translational biomedical research. Data types captured and referenced by sophisticated research studies include high throughput genomic and proteomic data, laboratory data, data from imagery, and outcome data. In this paper, the authors present the application of an XML-based data management system to support integration of data from disparate data sources and large data sets. This system facilitates management of XML schemas and on-demand creation and management of XML databases that conform to these schemas. They illustrate the use of this system in an application for genotype-phenotype correlation analyses. This application implements a method of phenotype-genotype correlation based on phylogenetic optimization of large data sets of mouse SNPs and phenotypic data. The application workflow requires the management and integration of genomic information and phenotypic data from external data repositories and from the results of phenotype-genotype correlation analyses. Our implementation supports the process of carrying out a complex workflow that includes large-scale phylogenetic tree optimizations and application of Maddison's concentrated changes test to large phylogenetic tree data sets. The data management system also allows collaborators to share data in a uniform way and supports complex queries that target data sets.
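    As a toy illustration of keeping records as schema-shaped XML and querying them by structure, the standard-library sketch below filters elements with XPath-style predicates. The element and attribute names are invented and are not the schema of the system described above:

```python
import xml.etree.ElementTree as ET

# Toy illustration of XML-stored records queried by structure; the element
# and attribute names here are invented, not the described system's schema.
doc = ET.fromstring("""
<records>
  <sample id="s1"><genotype snp="rs123" allele="A"/></sample>
  <sample id="s2"><genotype snp="rs123" allele="G"/></sample>
</records>
""")

# Find samples carrying allele 'A' at the (hypothetical) SNP rs123.
hits = []
for sample in doc.findall("sample"):
    genotype = sample.find("genotype[@snp='rs123']")
    if genotype is not None and genotype.get("allele") == "A":
        hits.append(sample.get("id"))

print(hits)  # ['s1']
```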

  10. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    Science.gov (United States)

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save the time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs for using the map products of photogrammetric workstations. Also, by means of these integrated systems, providing structured spatial data based on OGC (Open GIS Consortium) standards and topological relations between different feature classes is possible at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  11. An Integrated Photogrammetric and Spatial Database Management System for Producing Fully Structured Data Using Aerial and Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Farshid Farnood Ahmadi

    2009-03-01

    Full Text Available 3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save the time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs for using the map products of photogrammetric workstations. Also, by means of these integrated systems, providing structured spatial data based on OGC (Open GIS Consortium) standards and topological relations between different feature classes is possible at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) are presented.

  12. Alignment of high-throughput sequencing data inside in-memory databases.

    Science.gov (United States)

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    In times of high-throughput DNA sequencing techniques, performant analysis of DNA sequences is of high importance. Computer-supported DNA analysis is still a time-intensive task. In this paper we explore the potential of a new in-memory database technology by using SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL to compare execution time and memory management. To ensure that the results are comparable, MySQL was run in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures for exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, the performance analysis between HANA and MySQL was made by comparing the execution times of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, which means that there is high potential within the new in-memory concepts, leading to further development of DNA analysis procedures in the future.
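    The exact-search step benchmarked in this record can be mimicked with any in-memory SQL engine: load the reference once, then locate each read with a substring query. The sqlite3 sketch below, with toy sequence data, is only a stand-in for the HANA/MySQL stored procedures, not a reimplementation of BWA:

```python
import sqlite3

# Stand-in sketch: exact read matching as a substring search inside an
# in-memory SQL database. The reference name and sequences are toy data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reference (name TEXT, seq TEXT)")
conn.execute("INSERT INTO reference VALUES ('toy_chr', 'ACGTACGGTTAC')")

def exact_match(read):
    """Return the 0-based offset of the first exact occurrence, or -1."""
    (pos,) = conn.execute(
        "SELECT instr(seq, ?) - 1 FROM reference WHERE name = 'toy_chr'",
        (read,)).fetchone()
    # SQLite's instr() is 1-based and returns 0 when absent, so the
    # subtraction yields -1 for a missing read.
    return pos

print(exact_match("CGGT"))  # 5
print(exact_match("AAAA"))  # -1
```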

  13. Databases and bookkeeping for HEP experiments

    International Nuclear Information System (INIS)

    Blobel, V.; Cnops, A.-M.; Fisher, S.M.

    1983-09-01

    The term database is explained, as well as the requirements for databases in high energy physics (HEP). Also covered are the packages used in HEP, a summary of user experience, database management systems, relational database management systems for HEP use, and observations. (U.K.)

  14. Microsoft Access Small Business Solutions State-of-the-Art Database Models for Sales, Marketing, Customer Management, and More Key Business Activities

    CERN Document Server

    Hennig, Teresa; Linson, Larry; Purvis, Leigh; Spaulding, Brent

    2010-01-01

    Database models developed by a team of leading Microsoft Access MVPs that provide ready-to-use solutions for sales, marketing, customer management and other key business activities for most small businesses. As the most popular relational database in the world, Microsoft Access is widely used by small business owners. This book responds to the growing need for resources that help business managers and end users design and build effective Access database solutions for specific business functions. Coverage includes: Elements of a Microsoft Access Database; Relational Data Model; Dealing with C

  15. SAADA: Astronomical Databases Made Easier

    Science.gov (United States)

    Michel, L.; Nguyen, H. N.; Motch, C.

    2005-12-01

    Many astronomers wish to share datasets with their community but do not have enough manpower to develop databases with the functionalities required for high-level scientific applications. The SAADA project aims at automating the creation and deployment process of such databases. A generic but scientifically relevant data model has been designed which allows one to build databases by providing only a limited number of product mapping rules. Databases created by SAADA rely on a relational database supporting JDBC, covered by a Java layer that includes a large amount of generated code. Such databases can simultaneously host spectra, images, source lists and plots. Data are grouped in user-defined collections whose content can be seen as one unique set per data type even if their formats differ. Datasets can be correlated with each other using qualified links. These links help, for example, to handle the nature of a cross-identification (e.g., a distance or a likelihood) or to describe their scientific content (e.g., by associating a spectrum with a catalog entry). The SAADA query engine is based on a language well suited to the data model which can handle constraints on linked data, in addition to classical astronomical queries. These constraints can be applied to the linked objects (number, class and attributes) and/or to the link qualifier values. Databases created by SAADA are accessed through a rich web interface or a Java API. We are currently developing an interoperability module implementing VO protocols.

  16. Techniques for Automatic Creation of Terrain Databases for Training and Mission Preparation

    NARCIS (Netherlands)

    Kuijper, F.; Son, R. van; Meurs, F. van; Smelik, R.M.; Kraker, J.K. de

    2010-01-01

    In the support of defense agencies and civil authorities TNO runs a research program that strives after automatic generation of terrain databases for a variety of simulation applications. Earlier papers by TNO at the IMAGE conference have reported in-depth on specific projects within this program.

  17. A user-friendly phytoremediation database: creating the searchable database, the users, and the broader implications.

    Science.gov (United States)

    Famulari, Stevie; Witz, Kyla

    2015-01-01

    Designers, students, teachers, gardeners, farmers, landscape architects, architects, engineers, homeowners, and others have uses for the practice of phytoremediation. This research looks at the creation of a phytoremediation database which is designed for ease of use by a non-scientific user, as well as by students in an educational setting ( http://www.steviefamulari.net/phytoremediation ). During 2012, Environmental Artist & Professor of Landscape Architecture Stevie Famulari, with assistance from Kyla Witz, a landscape architecture student, created an online searchable database designed for high public accessibility. The database is a record of research on plant species that aid in the uptake of contaminants, including metals, organic materials, biodiesels & oils, and radionuclides. The database consists of multiple interconnected indexes categorized by common and scientific plant name, contaminant name, and contaminant type. It includes photographs, hardiness zones, specific plant qualities, full citations to the original research, and other relevant information intended to aid those designing with phytoremediation in searching for potential plants which may be used to address their site's needs. The objective of the terminology section is to remove uncertainty for more inexperienced users, and to clarify terms for a more user-friendly experience. Implications of the work, including education and ease of browsing, as well as use of the database in teaching, are discussed.

  18. Creation of a common information system on the Republic of Kazakhstan radiation hazardous objects

    International Nuclear Information System (INIS)

    Kadyrzhanov, K.K.; Kuterbekov, K.A.; Lukashenko, S.N.; Morenko, V.S.; Glushchenko, V.N.

    2005-01-01

    Work on the creation of a common information system on radiation-hazardous objects in the territory of the Republic of Kazakhstan is considered; the system is intended to support radiation situation control and stewardship decision making in the conduct of environmental protection measures. The information system is being built on the basis of an up-to-date GIS platform - ArcGIS - and incorporates two databases: geographical and attributive

  19. Path Creation

    DEFF Research Database (Denmark)

    Karnøe, Peter; Garud, Raghu

    2012-01-01

    This paper employs path creation as a lens to follow the emergence of the Danish wind turbine cluster. Supplier competencies, regulations, user preferences and a market for wind power did not pre-exist; all had to emerge in a transformative manner involving multiple actors and artefacts. Competencies emerged through processes and mechanisms such as co-creation that implicated multiple learning processes. The process was not an orderly linear one, as emergent contingencies influenced the learning processes. An implication is that public policy to catalyse clusters cannot be based

  20. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 5

    International Nuclear Information System (INIS)

    2003-05-01

    The document consists of two parts: Overview and Country Waste Profile Reports for Reporting Year 2000. The first section contains overview reports that provide assessments of the achievements and shortcomings of the Net Enabled Waste Management Database (NEWMDB) during the first two data collection cycles (July 2001 to March 2002 and July 2002 to February 2003). The second part of the report includes a summary and compilation of waste management data submitted by Agency Member States in both the first and second data collection cycles

  1. Creationism in Europe

    DEFF Research Database (Denmark)

    For decades, the creationist movement was primarily situated in the United States. Then, in the 1970s, American creationists found their ideas welcomed abroad, first in Australia and New Zealand, then in Korea, India, South Africa, Brazil, and elsewhere—including Europe, where creationism plays … the teaching of creationism as a scientific discipline on an equal footing with the theory of evolution." Creationism in Europe offers a discerning introduction to the cultural history of modern Europe, the variety of worldviews in Europe, and the interplay of science and religion in a global context…

  2. Organizing for creativity, quality and speed in product creation processes

    NARCIS (Netherlands)

    Eijnatten, van F.M.; Simonse, W.L.

    1999-01-01

    Current research in industrial engineering and management sciences shows that organizational architectures are of critical importance for a better performance of product creation processes in terms of creativity, quality and speed. For many companies, streamlining those processes - including

  3. Databases for rRNA gene profiling of microbial communities

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, Matthew

    2013-07-02

    The present invention relates to methods for performing surveys of the genetic diversity of a population. The invention also relates to methods for performing genetic analyses of a population. The invention further relates to methods for the creation of databases comprising the survey information and the databases created by these methods. The invention also relates to methods for analyzing the information to correlate the presence of nucleic acid markers with desired parameters in a sample. These methods have application in the fields of geochemical exploration, agriculture, bioremediation, environmental analysis, clinical microbiology, forensic science and medicine.

  4. Dismantling the Co-creation Unicorn : Investigating the "How" in Inter-firm Collaboration

    OpenAIRE

    Skog, Daniel

    2012-01-01

    In order to face the challenges derived from an increasingly disruptive technological environment, firms often engage in collaborative arrangements with other firms. While it is argued that inter-firm networks can serve as a way to catalyze innovation, to manage risks involved in R&D and to enable the creation of new value through co-creation, the causes and reasons for inter-firm collaboration are well-known. However, little effort has been focused on critically examining the challenges that ...

  5. Content independence in multimedia databases

    NARCIS (Netherlands)

    A.P. de Vries (Arjen)

    2001-01-01

    A database management system is a general-purpose software system that facilitates the processes of defining, constructing, and manipulating databases for various applications. This article investigates the role of data management in multimedia digital libraries, and its implications for

  6. Database foundation for the configuration management of the CERN accelerator controls systems

    International Nuclear Information System (INIS)

    Zaharieva, Z.; Martin Marquez, M.; Peryt, M.

    2012-01-01

    The Controls Configuration Database (CCDB) and its interfaces have been developed over the last 25 years to become today the basis for the configuration management of the control system for all accelerators at CERN. The CCDB contains data for all configuration items and their relationships, required for the correct functioning of the control system. The configuration items are quite heterogeneous, covering different areas of the control system - ranging from 3000 front-end computers and 75000 software devices allowing remote control of the accelerators, to valid states of the accelerators' timing system. The article describes the different areas of the CCDB, their inter-dependencies and the challenges of establishing the data model for such a diverse configuration management database serving a multitude of clients. The CCDB tracks the life of the configuration items by allowing their clear identification, triggering change management processes, and providing status accounting and audits. This required the development and implementation of a combination of tailored processes and tools. The control system is a data-driven one - the data stored in the CCDB is extracted and propagated to the controls hardware in order to configure it remotely. Therefore special attention is paid to data security and data integrity, as an incorrectly configured item can have a direct impact on the operation of the accelerators. (authors)

  7. [Research and development of medical case database: a novel medical case information system integrating with biospecimen management].

    Science.gov (United States)

    Pan, Shiyang; Mu, Yuan; Wang, Hong; Wang, Tong; Huang, Peijun; Ma, Jianfeng; Jiang, Li; Zhang, Jie; Gu, Bing; Yi, Lujiang

    2010-04-01

    To meet the needs of managing medical case information and biospecimens simultaneously, we developed a novel medical case information system integrated with biospecimen management. The database, established with MS SQL Server 2000, covered basic information, clinical diagnosis, imaging diagnosis, pathological diagnosis and clinical treatment of the patient; physicochemical properties, inventory management and laboratory analysis of biospecimens; and user logs and data maintenance. The client application, developed with Visual C++ 6.0 and based on a Client/Server model, was used to implement medical case and biospecimen management. The system can perform input, browsing, inquiry and summary of cases and related biospecimen information, and can automatically synthesize case records based on the database. Management not only of long-term follow-up on individuals, but also of grouped cases organized according to the aims of research, can be achieved by the system. The system can improve the efficiency and quality of clinical research while biospecimens are used in a coordinated way. It realizes synthesized and dynamic management of medical cases and biospecimens, and may be considered a new management platform.

  8. Value-driven ERM: making ERM an engine for simultaneous value creation and value protection.

    Science.gov (United States)

    Celona, John; Driver, Jeffrey; Hall, Edward

    2011-01-01

    Enterprise risk management (ERM) began as an effort to integrate the historically disparate silos of risk management in organizations. More recently, as recognition has grown of the need to cover the upside risks in value creation (financial and otherwise), organizations and practitioners have been searching for the means to do this. Existing tools such as heat maps and risk registers are not adequate for this task. Instead, a conceptually new value-driven framework is needed to realize the promise of enterprise-wide coverage of all risks, for both value protection and value creation. The methodology of decision analysis provides the means of capturing systemic, correlated, and value-creation risks on the same basis as value protection risks and has been integrated into the value-driven approach to ERM described in this article. Stanford Hospital and Clinics Risk Consulting and Strategic Decisions Group have been working to apply this value-driven ERM at Stanford University Medical Center. © 2011 American Society for Healthcare Risk Management of the American Hospital Association.

  9. National information network and database system of hazardous waste management in China

    Energy Technology Data Exchange (ETDEWEB)

    Ma Hongchang [National Environmental Protection Agency, Beijing (China)]

    1996-12-31

    Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  10. Knowledge creation and transfer among postgraduate students

    Directory of Open Access Journals (Sweden)

    Kreeson Naicker

    2014-08-01

    Full Text Available Background: The skill shortages, hyper-competitive economic environments and untapped economies have created a great deal of focus on knowledge. Thus, continuously creating and transferring knowledge is critical for every organisation. Objectives: This article reports on an exploratory study undertaken to ascertain how knowledge is created and transferred amongst post-graduate (PG) students, using the knowledge (socialisation, externalisation, combination, internalisation [SECI]) spiral model. Method: After reviewing relevant literature, a personally administered standardised questionnaire was used to collect data from a convenience sample of PG students in the School of Management, IT and Governance at the University of KwaZulu-Natal, South Africa. The data was analysed to determine if it fit the model based on the four modes of knowledge conversion. Results: Although the School of Management, IT and Governance has mechanisms in place to facilitate knowledge creation and transfer, it nevertheless tends to focus on the four modes of knowledge conversion to varying degrees. Conclusion: The study confirmed that PG students utilise the ‘socialisation’ and ‘externalisation’ modes of knowledge conversion comprehensively; ‘internalisation’ plays a significant role in their knowledge creation and transfer activities and whilst ‘combination’ is utilised to a lesser extent, it still plays a role in PG students’ knowledge creation and transfer activities. PG students also have ‘space’ that allows them to bring hunches, thoughts, notions, intuition or tacit knowledge into reality. Trust and dedication are common amongst PG students. With socialisation and externalisation so high, PG students are aware of each other’s capabilities and competencies, and trust each other enough to share knowledge.

  11. Advanced information technology: Building stronger databases

    Energy Technology Data Exchange (ETDEWEB)

    Price, D. [Lawrence Livermore National Lab., CA (United States)]

    1994-12-01

    This paper discusses the attributes of the Advanced Information Technology (AIT) tool set, a database application builder designed at the Lawrence Livermore National Laboratory. AIT consists of a C library and several utilities that provide referential integrity across a database, interactive menu and field level help, and a code generator for building tightly controlled data entry support. AIT also provides for dynamic menu trees, report generation support, and creation of user groups. Composition of the library and utilities is discussed, along with relative strengths and weaknesses. In addition, an instantiation of the AIT tool set is presented using a specific application. Conclusions about the future and value of the tool set are then drawn based on the use of the tool set with that specific application.
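AIT's referential-integrity support is implemented as a C library, but the behaviour such a layer guarantees can be illustrated with a short Python/SQLite sketch (table names invented for illustration, not AIT's API): a child row that references a missing parent is rejected at insert time.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforcement is off by default in SQLite
con.executescript("""
CREATE TABLE user_group (gid INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE app_user (
    uid  INTEGER PRIMARY KEY,
    name TEXT,
    gid  INTEGER NOT NULL REFERENCES user_group(gid)
);
""")
con.execute("INSERT INTO user_group VALUES (1, 'operators')")
con.execute("INSERT INTO app_user VALUES (1, 'alice', 1)")   # accepted: group 1 exists

try:
    con.execute("INSERT INTO app_user VALUES (2, 'bob', 99)")  # no such group
except sqlite3.IntegrityError as e:
    print("rejected:", e)  # rejected: FOREIGN KEY constraint failed
```

The point is only the pattern: every reference across tables is checked against the parent table before the write is accepted.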

  12. Curiosity and Its Role in Cross-Cultural Knowledge Creation

    Science.gov (United States)

    Mikhaylov, Natalie S.

    2016-01-01

    This paper explores the role of curiosity in promoting cross-cultural knowledge creation and competence development. It is based on a study with four international higher educational institutions, all of which offer management and business education for local and international students. The reality of multicultural and intercultural relationships…

  13. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    Directory of Open Access Journals (Sweden)

    Errol A. Blake

    2007-12-01

    Full Text Available Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that traditional database security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper, a compilation of different journals, articles and classroom discussions, will focus on unifying the process of securing data or information whether it is in use, in storage or being transmitted. Promoting a change in database curriculum development trends may also play a role in helping secure databases. This paper will take the approach that a conscientious effort to unify the database security process - which includes the database management system (DBMS) selection process, following regulatory compliance, analyzing and learning from the mistakes of others, implementing networking security technologies, and securing the database - may prevent database breaches.

  14. Experiences in the creation of an electromyography database to help hand amputated persons.

    Science.gov (United States)

    Atzori, Manfredo; Gijsberts, Arjan; Heynen, Simone; Hager, Anne-Gabrielle Mittaz; Castellini, Claudio; Caputo, Barbara; Müller, Henning

    2012-01-01

    Currently, trans-radial amputees can only perform a few simple movements with prosthetic hands. This is mainly due to low control capabilities and the long training time that is required to learn controlling them with surface electromyography (sEMG). This is in contrast with recent advances in mechatronics, thanks to which mechanical hands have multiple degrees of freedom and in some cases force control. To help improve the situation, we are building the NinaPro (Non-Invasive Adaptive Prosthetics) database, a database of about 50 hand and wrist movements recorded from several healthy and currently very few amputated persons that will help the community to test and improve sEMG-based natural control systems for prosthetic hands. In this paper we describe the experimental experiences and practical aspects related to the data acquisition.

  15. Chemical Education: A Tool for Wealth Creation from Waste ...

    African Journals Online (AJOL)

    This paper focuses on exposing the indispensable role of chemical education in wealth creation from waste. Every settlement of people has one type of waste or the other to dispose of. The challenge of waste management has in recent times occupied researchers such that innovations are geared towards reducing wastes that ...

  16. Efficacy of a Template Creation Approach for Performance Improvement

    Science.gov (United States)

    Lyons, Paul R.

    2011-01-01

    This article presents the training and performance improvement approach, performance templates (P-T), and provides empirical evidence to support the efficacy of P-T. This approach involves a partnership among managers, trainers, and employees in the creation, use, and improvement of guides to affect the performance of critical tasks in the…

  17. Flash Foods' Job Creation and Petroleum Independence with E85

    Energy Technology Data Exchange (ETDEWEB)

    Walk, Steve [Protec Fuel Management LLC, Boca Raton, FL (United States)]

    2016-11-21

    Protec Fuel Management's project objectives are to help design, build, provide, promote and supply biofuels for greater energy independence, national security and domestic economic growth through job creation, infrastructure projects and supply chain business stimulus.

  18. Big data reduction framework for value creation in sustainable enterprises

    OpenAIRE

    Rehman, Muhammad Habib ur; Chang, Victor; Batool, Aisha; Teh, Ying Wah

    2016-01-01

    Value creation is a major sustainability factor for enterprises, in addition to profit maximization and revenue generation. Modern enterprises collect big data from various inbound and outbound data sources. The inbound data sources handle data generated from the results of business operations, such as manufacturing, supply chain management, marketing, and human resource management, among others. Outbound data sources handle customer-generated data which are acquired directly or indirectly fr...

  19. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    … clusters with differential expression during the differentiation toward megakaryocytes were identified. Conclusions: TRAM is designed to create, and statistically analyze, quantitative transcriptome maps based on gene expression data from multiple sources. The release includes a FileMaker Pro database management runtime application and is freely available at http://apollo11.isto.unibo.it/software/, along with preconfigured implementations for mapping of human, mouse and zebrafish transcriptomes.

  20. Mapping Heritage: Geospatial Online Databases of Historic Roads. The Case of the N-340 Roadway Corridor on the Spanish Mediterranean

    Directory of Open Access Journals (Sweden)

    Mar Loren-Méndez

    2018-04-01

    Full Text Available The study has developed an online geospatial database for assessing the complexity of roadway heritage, overcoming the limitations of traditional heritage catalogues and databases: the itemization of heritage assets and the rigidity of the database structure. Reflecting the current openness in the field of heritage studies, the research proposes an interdisciplinary approach that reframes heritage databases, both conceptually and technologically. Territorial scale is key for heritage interpretation, the complex characteristics of each type of heritage, and social appropriation. The system is based on an open-source content-management system and framework called ProcessWire, allowing flexibility in the definition of data fields and serving as an internal working tool for research collaboration. Accessibility, flexibility, and ease of use do not preclude rigor: the database works in conjunction with a GIS (Geographic Information System support system and is complemented by a bibliographical archive. A hierarchical multiscalar heritage characterization has been implemented in order to include the different territorial scales and to facilitate the creation of itineraries. Having attained the main goals of conceptual heritage coherence, accessibility, and rigor, the database should strive for broader capacity to integrate GIS information and stimulate public participation, a step toward controlled crowdsourcing and collaborative heritage characterization.

  1. Short-term Outcomes After Open and Laparoscopic Colostomy Creation.

    Science.gov (United States)

    Ivatury, Srinivas Joga; Bostock Rosenzweig, Ian C; Holubar, Stefan D

    2016-06-01

    Colostomy creation is a common procedure performed in colon and rectal surgery. Outcomes by technique have not been well studied. This study evaluated outcomes related to open versus laparoscopic colostomy creation. This was a retrospective review of patients undergoing colostomy creation using univariate and multivariate propensity score analyses. Hospitals participating in the American College of Surgeons National Surgical Quality Improvement Program database were included. Data on patients were obtained from the American College of Surgeons National Surgical Quality Improvement Program 2005-2011 Participant Use Data Files. We measured 30-day mortality, 30-day complications, and predictors of 30-day mortality. A total of 2179 subjects were in the open group and 1132 in the laparoscopic group. The open group had increased age (open, 64 years vs laparoscopic, 60 years), admission from facility (17.0% vs 14.9%), and disseminated cancer (26.1% vs 21.4%). All were statistically significant. The open group had a significantly higher percentage of emergency operations (24.9% vs 7.9%). Operative time was statistically different (81 vs 86 minutes). Thirty-day mortality was significantly higher in the open group (8.7% vs 3.5%), as was any 30-day complication (25.4% vs 17.0%). Propensity-matching analysis on elective patients only revealed that postoperative length of stay and rate of any wound complication were statistically higher in the open group. Multivariate analysis for mortality was performed on the full, elective, and propensity-matched cohorts; age >65 years and dependent functional status were associated with an increased risk of mortality in all of the models. This study has the potential for selection bias and limited generalizability. Colostomy creation at American College of Surgeons National Surgical Quality Improvement Program hospitals is more commonly performed open rather than laparoscopically. Patient age >65 years and dependent functional status are
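The abstract does not give the details of the propensity-matching procedure; as a sketch of the general technique only, here is greedy 1:1 nearest-neighbour matching on precomputed propensity scores with a caliper (all identifiers, scores, and the caliper width are invented, not taken from the study).

```python
# Treated (open) and control (laparoscopic) subjects with hypothetical
# precomputed propensity scores (probability of receiving open surgery).
open_scores = {"o1": 0.82, "o2": 0.41, "o3": 0.65}
lap_scores  = {"l1": 0.80, "l2": 0.39, "l3": 0.20}
CALIPER = 0.05  # maximum allowed score difference for a valid match

def greedy_match(treated, controls, caliper):
    """Greedy 1:1 nearest-neighbour matching, highest treated score first."""
    matches, free = {}, dict(controls)
    for t, ts in sorted(treated.items(), key=lambda kv: kv[1], reverse=True):
        if not free:
            break
        # nearest still-unmatched control by absolute score difference
        c, cs = min(free.items(), key=lambda kv: abs(kv[1] - ts))
        if abs(cs - ts) <= caliper:
            matches[t] = c
            del free[c]          # each control is used at most once
    return matches

print(greedy_match(open_scores, lap_scores, CALIPER))  # {'o1': 'l1', 'o2': 'l2'}
```

Subject o3 (score 0.65) goes unmatched because no remaining control falls within the caliper, which is exactly how caliper matching trades sample size for comparability.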

  2. De-MA: a web Database for electron Microprobe Analyses to assist EMP lab manager and users

    Science.gov (United States)

    Allaz, J. M.

    2012-12-01

    Lab managers and users of electron microprobe (EMP) facilities require comprehensive, yet flexible documentation structures, as well as an efficient scheduling mechanism. A single on-line database system for managing reservations, and providing information on standards, quantitative and qualitative setups (element mapping, etc.), and X-ray data has been developed for this purpose. This system is particularly useful in multi-user facilities where experience ranges from beginners to the highly experienced. New users and occasional facility users will find these tools extremely useful in developing and maintaining high quality, reproducible, and efficient analyses. This user-friendly database is available through the web, and uses MySQL as a database and PHP/HTML as script language (dynamic website). The database includes several tables for standards information, X-ray lines, X-ray element mapping, PHA, element setups, and agenda. It is configurable for up to five different EMPs in a single lab, each of them having up to five spectrometers and as many diffraction crystals as required. The installation should be done on a web server supporting PHP/MySQL, although installation on a personal computer is possible using third-party freeware to create a local Apache server, and to enable PHP/MySQL. Since it is web-based, any user outside the EMP lab can access this database anytime through any web browser and on any operating system. The access can be secured using a general password protection (e.g. htaccess). The web interface consists of 6 main menus. (1) "Standards" lists standards defined in the database, and displays detailed information on each (e.g. material type, name, reference, comments, and analyses). Images such as EDS spectra or BSE can be associated with a standard. (2) "Analyses" lists typical setups to use for quantitative analyses, allows calculation of mineral composition based on a mineral formula, or calculation of mineral formula based on a fixed
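The "calculation of mineral composition based on a mineral formula" mentioned above reduces to molar-mass arithmetic; a minimal sketch, using rounded reference atomic masses and forsterite (Mg2SiO4) as an arbitrary example (De-MA itself is PHP/MySQL; this is only the underlying calculation):

```python
# Rounded reference atomic masses (g/mol); a real tool would carry more digits.
ATOMIC_MASS = {"Mg": 24.305, "Si": 28.086, "O": 15.999}
formula = {"Mg": 2, "Si": 1, "O": 4}  # forsterite, Mg2SiO4

# Molar mass is the stoichiometry-weighted sum of atomic masses.
molar_mass = sum(ATOMIC_MASS[el] * n for el, n in formula.items())

# Element weight percent: each element's mass contribution over the total.
wt_percent = {el: 100 * ATOMIC_MASS[el] * n / molar_mass
              for el, n in formula.items()}

print(round(molar_mass, 3))        # 140.692
print(round(wt_percent["Mg"], 2))  # 34.55
```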

  3. Development of intelligent database program for PSI/ISI data management of nuclear power plant

    International Nuclear Information System (INIS)

    Um, Byong Guk; Park, Un Su; Park, Ik Keun; Park, Yun Won; Kang, Suk Chul

    1998-01-01

    An intelligent database program, fully compatible with Windows 95, has been developed for the construction of a total support system and the effective management of pre-/in-service inspection data. Using the database program, the analysis and multi-dimensional evaluation of defects detected during PSI/ISI in the pipes and pressure vessels of nuclear power plants can be executed. It can also be used to investigate repetitively inspected NDE data and the contents of treatment, and to offer fundamental data for the application of evaluation data related to fracture mechanics analysis (FMA). Furthermore, the PSI/ISI database loads and material properties can be utilized to secure a higher degree of safety, integrity, reliability, and life prediction of components and systems in nuclear power plants.

  4. MAP SERVICES FOR MANAGEMENT OF HUNTING ORGANIZATIONS (THE CASE OF HUNTING ORGANIZATION “MEDVEDICA”)

    Directory of Open Access Journals (Sweden)

    S. A. Zaichenko

    2013-01-01

    Full Text Available The current state of map support for the hunting management system requires updating the information database and creating new schemes of hunting organization. Here it is beneficial to use satellite imagery data both for mapping and for important environmental research. Presenting the results in the form of Internet web services provides broad advantages over the paper version of the maps.

  5. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    International Nuclear Information System (INIS)

    Shao, Weber; Kupelian, Patrick A; Wang, Jason; Low, Daniel A; Ruan, Dan

    2014-01-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.

  6. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    Science.gov (United States)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing the DICOM-RT structure sets in a database management system, in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
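The secondary geometric calculations that the paradigm pushes into the database (for example, a PostGIS `ST_Area` call over a stored contour polygon) amount to standard computational geometry. A database-free sketch of the same computation via the shoelace formula, using a made-up rectangular contour (a real DICOM-RT contour would be an arbitrary closed point list in mm):

```python
# A closed axial contour as (x, y) vertices in mm, analogous to one
# DICOM-RT structure-set contour; in the paper's paradigm this would be
# stored as a PostGIS POLYGON and the area obtained with ST_Area() in SQL.
contour = [(0.0, 0.0), (40.0, 0.0), (40.0, 30.0), (0.0, 30.0)]

def polygon_area(pts):
    """Shoelace formula: unsigned area of a simple (non-self-intersecting) polygon."""
    s = 0.0
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]  # wrap around to close the ring
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

print(polygon_area(contour))  # 1200.0  (mm^2, for the 40 x 30 rectangle)
```

Keeping contours as geometry types in the database means such per-slice results can be computed server-side instead of shipping full contour records to the client.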

  7. Content Based Retrieval Database Management System with Support for Similarity Searching and Query Refinement

    Science.gov (United States)

    2002-01-01

    to the OODBMS approach. The ORDBMS approach produced such research prototypes as Postgres [155] and Starburst [67], and commercial products such as...

  8. Schwinger pair creation of Kaluza-Klein particles: Pair creation without tunneling

    International Nuclear Information System (INIS)

    Friedmann, Tamar; Verlinde, Herman

    2005-01-01

    We study Schwinger pair creation of charged Kaluza-Klein (KK) particles from a static KK electric field. We find that the gravitational backreaction of the electric field on the geometry--which is incorporated via the electric KK-Melvin solution--prevents the electrostatic potential from overcoming the rest mass of the KK particles, thus impeding the tunneling mechanism which is often thought of as responsible for the pair creation. However, we find that pair creation still occurs with a finite rate formally similar to the classic Schwinger result, but via an apparently different mechanism, involving a combination of the Unruh effect and vacuum polarization due to the E-field

  9. Value Creation in Digital Service Platforms

    DEFF Research Database (Denmark)

    Ghazawneh, Ahmad; Mansour, Osama

    2017-01-01

    Value creation is increasingly relevant for owners of digital service platforms (DSPs). These owners have two vital goals: increase their service base and sustain their service offerers. A key element in continuously accommodating these goals is value creation. While the literature on DSPs is growing, there is a paucity of knowledge on the value creation process in these platforms. Drawing on a qualitative study of Uber drivers in Denmark and Sweden, we synthesize Schumpeter’s theory of value creation to develop an understanding of the value creation process in DSPs from the perspective of service offerers. As such, our study proposes and contributes a value creation framework for DSPs that identifies 8 value sources and highlights resource combination and exchange in the process of value creation.

  10. Development of a database system for the management of non-treated radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso, E-mail: ajp@cdtn.br, E-mail: cbf@cdtn.br, E-mail: vc@cdtn.br, E-mail: pos@cdtn.br, E-mail: seless@cdtn.br, E-mail: hauczmj@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. Information about the waste is registered, and the data must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of the solid and liquid radioactive wastes stored at CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate retrieval of the waste history and location and of the contents of the waste packages. The database will also be useful for grouping waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to changes of operators will be avoided. (author)
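The grouping query the abstract mentions, collecting packages with similar characteristics to pick a common treatment, is a plain relational aggregation. The actual CDTN Access tables are not described, so the schema, field names, and sample packages below are illustrative assumptions; SQLite stands in for Access.

```python
import sqlite3

# Hypothetical waste-package table; all names and values are made up
# for illustration, since the real CDTN schema is not public.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE waste_package (
    package_id     TEXT PRIMARY KEY,
    physical_state TEXT,      -- 'solid' or 'liquid'
    radionuclide   TEXT,
    activity_bq    REAL,
    location       TEXT)""")
conn.executemany("INSERT INTO waste_package VALUES (?,?,?,?,?)", [
    ("P-001", "solid",  "Cs-137", 4.0e6, "DRNT shelf A"),
    ("P-002", "solid",  "Cs-137", 1.5e6, "DRNT shelf A"),
    ("P-003", "liquid", "H-3",    9.0e5, "DRNT tank 2"),
])

# Group packages sharing physical state and radionuclide, so each group
# can be assigned one treatment route and a total inventory activity.
groups = conn.execute("""
    SELECT physical_state, radionuclide, COUNT(*), SUM(activity_bq)
    FROM waste_package
    GROUP BY physical_state, radionuclide
    ORDER BY physical_state, radionuclide
""").fetchall()
```

The same `GROUP BY` shape also answers the inventory and location reports the system is meant to produce.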

  11. Development of a database system for the management of non-treated radioactive waste

    International Nuclear Information System (INIS)

    Pinto, Antônio Juscelino; Freire, Carolina Braccini; Cuccia, Valeria; Santos, Paulo de Oliveira; Seles, Sandro Rogério Novaes; Haucz, Maria Judite Afonso

    2017-01-01

    The radioactive waste produced by the research laboratories at CDTN/CNEN, Belo Horizonte, is stored in the Non-Treated Radwaste Storage (DRNT) until treatment is performed. Information about the waste is registered, and the data must be easily retrievable and useful for all the staff involved. Nevertheless, it has been kept in an old Paradox database, which is now becoming outdated. Thus, a new Database System for the Non-treated Waste will be developed using the Access® platform, improving the control and management of the solid and liquid radioactive wastes stored at CDTN. The Database System consists of relational tables, forms and reports, preserving all available information. It must ensure control of the waste records and inventory. In addition, it will be possible to carry out queries and reports to facilitate retrieval of the waste history and location and of the contents of the waste packages. The database will also be useful for grouping waste with similar characteristics to identify the best type of treatment. The routine problems that may occur due to changes of operators will be avoided. (author)

  12. CEO Sites Mission Management System (SMMS)

    Science.gov (United States)

    Trenchard, Mike

    2014-01-01

    Late in fiscal year 2011, the Crew Earth Observations (CEO) team was tasked to upgrade its science site database management tool, which at the time was integrated with the Automated Mission Planning System (AMPS) originally developed for Earth Observations mission planning in the 1980s. Although AMPS had been adapted and was reliably used by CEO for International Space Station (ISS) payload operations support, the database structure was dated, and the compiler required for modifications would not be supported in the Windows 7 64-bit operating system scheduled for implementation the following year. The Sites Mission Management System (SMMS) is now the tool used by CEO to manage a heritage Structured Query Language (SQL) database of more than 2,000 records for Earth science sites. SMMS is a carefully designed and crafted in-house software package with complete and detailed help files available for the user and meticulous internal documentation for future modifications. It was delivered in February 2012 for test and evaluation. Following acceptance, it was implemented for CEO mission operations support in April 2012. The database spans the period from the earliest systematic requests for astronaut photography during the shuttle era to current ISS mission support of the CEO science payload. Besides logging basic image information (site names, locations, broad application categories, and mission requests), the upgraded database management tool now tracks dates of creation, modification, and activation; imagery acquired in response to requests; the status and location of ancillary site information; and affiliations with studies, their sponsors, and collaborators. SMMS was designed to facilitate overall mission planning in terms of site selection and activation and provide the necessary site parameters for the Satellite Tool Kit (STK) Integrated Message Production List Editor (SIMPLE), which is used by CEO operations to perform daily ISS mission planning. The CEO team

  13. Federal databases

    International Nuclear Information System (INIS)

    Welch, M.J.; Welles, B.W.

    1988-01-01

    Accident statistics on all modes of transportation are available as risk assessment analytical tools through several federal agencies. This paper reports on an examination of the accident databases by personal contact with the federal staff responsible for administration of the database programs. This activity, sponsored by the Department of Energy through Sandia National Laboratories, is an overview of the national accident data on highway, rail, air, and marine shipping. For each mode, the definition or reporting requirements of an accident are determined and the method of entering the accident data into the database is established. Availability of the database to others, ease of access, costs, and whom to contact were prime questions put to each of the database program managers. Additionally, how each agency uses the accident data was of major interest.

  14. JDD, Inc. Database

    Science.gov (United States)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). The company provides facilities support for Fort Riley in Fort Riley, Kansas, and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD, Inc. in the Logistics and Technical Information Division (LTID) at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and contract management of these various services for the center. As a safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues (Environmental Protection Agency issues, workers' compensation, safety and health training) involving job safety. My summer assignment was not considered "groundbreaking research" like what many other summer interns have done in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using Microsoft Excel to classify and categorize data pertaining to the numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted only of data from the training certification courses (a training field index, the employees who were present at these training courses, and who was absent). Once I completed this phase of the database, I decided to expand it and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day-to-day operations and been adding the

  15. Kajian Unified Theory of Acceptance and Use of Technology Dalam Penggunaan Open Source Software Database Management System

    Directory of Open Access Journals (Sweden)

    Michael Sonny

    2016-06-01

    Full Text Available The development of computer software today is proceeding at a remarkable pace, and not only for software under particular licenses: open source software is developing as well. This development is of course very encouraging for computer users, particularly in education and among students, because users have several application options to choose from. Open source software also offers products that are generally free, come with their source code, and grant the freedom to modify and further develop them. Research on open source applications naturally covers a wide variety, such as programming tools (PHP, Gambas), Database Management Systems (MySQL, SQLite), and browsers (Mozilla, Firefox, Opera). This study examines the acceptance of DBMS (Database Management System) applications such as MySQL and SQLite using a model developed by Venkatesh (2003): UTAUT (Unified Theory of Acceptance and Use of Technology). Certain factors also influence the activity of learning these open source applications; such a factor, called a moderating factor, can influence effectiveness and efficiency. The results obtained are thus expected to smooth the learning of these open source applications.   Keywords: open source, Database Management System (DBMS), moderating

  16. The database system for the management of technical documentations of PWR fuel design project using CD-ROM

    International Nuclear Information System (INIS)

    Park, Bong Sik; Lee, Won Jae; Ryu, Jae Kwon; Jo, In Hang; Chang, Jong Hwa.

    1996-12-01

    In this report, the database system developed for the management of the technical documentation of the PWR fuel design project using CD-ROM (compact disk - read only memory) is described. The database system, KIRDOCM (KAERI Initial and Reload Fuel project technical documentation management), is developed and installed on a PC using Visual FoxPro 3.0. Descriptions focus on the user interface of the KIRDOCM. The introduction addresses the background and concept of the development. The main chapter describes the user requirements, the analysis of the computing environment, the design of the KIRDOCM, the implementation of the KIRDOCM, the user's manual of the KIRDOCM, and the maintenance of the KIRDOCM for future improvement. The implementation of the KIRDOCM system provides efficiency in the management, maintenance and indexing of the technical documents. It is also expected that KIRDOCM may be a good reference in applying Visual FoxPro to the development of an information management system. (author). 2 tabs., 13 figs., 8 refs

  17. E3 Staff Database

    Data.gov (United States)

    US Agency for International Development — The E3 Staff database is maintained by the E3 PDMS (Professional Development & Management Services) office. The database is MySQL. It is manually updated by E3 staff as...

  18. Creating databases for biological information: an introduction.

    Science.gov (United States)

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
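The flat-file versus relational trade-off the unit reviews can be made concrete with a toy strain catalog: the same lookup done as a linear scan over records, and then through an indexed relational table. The catalog contents and column names below are invented for illustration; SQLite represents the relational option.

```python
import sqlite3

# Toy strain catalog from a hypothetical insertional mutagenesis project.
records = [("strain-%04d" % i, "chr%d" % (i % 5 + 1)) for i in range(1000)]

# Flat-file style: every lookup is a linear scan over all records.
flat_hit = next(r for r in records if r[0] == "strain-0421")

# Relational style: the DBMS maintains an index, so the same lookup
# avoids the full scan and stays fast as the catalog grows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE strain (name TEXT, chromosome TEXT)")
conn.executemany("INSERT INTO strain VALUES (?, ?)", records)
conn.execute("CREATE INDEX idx_strain_name ON strain(name)")
db_hit = conn.execute(
    "SELECT name, chromosome FROM strain WHERE name = ?",
    ("strain-0421",)).fetchone()
```

For a thousand records the difference is invisible; the point of the unit is that the relational form keeps working when the flat file no longer does.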

  19. Management of the database originated from individual and environment monitoring carried out in the UNIFESP/HSP complex, SP, Brazil

    International Nuclear Information System (INIS)

    Medeiros, Regina Bitelli; Daros, Kellen Adriana Curci; Almeida, Natalia Correia de; Pires, Silvio Ricardo; Jorge, Luiz Tadeu

    2005-01-01

    The Radiological Protection Sector of the Sao Paulo Hospital/Federal University of Sao Paulo, SP, Brazil manages the records of 457 dosemeters. Since users must be informed of their absorbed doses monthly, and individual records must be kept until the individual reaches 75 years of age and for at least 30 years after the end of the individual's occupational activity, it became necessary to construct a database and a computerized control to manage the accumulated doses. Between 1991 and 1999 this control was implemented by means of a relational database (Cobol 85 - Operating System GCOS 7 (ABC Telematic Bull)). After this period, when the company responsible for dosimetry began to provide computerized results, the data were stored in a Paradox database (Borland). In 2004 the databases were integrated and a third database, developed in Oracle (IBM), was created, together with a system that allows institutional Intranet users to consult their annually accumulated doses and the total effective dose accumulated during their working life.
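The annual accumulated-dose consultation described above reduces to a simple aggregation over monthly dose records. The actual Oracle schema at UNIFESP/HSP is not described, so the table, column names, and dose values below are illustrative assumptions; SQLite stands in for Oracle.

```python
import sqlite3

# Hypothetical monthly dose-record table; names and values are made up.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dose_record (
    badge_id TEXT, year INTEGER, month INTEGER, dose_msv REAL)""")
conn.executemany("INSERT INTO dose_record VALUES (?,?,?,?)", [
    ("U-0457", 2004, 1, 0.12),
    ("U-0457", 2004, 2, 0.08),
    ("U-0457", 2004, 3, 0.15),
])

# The annual accumulated dose a user would consult over the Intranet.
annual = conn.execute(
    "SELECT SUM(dose_msv) FROM dose_record WHERE badge_id=? AND year=?",
    ("U-0457", 2004)).fetchone()[0]
```

The working-life total effective dose is the same query without the `year` filter, which is why long retention of the monthly records matters.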

  20. Database reliability engineering designing and operating resilient database systems

    CERN Document Server

    Campbell, Laine

    2018-01-01

    The infrastructure-as-code revolution in IT is also affecting database administration. With this practical book, developers, system administrators, and junior to mid-level DBAs will learn how the modern practice of site reliability engineering applies to the craft of database architecture and operations. Authors Laine Campbell and Charity Majors provide a framework for professionals looking to join the ranks of today’s database reliability engineers (DBRE). You’ll begin by exploring core operational concepts that DBREs need to master. Then you’ll examine a wide range of database persistence options, including how to implement key technologies to provide resilient, scalable, and performant data storage and retrieval. With a firm foundation in database reliability engineering, you’ll be ready to dive into the architecture and operations of any modern database. This book covers: service-level requirements and risk management; building and evolving an architecture for operational visibility ...

  1. Value Creation in International Business

    DEFF Research Database (Denmark)

    The edited collection brings into focus the meanings, interpretations and the process of value creation in international business. Exploring value creation in the context of emerging and developed economies, Volume 2 takes the perspective of small and medium sized enterprises and examines various approaches to value creation in the process of firm internationalization. Providing theoretical and practical insights, the authors open an intellectual debate into what value is, and how it is created through the internationalization activities of firms. Value Creation in International Business is a pioneering two volume work intended to provoke theoretical and empirical development in International Business research. Moreover, it is intended as a bridge between concepts derived from general business firm-level research agendas such as value creation and business model, and internationalization...

  2. Model of Axiological Dimension Risk Management

    Directory of Open Access Journals (Sweden)

    Kulińska Ewa

    2016-01-01

    Full Text Available It was on the basis of the obtained results, which identify the key prerequisites for integrating the management of logistics processes, the management of the value creation process, and risk management, that the methodological basis for the construction of the axiological dimension of risk management (ADRM) model of logistics processes was determined. By taking into account the contribution of the individual concepts to the new research area, its essence was defined as an integrated, structured instrumentation aimed at the identification and implementation of logistics processes supporting the creation of added value, as well as the identification and elimination of risk factors disturbing the process of value creation for internal and external customers. The basis for the ADRM concept of logistics processes is the use of the potential inherent in the synergistic effects obtained by using the prerequisites for integrating the management of logistics processes, value creation and risk management as the key determinants of value creation.

  3. Solid Waste Projection Model: Database (Version 1.4)

    International Nuclear Information System (INIS)

    Blackburn, C.; Cillan, T.

    1993-09-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software and data structures utilized in developing the SWPM Version 1.4 Database. This document is intended for use by experienced database specialists and supports database maintenance, utility development, and database enhancement. Those interested in using the SWPM database should refer to the SWPM Database User's Guide. This document is available from the PNL Task M Project Manager (D. L. Stiles, 509-372-4358), the PNL Task L Project Manager (L. L. Armacost, 509-372-4304), the WHC Restoration Projects Section Manager (509-372-1443), or the WHC Waste Characterization Manager (509-372-1193)

  4. CREATION de VALEUR dans une STARTUP - MODELE et ETUDE de CAS

    OpenAIRE

    Guillaume MARCEAU; Jean-Michel SAHUT

    2014-01-01

    In this article, we propose a value creation model based on the principle of the value chain in corporate management. We particularly endeavour to show the impact of a relevant allocation of a company's resources on its profitability, by distinguis

  5. MULTI-AGENT MODEL OF SAFETY MANAGEMENT IN PLANNING PROJECTS FOR THE CREATION OF OBJECTS WITH MASS STAY OF PEOPLE

    Directory of Open Access Journals (Sweden)

    Олег Богданович ЗАЧКО

    2017-03-01

    Full Text Available In today's conditions, with the increasing scale of industrialization in Ukraine's major cities, the threat of emergency situations (ES), disasters and accidents at objects with a mass stay of people (OMSP) also increases. An inadequate level of attention to the exploitation of OMSP at all stages of the project lifecycle has tangible negative consequences. An analysis of statistics for the last 5-10 years has shown significant growth in mortality after emergencies at enterprises, which shows that in most cases the cause of these deaths is the lack of strict management consistency across the whole hierarchical management structure, that is, project-oriented management; ignorance of fire safety rules at the workplace; and the lack of automatic fire alarm, warning and extinguishing systems, especially in the regional context. Therefore, defining the concept of objects with a mass stay of people using a safety-oriented approach will allow such objects to be identified and will ultimately increase security at them. The article provides a literature analysis of the available scientific studies and develops a multi-agent safety management model for planning projects for the creation of objects with a mass stay of people.

  6. Enhanced DIII-D Data Management Through a Relational Database

    Science.gov (United States)

    Burruss, J. R.; Peng, Q.; Schachter, J.; Schissel, D. P.; Terpstra, T. B.

    2000-10-01

    A relational database is being used to serve data about DIII-D experiments. The database is optimized for queries across multiple shots, allowing for rapid data mining by SQL-literate researchers. The relational database relates different experiments and datasets, thus providing a big picture of DIII-D operations. Users are encouraged to add their own tables to the database. Summary physics quantities about DIII-D discharges are collected and stored in the database automatically. Meta-data about code runs, MDSplus usage, and visualization tool usage are collected, stored in the database, and later analyzed to improve computing. Documentation on the database may be accessed through programming languages such as C, Java, and IDL, or through ODBC compliant applications such as Excel and Access. A database-driven web page also provides a convenient means for viewing database quantities through the World Wide Web. Demonstrations will be given at the poster.
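The cross-shot data mining the abstract describes is what distinguishes this design from per-shot file access: summary physics quantities live in one table, so one SQL statement filters many discharges at once. The DIII-D schema is not given in the abstract, so the table, columns, and shot numbers below are illustrative assumptions; SQLite stands in for the actual server.

```python
import sqlite3

# Hypothetical shot-summary table of per-discharge physics quantities.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE shot_summary (
    shot  INTEGER PRIMARY KEY,
    ip_ma REAL,          -- plasma current (MA), illustrative
    bt_t  REAL,          -- toroidal field (T), illustrative
    h98   REAL)          -- confinement quality factor, illustrative
""")
conn.executemany("INSERT INTO shot_summary VALUES (?,?,?,?)", [
    (100001, 1.2, 2.1, 1.05),
    (100002, 0.8, 1.9, 0.92),
    (100003, 1.5, 2.1, 1.20),
])

# One query mines across all shots, instead of opening each shot's
# data files individually.
good_shots = [row[0] for row in conn.execute(
    "SELECT shot FROM shot_summary WHERE h98 > 1.0 AND ip_ma >= 1.0 "
    "ORDER BY shot")]
```

The same table is what ODBC clients such as Excel, or a database-driven web page, would present to non-programming users.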

  7. Value Creation Challenges in Multichannel Retail Business Models

    Directory of Open Access Journals (Sweden)

    Mika Yrjölä

    2014-08-01

    Full Text Available Purpose: The purpose of the paper is to identify and analyze the challenges of value creation in multichannel retail business models. Design/methodology/approach: With the help of semi-structured interviews with top executives from different retailing environments, this study introduces a model of value creation challenges in the context of multichannel retailing. The challenges are analyzed in terms of three retail business model elements, i.e., format, activities, and governance. Findings: Adopting a multichannel retail business model requires critical rethinking of the basic building blocks of value creation. First of all, as customers effortlessly move between multiple channels, multichannel formats can lead to a mismatch between customer and firm value. Secondly, retailers face pressures to use their activities to form integrated total offerings to customers. Thirdly, multiple channels might lead to organizational silos with conflicting goals. A careful orchestration of value creation is needed to determine the roles and incentives of the channel parties involved. Research limitations/implications: In contrast to previous business model literature, this study did not adopt a network-centric view. By embracing the boundary-spanning nature of the business model, other challenges and elements might have been discovered (e.g., challenges in managing relationships with suppliers). Practical implications: As a practical contribution, this paper has analyzed the challenges retailers face in adopting multichannel business models. Customer tendencies for showrooming behavior highlight the need for generating efficient lock-in strategies. Customized, personal offers and information are ways to increase customer value, differentiate from competition, and achieve lock-in. Originality/value: As a theoretical contribution, this paper empirically investigates value creation challenges in a specific context, lowering the level of abstraction in the mostly...

  8. Product Licenses Database Application

    CERN Document Server

    Tonkovikj, Petar

    2016-01-01

    The goal of this project is to organize and centralize the data about software tools available to CERN employees, as well as to provide a system that simplifies the license management process by providing information about the available licenses and their expiry dates. The project development process consists of two steps: modeling the products (software tools), product licenses, legal agreements and other data related to these entities in a relational database, and developing the front-end user interface so that the user can interact with the database. The result is an ASP.NET MVC web application with interactive views for displaying and managing the data in the underlying database.

  9. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Disciplined specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycles studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  10. DataBase on Demand

    International Nuclear Information System (INIS)

    Aparicio, R Gaspar; Gomez, D; Wojcik, D; Coz, I Coterillo

    2012-01-01

    At CERN a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. The Database on Demand (DBoD) service empowers users to perform certain actions that have traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, at present the open community version of MySQL and single-instance Oracle database servers. This article describes a technology approach to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.

  11. E-HRM Usage and Value Creation. Does a Facilitating Context Matter?

    NARCIS (Netherlands)

    Ruel, Hubertus Johannes Maria; van der Kaap, Harm G.

    2012-01-01

    Electronic Human Resource Management (e-HRM) is assumed to be a driving force behind HRM value creation. However, the issue remains of whether empirical evidence supports this assumption. Moreover, is the relationship straightforward and direct or is it conditional, and do contextual factors

  12. News from the Library: Looking for materials properties? Find the answer in CINDAS databases

    CERN Multimedia

    CERN Library

    2012-01-01

    Materials properties databases are a crucial source of information when doing research in Materials Science. The creation and regular updating of such databases requires identification and collection of relevant worldwide scientific and technical literature, followed by the compilation, critical evaluation, correlation and synthesis of both existing and new experimental data.   The Center for Information and Numerical Data Analysis and Synthesis (CINDAS) at Purdue University produces several databases on the properties and behaviour of materials. The databases include: - ASMD (Aerospace Structural Metals Database) which gives access to approximately 80,000 data curves on over 220 alloys used in the aerospace and other industries - the Microelectronics Packaging Materials Database (MPMD), providing data and information on the thermal, mechanical, electrical and physical properties of electronics packaging materials, and - the Thermophysical Properties of Matter Database (TPMD), covering the...

  13. The Partners in Recovery program: mental health commissioning using value co-creation.

    Science.gov (United States)

    Cheverton, Jeff; Janamian, Tina

    2016-04-18

    The Australian Government's Partners in Recovery (PIR) program established a new form of mental health intervention which required multiple sectors, services and consumers to work in a more collaborative way. Brisbane North Primary Health Network applied a value co-creation approach with partners and end users, engaging more than 100 organisations in the development of a funding submission to PIR. Engagement platforms were established and continue to provide opportunities for new co-creation experiences. Initially, seven provider agencies - later expanded to eight to include an Aboriginal and Torres Strait Islander provider organisation - worked collaboratively as a Consortium Management Committee. The co-creation development process has been part of achieving the co-created outcomes, which include new initiatives, changes to existing interventions and referral practices, and an increased understanding and awareness of end users' needs.

  14. Cooperation for Common Use of SEE Astronomical Database as a Regional Virtual Observatory in Different Scientific Projects

    Science.gov (United States)

    Pinigin, Gennady; Protsyuk, Yuri; Shulga, Alexander

    Collaborative and cooperative research between South-Eastern European (SEE) observatories has expanded in recent years, making the creation of a common database serving as a regional virtual observatory highly desirable. We present an astronomical information resource, built on the general astronomical database of the SEE countries, that provides interactive access to databases and telescopes; this resource may also be connected to the European network. A short description of the NAO database is given. The NAO holdings amount to about 90 GB, with a further 15 GB obtained from other sources. The NAO CCD instruments produce between 300 MB and 2 GB of new astronomical data per day, depending on the purposes and conditions of observation. Most observational data are stored in FITS format, and the possibility of using the VOTable format for displaying these data on the Internet is being studied. Work on developing and refining standards for storage, exchange and data processing is ongoing.
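    The record above mentions storing observations in FITS and publishing them via the VOTable format. As a hedged illustration of what a minimal VOTable document looks like, the sketch below builds one with Python's standard-library XML tools; the column names and coordinate values are invented, and a real archive would derive them from the FITS headers (e.g. with astropy) rather than hard-code them.

    ```python
    import xml.etree.ElementTree as ET

    def rows_to_votable(names, dtypes, rows):
        """Build a minimal VOTable document for a small catalogue.

        `names`, `dtypes` and `rows` are hypothetical inputs chosen for this
        sketch; a real service would take them from FITS table metadata."""
        votable = ET.Element("VOTABLE", version="1.3")
        resource = ET.SubElement(votable, "RESOURCE")
        table = ET.SubElement(resource, "TABLE")
        # One FIELD element per column declares the column name and datatype.
        for name, dtype in zip(names, dtypes):
            ET.SubElement(table, "FIELD", name=name, datatype=dtype)
        # Row data goes into DATA/TABLEDATA as TR/TD elements.
        tabledata = ET.SubElement(ET.SubElement(table, "DATA"), "TABLEDATA")
        for row in rows:
            tr = ET.SubElement(tabledata, "TR")
            for value in row:
                ET.SubElement(tr, "TD").text = str(value)
        return ET.tostring(votable, encoding="unicode")

    # Two invented catalogue positions (right ascension, declination in degrees).
    xml_doc = rows_to_votable(
        ["ra", "dec"], ["double", "double"],
        [(187.70593, 12.39112), (201.36506, -43.01911)],
    )
    ```

    The resulting string is a well-formed XML document that VO-aware clients could in principle ingest, which is the point of the format: a self-describing table that travels over plain HTTP.
    
    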

  15. Study on parallel and distributed management of RS data based on spatial database

    Science.gov (United States)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of Earth-observing technology, the storage, management and publication of remote sensing (RS) image data have become a bottleneck for its application and popularization. Two problems are prominent in current RS image data storage and management systems. First, a single background server can hardly handle the heavy processing load of large volumes of RS data stored at different nodes in a distributed environment, so a heavy burden is placed on that server. Second, there is no unified, standard and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. To address these two problems, this paper puts forward a framework for parallel and distributed management and storage of RS image data, aiming at an RS data information system based on a parallel background server and a distributed data management system. Toward these goals, the paper studies the following key techniques and draws some instructive conclusions. It proposes a solid index of "Pyramid, Block, Layer, Epoch" reflecting the properties of RS image data. This solid index mechanism provides a rational organization for multi-sensor RS image data of different resolutions, areas, bands and periods. For storage, RS data are not divided into binary large objects stored in a conventional relational database; instead, they are reconstructed through the solid index mechanism, and a logical image database is built over the RS image data files. For the system architecture, the paper sets up a framework based on a parallel server composed of several commodity computers, under which the background process is divided into two parts: the common Web process and the parallel process.
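    The "Pyramid, Block, Layer, Epoch" solid index described above can be pictured as a compound tile key. The following sketch is only an illustration of the idea; the field names, key layout and path encoding are assumptions, since the abstract does not specify the actual scheme.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TileKey:
        """Illustrative 'Pyramid, Block, Layer, Epoch' index for one image tile.

        Field names and the key encoding are invented for this sketch; the
        paper does not give the real layout."""
        pyramid: int   # resolution level in the image pyramid, 0 = full resolution
        block_x: int   # tile column within the level
        block_y: int   # tile row within the level
        layer: str     # spectral band, e.g. "B4"
        epoch: str     # acquisition period, e.g. "2009-10"

        def as_path(self) -> str:
            # A flat key usable as a file path or as the lookup key of a
            # logical image database built over the raw image files.
            return f"{self.epoch}/{self.layer}/L{self.pyramid}/{self.block_y}_{self.block_x}"

    key = TileKey(pyramid=2, block_x=5, block_y=3, layer="B4", epoch="2009-10")
    ```

    Because the key is hashable and totally determined by the four index dimensions, tiles of any sensor, resolution, band or period can be addressed uniformly without storing the imagery itself as database BLOBs, which matches the design choice described in the abstract.
    
    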

  16. SFCOMPO: A new database of isotopic compositions of spent nuclear fuel

    International Nuclear Information System (INIS)

    Michel-Sendis, Franco; Gauld, Ian

    2014-01-01

    The numerous applications of nuclear fuel depletion simulations impact all areas related to nuclear safety. They are at the basis of, inter alia, spent fuel criticality safety analyses, reactor physics calculations, burn-up credit methodologies, decay heat thermal analyses, radiation shielding, reprocessing, waste management, deep geological repository safety studies and safeguards. Experimentally determined nuclide compositions of well-characterised spent nuclear fuel (SNF) samples are used to validate the accuracy of depletion code predictions for a given burn-up. At the same time, the measured nuclide composition of the sample is used to determine the burn-up of the fuel. It is therefore essential to have a reliable and well-qualified database of measured nuclide concentrations and relevant reactor operational data that can be used as experimental benchmark data for depletion codes and associated nuclear data. The Spent Fuel Isotopic Composition Database (SFCOMPO) has been hosted by the NEA since 2001. In 2012, a collaborative effort led by the NEA Data Bank and Oak Ridge National Laboratory (ORNL) in the United States, under the guidance of the NEA Expert Group on Assay Data of Spent Nuclear Fuel (EGADSNF) of the Working Party on Nuclear Criticality Safety (WPNCS), has resulted in the creation of an enhanced relational database structure and a significant expansion of the SFCOMPO database, which now contains experimental assay data for a wider selection of international reactor designs. The new database was released online in 2014. This new SFCOMPO database aims to provide access to open experimental SNF assay data to ensure their preservation and to facilitate their qualification as evaluated assay data suitable for the validation of methodologies used to predict the composition of irradiated nuclear fuel. 
Having a centralised, internationally reviewed database that makes these data openly available for a large selection of international reactor designs is of

  17. Development and Field Test of a Real-Time Database in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-03-01

    Full Text Available Recently, distribution management systems (DMS) that can conduct periodic system analysis and control by mounting various application programs have been actively developed. In this paper, we summarize the development and demonstration of a database structure that can perform real-time system analysis and control of the Korean smart distribution management system (KSDMS). The developed database structure consists of a common information model (CIM)-based off-line database (DB), a physical DB (PDB) for establishing the operating server's DB, a real-time DB (RTDB) for real-time server operation and remote terminal unit data interconnection, and an application common model (ACM) DB for running application programs. The ACM DB for real-time system analysis and control by the application programs was developed using a parallel table structure and a linked-list model, thereby providing fast input and output as well as high execution speed of application programs. Furthermore, the ACM DB was configured with hierarchical and non-hierarchical data models, reducing DB size and increasing operation speed by excluding system elements unnecessary for analysis and control. The proposed database model was implemented and tested at the Gochaing and Jeju offices on a real system. Through data measurement from the remote terminal units, and through operation and control of the application programs using those measurements, the performance, speed, and integrity of the proposed database model were validated, demonstrating that the model can be applied to real systems.
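    The parallel-table-plus-linked-list layout credited above for the ACM DB's speed can be sketched in miniature: equipment records live in flat parallel arrays, and per-bus chains of array indices allow fast topology traversal without relational joins. All names and the two-slots-per-branch encoding below are assumptions made for the illustration, not the actual KSDMS structures.

    ```python
    # Hypothetical mini "ACM DB": parallel arrays describe buses and branches.
    bus_names = ["BUS1", "BUS2", "BUS3"]
    branch_from = [0, 1]          # branch i leaves bus branch_from[i] ...
    branch_to = [1, 2]            # ... and enters bus branch_to[i]

    # Linked-list index: head[b] is the first slot touching bus b; nxt[s] is the
    # next slot at the same bus (-1 terminates the chain). Each branch occupies
    # two slots, one per endpoint, like a compact adjacency list.
    head = [-1] * len(bus_names)
    nxt = [-1] * (2 * len(branch_from))
    for i, (f, t) in enumerate(zip(branch_from, branch_to)):
        for slot, b in ((2 * i, f), (2 * i + 1, t)):
            nxt[slot] = head[b]
            head[b] = slot

    def branches_at(bus: int):
        """Walk the link list and yield the branch indices incident to `bus`."""
        slot = head[bus]
        while slot != -1:
            yield slot // 2
            slot = nxt[slot]

    incident_to_bus2 = sorted(branches_at(1))
    ```

    An application program (e.g. topology processing or power flow) can walk these chains in O(degree) time per bus, which is the kind of fast access the abstract attributes to the linked-list model.
    
    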

  18. SeedStor: A Germplasm Information Management System and Public Database.

    Science.gov (United States)

    Horler, R S P; Turner, A S; Fretter, P; Ambrose, M

    2018-01-01

    SeedStor (https://www.seedstor.ac.uk) acts as the publicly available database for the seed collections held by the Germplasm Resources Unit (GRU) based at the John Innes Centre, Norwich, UK. The GRU is a national capability supported by the Biotechnology and Biological Sciences Research Council (BBSRC). The GRU curates germplasm collections of a range of temperate cereal, legume and Brassica crops and their associated wild relatives, as well as precise genetic stocks, near-isogenic lines and mapping populations. With >35,000 accessions, the GRU forms part of the UK's plant conservation contribution to the Multilateral System (MLS) of the International Treaty for Plant Genetic Resources for Food and Agriculture (ITPGRFA) for wheat, barley, oat and pea. SeedStor is a fully searchable system that allows our various collections to be browsed species by species through to complicated multipart phenotype criteria-driven queries. The results from these searches can be downloaded for later analysis or used to order germplasm via our shopping cart. The user community for SeedStor is the plant science research community, plant breeders, specialist growers, hobby farmers and amateur gardeners, and educationalists. Furthermore, SeedStor is much more than a database; it has been developed to act internally as a Germplasm Information Management System that allows team members to track and process germplasm requests, determine regeneration priorities, handle cost recovery and Material Transfer Agreement paperwork, manage the Seed Store holdings and easily report on a wide range of the aforementioned tasks. © The Author(s) 2017. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  19. An object-oriented framework for managing cooperating legacy databases

    NARCIS (Netherlands)

    Balsters, H; de Brock, EO

    2003-01-01

    We describe a general semantic framework for precise specification of so-called database federations. A database federation provides for tight coupling of a collection of heterogeneous legacy databases into a global integrated system. Our approach to database federation is based on the UML/OCL data

  20. KALIMER design database development and operation manual

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Hahn, Do Hee; Lee, Yong Bum; Chang, Won Pyo

    2000-12-01

    The KALIMER Design Database was developed to support integrated management of Liquid Metal Reactor design technology development using Web applications. It consists of a Results Database, an Inter-Office Communication (IOC) system, a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds research results from mid- and long-term nuclear R and D. The IOC is a linkage control system between sub-projects, used to share and integrate research results for KALIMER. The 3D CAD Database gives a schematic design overview of KALIMER. The Team Cooperation System informs team members about research cooperation and meetings. Finally, the KALIMER Reserved Documents module manages collected data and other documents accumulated over the course of the project.

  1. KALIMER design database development and operation manual

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Hahn, Do Hee; Lee, Yong Bum; Chang, Won Pyo

    2000-12-01

    The KALIMER Design Database was developed to support integrated management of Liquid Metal Reactor design technology development using Web applications. It consists of a Results Database, an Inter-Office Communication (IOC) system, a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds research results from mid- and long-term nuclear R and D. The IOC is a linkage control system between sub-projects, used to share and integrate research results for KALIMER. The 3D CAD Database gives a schematic design overview of KALIMER. The Team Cooperation System informs team members about research cooperation and meetings. Finally, the KALIMER Reserved Documents module manages collected data and other documents accumulated over the course of the project.

  2. Core story creation: analysing narratives to construct stories for learning.

    Science.gov (United States)

    Petty, Julia; Jarvis, Joy; Thomas, Rebecca

    2018-03-16

    Educational research uses narrative enquiry to gain and interpret people's experiences. Narrative analysis is used to organise and make sense of acquired narrative. 'Core story creation' is a way of managing raw data obtained from narrative interviews to construct stories for learning. To explain how core story creation can be used to construct stories from raw narratives obtained by interviewing parents about their neonatal experiences and then use these stories to educate learners. Core story creation involves reconfiguration of raw narratives. Reconfiguration includes listening to and rereading transcribed narratives, identifying elements of 'emplotment' and reordering these to form a constructed story. Thematic analysis is then performed on the story to draw out learning themes informed by the participants. Core story creation using emplotment is a strategy of narrative reconfiguration that produces stories which can be used to develop resources relating to person-centred education about the patient experience. Stories constructed from raw narratives in the context of constructivism can provide a medium or an 'end product' for use in learning resource development. This can then contribute to educating students or health professionals about patients' experiences. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  3. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    Science.gov (United States)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
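    The original selection logic was an expert system written in Prolog; as a rough, hypothetical stand-in, the sketch below expresses the same shape of rule-based selection in Python: hard constraints filter candidate pictures, and a soft priority ranks the survivors. The parameter names, rules and weights are invented for the example, not taken from the Galileo planning system.

    ```python
    # Invented candidate-picture records; a real system would build these from
    # the DBMS described in the abstract.
    candidates = [
        {"id": "P1", "nav_value": 0.9, "exposure_ok": True,  "conflicts": False},
        {"id": "P2", "nav_value": 0.7, "exposure_ok": True,  "conflicts": True},
        {"id": "P3", "nav_value": 0.5, "exposure_ok": False, "conflicts": False},
    ]

    def admissible(pic):
        # Hard constraints: the picture must be correctly exposed and must not
        # conflict with other scheduled spacecraft activities.
        return pic["exposure_ok"] and not pic["conflicts"]

    def select_pictures(pics, limit=1):
        # Soft priority: rank the admissible candidates by navigation value
        # and keep at most `limit` of them.
        ranked = sorted((p for p in pics if admissible(p)),
                        key=lambda p: p["nav_value"], reverse=True)
        return [p["id"] for p in ranked[:limit]]

    chosen = select_pictures(candidates)
    ```

    Separating inviolable constraints from ranked preferences is the essential structure that rule-based planners, whether in Prolog or any other language, share.
    
    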

  4. Database security - how can developers and DBAs do it together and what can other Service Managers learn from it

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk gives an overview of security threats affecting databases, preventive measures that we are taking at CERN, and best practices in the industry. The presentation will describe how generic the threats are and how other service managers can profit from the database experience to protect other systems.

  5. Clinical Databases for Chest Physicians.

    Science.gov (United States)

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health conditions or exposures. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture. We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  6. The Co-creation Continuum

    DEFF Research Database (Denmark)

    Ind, Nicholas; Iglesias, Oriol; Markovic, Stefan

    2017-01-01

    This article traces the evolution of co-creation - from tactical market research tool to strategic collaborative innovation method - and shows that brands can be positioned along a continuum between these two polarities. This article also presents the implications for those that want to seize the potential of co-creation.

  7. REALIZING BUSINESS PROCESS MANAGEMENT BY HELP OF A PROCESS MAPPING DATABASE TOOL

    CERN Document Server

    Vergili, Ceren

    2016-01-01

    In a typical business sector, processes are the building blocks of achievement, and a considerable share of them are business processes. Business sectors therefore need a management discipline for them. Business Process Management (BPM) is a discipline that combines modelling, automation, execution, control, measurement, and optimization of processes, taking into account enterprise goals, spanning systems, employees, customers, and partners. CERN’s EN – HE – HM section wants to apply the BPM discipline to improve the technical, administrative and managerial actions needed to supply appropriate CERN industrial transport, handling and lifting equipment and to maintain it. For this purpose, a Process Mapping Database Tool was created to develop a common understanding of how section members can visualize their processes, agree on quality standards, and decide how to improve. It provides management support by establishing Process Charts...

  8. Path creation in the software industry

    DEFF Research Database (Denmark)

    Leimbach, Timo

    2017-01-01

    The article analyzes the development of the German software company Software AG, which was among the few European companies that succeeded in the US market already in the 1970s. Utilizing the concept of "path creation" it examines how early success impacted the development of the company. It shows...... that at least two paths in the development, the focus on the ADABAS product ecosystem and the underlying technology as well as the strong internationalization, relate to the early success and influenced its further evolution. The analyses reveal that they played an important role in how the company...... reacted to the rise of relational databases and the vertical disintegration of the computer industry. As a consequence of adopting them late, the company got into trouble and needed to adjust its profile and orientation during the 1990s and early 2000s, which is analyzed in the final part...

  9. Enhancing Education for Sustainable Development in Environmental University Programmes: A Co-Creation Approach

    Directory of Open Access Journals (Sweden)

    Maria Rosario Perello-Marín

    2018-01-01

    Full Text Available The purpose of this study is to analyse the co-creation approach as a strategy in higher education (HE) and a prerequisite for successful implementation of sustainable development in higher education (HESD), while considering student collaboration in university processes. A questionnaire was administered to 395 undergraduate environmental students from twelve Ecuadorian universities to test a structural equation model with four variables: participation, co-creation, satisfaction, and trust. These topics are increasingly relevant when competitive and innovative universities promote management in HESD. The results verify that student participation, as one of the key ESD skills, has a significant and positive influence on co-creation as a generator of student satisfaction and trust, especially in this context. Co-creation, from a higher education perspective and the premise that students are the centre of the learning process, reinforces education quality principles in an innovative way and promotes HESD perspectives.

  10. ABOUT APPROACHES OF CREATION OF INTEGRATED INFORMATION SYSTEM PDM-ERP

    Directory of Open Access Journals (Sweden)

    V. G. Mikhailov

    2016-01-01

    Full Text Available The paper considers the problems that have accumulated in the creation of PDM systems and their integration with ERP. It analyses the reasons for the low efficiency of existing PDM systems: insufficient primary information entered into the PDM unit, database structures that put designations into a single field, and a referential approach to maintaining product composition, which lowers functionality and creates problems for integration with ERP. It is shown that enterprises with a full production cycle need a unified integrated information system built on common databases, using the part-BOM-unit card, rather than a file, as the primary document. Unlike existing databases, this requires a general-purpose structure into which any information can be entered. The implementation of a new system, CDRP, which unites PDM and ERP functionality and covers the enterprise's basic needs, is proposed.

  11. Technical Knowledge Creation: Enabling Tacit Knowledge Use

    DEFF Research Database (Denmark)

    Søberg, Peder Veng; Chaudhuri, Atanu

    2018-01-01

    The paper investigates knowledge creation in nascent technical industries, a somewhat neglected empirical setting concerning knowledge creation. Frameworks on organizational learning and knowledge creation assume that knowledge creation depends on language creation and neglect the benefits involved...... by allowing elements of new product and process ideas to mature in a tacit form, whereas cognitive neuroscience data suggest that technical knowledge creation is largely nonlinguistic. The four case studies point to excessive reliance on group discussion, a need for more trial and error, and that field tests...... and prototypes generate new learnings that save time and lower subsequent risks. Technical knowledge creation in nascent high-tech industries requires opportunities to work with and further develop knowledge in its tacit form. The paper refines frameworks on organizational learning and knowledge creation...

  12. HATCHES - a thermodynamic database and management system

    International Nuclear Information System (INIS)

    Cross, J.E.; Ewart, F.T.

    1990-03-01

    The Nirex Safety Assessment Research Programme has been compiling the thermodynamic data necessary to allow simulations of the aqueous behaviour of the elements important to radioactive waste disposal to be made. These data have been obtained from the literature, when available, and validated for the conditions of interest by experiment. In order to maintain these data in an accessible form and to satisfy quality assurance on all data used for assessments, a database has been constructed which resides on a personal computer operating under MS-DOS using the Ashton-Tate dBase III program. This database contains all the input data fields required by the PHREEQE program and, in addition, a body of text which describes the source of the data and the derivation of the PHREEQE input parameters from the source data. The HATCHES system consists of this database, a suite of programs to facilitate the searching and listing of data and a further suite of programs to convert the dBase III files to PHREEQE database format. (Author)

  13. Knowledge creation in nursing education.

    Science.gov (United States)

    Hassanian, Zahra Marzieh; Ahanchian, Mohammad Reza; Ahmadi, Suleiman; Hossein Gholizadeh, Rezvan; Karimi-Moonaghi, Hossein

    2014-09-28

    In today's society, knowledge is recognized as a valuable social asset and the educational system is in search of a new strategy that allows them to construct their knowledge and experience. The purpose of this study was to explore the process of knowledge creation in nursing education. In the present study, the grounded theory approach was used. This method provides a comprehensive approach to collecting, organizing, and analyzing data. Data were obtained through 17 semi-structured interviews with nursing faculties and nursing students. Purposeful and theoretical sampling was conducted. Based on the method of Strauss and Corbin, the data were analyzed using fragmented, deep, and constant-comparative methods. The main categories included striving for growth and reduction of ambiguity, use of knowledge resources, dynamism of mind and social factors, converting knowledge, and creating knowledge. Knowledge was converted through mind processes, individual and group reflection, praxis and research, and resulted in the creation of nursing knowledge. Discrete nursing knowledge is gained through disconformity research in order to gain more individual advantages. The consequence of this analysis was gaining new knowledge. Knowledge management must be included in the mission and strategic planning of nursing education, and it should be planned through operational planning in order to create applicable knowledge.

  14. Job Creation and Job Types

    DEFF Research Database (Denmark)

    Kuhn, Johan M.; Malchow-Møller, Nikolaj; Sørensen, Anders

    We extend earlier analyses of the job creation of start-ups vs. established firms by taking into consideration the educational content of the jobs created and destroyed. We define education-specific measures of job creation and job destruction at the firm level, and we use these to construct...... a measure of “surplus job creation” defined as jobs created on top of any simultaneous destruction of similar jobs in incumbent firms in the same region and industry. Using Danish employer-employee data from 2002-7, which identify the start-ups and which cover almost the entire private sector......, these measures allow us to provide a more nuanced assessment of the role of entrepreneurial firms in the job-creation process than previous studies. Our findings show that while start-ups are responsible for the entire overall net job creation, incumbents account for more than a third of net job creation within...

  15. Facilitating Value Creation and Delivery in Construction Projects

    DEFF Research Database (Denmark)

    Thyssen, Mikael Hygum; Bonke, Sten

    This thesis is about value creation in the early stages of construction design processes. It has been problem-driven with a specific management concept, the workshop model, as an outset. Essentially the question was: how should construction project design processes be managed with the objective...... awareness about the potential ‘pit-falls’ observed in the case-studies by means of three metaphors for reflection and design-group adjustment. These are: (1) Part-whole conversation, (2) Game of persuasion and (3) Hyper-reality. Reflection and adjustment may require the inclusion of a facilitator.... In addition, concrete suggestions for further development of the design management concept, the workshop model, are provided. In general, the thesis contributes to the emerging literature on construction design management, which is still in its infancy. In addition, the theory part of the thesis contrib...

  16. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 9, May 2008

    International Nuclear Information System (INIS)

    2008-05-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2007

  17. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 8, August 2007

    International Nuclear Information System (INIS)

    2007-08-01

    The IAEA's Net Enabled Waste Management Database (NEWMDB) is an Internet-based application which contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories in IAEA Member States. It can be accessed via the following Internet address: http://www-newmdb.iaea.org. The Country Waste Profiles provide a concise summary of the information entered into the NEWMDB system by each participating Member State. This Profiles report is based on data collected using the NEWMDB from May to December 2006

  18. Respiratory cancer database: An open access database of respiratory cancer gene and miRNA.

    Science.gov (United States)

    Choubey, Jyotsna; Choudhari, Jyoti Kant; Patel, Ashish; Verma, Mukesh Kumar

    2017-01-01

    Respiratory cancer database (RespCanDB) is a genomic and proteomic database of cancers of the respiratory organs. It also includes information on medicinal plants used in the treatment of various respiratory cancers, with the structures of their active constituents, as well as pharmacological and chemical information on drugs associated with various respiratory cancers. Data in RespCanDB have been manually collected from published research articles and from other databases, and integrated using MySQL, a relational database management system. MySQL manages all data in the back-end and provides commands to store data in and retrieve it from the database. The web interface of the database has been built in ASP. RespCanDB is expected to contribute to the scientific community's understanding of respiratory cancer biology as well as to the development of new ways of diagnosing and treating respiratory cancer. Currently, the database contains the oncogenomic information of lung cancer, laryngeal cancer, and nasopharyngeal cancer. Data for other cancers, such as oral and tracheal cancers, will be added in the near future. The URL of RespCanDB is http://ridb.subdic-bioinformatics-nitrr.in/.
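    The store-and-retrieve pattern the abstract describes (a SQL back-end holding manually curated records, queried by the web front-end) can be sketched with Python's built-in sqlite3 as a stand-in for MySQL. The table layout and example rows below are invented for illustration and are not RespCanDB's actual schema.

    ```python
    import sqlite3

    # In-memory SQLite database standing in for the MySQL back-end.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        """CREATE TABLE gene_record (
               gene        TEXT,   -- gene symbol
               cancer      TEXT,   -- associated respiratory cancer
               source_pmid TEXT    -- literature source the record was curated from
           )"""
    )
    # Manually curated rows, analogous to data collected from published articles.
    conn.executemany(
        "INSERT INTO gene_record VALUES (?, ?, ?)",
        [("EGFR", "lung cancer", "12345678"),
         ("TP53", "laryngeal cancer", "23456789")],
    )
    # The kind of parameterized query a web front-end would issue.
    rows = conn.execute(
        "SELECT gene FROM gene_record WHERE cancer = ?",
        ("lung cancer",),
    ).fetchall()
    conn.close()
    ```

    The parameterized `?` placeholders matter in this setting: a public-facing database interface that interpolates user input directly into SQL invites injection, which ties back to the security considerations raised elsewhere in this collection.
    
    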

  19. The determinants of Bank Profitability: Does Liquidity Creation matter?

    Directory of Open Access Journals (Sweden)

    Ahmad Sahyouni

    2018-02-01

    Full Text Available Using a panel data set of 4995 banks across 11 developed and emerging countries during the period 2011-2015, this report analyses the amount of liquidity created by banks and how liquidity creation, bank-specific factors and macroeconomic factors affect bank profitability. The results show evidence of increased liquidity creation over the period. Applying the panel data fixed effects technique, banks that create more liquidity are found to have lower profitability. Asset management, bank size and capital ratio are positively correlated with bank profitability, while credit quality and operating efficiency affect banks' profits negatively. Additionally, macroeconomic factors have different impacts on profitability indicators in each market. Our findings may help decision makers inside and outside banks to determine the important factors affecting bank profitability.

  20. Pattern-based information portal for business plan co-creation

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Bontchev, Boyan; Ruskov, Petko

    2010-01-01

    Creation of business plans helps entrepreneurs in managing the identification of business opportunities and committing the necessary resources for process evolution. Applying patterns in business plan creation facilitates the identification of effective solutions that were adopted in the past and may...... provide a basis for adopting similar solutions in the future within a given business context. The article presents the system design of an information portal for business plan co-creation based on patterns. The portal will provide start-ups and entrepreneurs with ready-to-modify business plan patterns in order to help them in the development of effective and efficient business plans. It will facilitate entrepreneurs in co-experimenting and co-learning more frequently and faster. Moreover, the paper focuses on the software architecture of the pattern-based portal and explains the functionality of its modules.

  1. Protocol for developing a Database of Zoonotic disease Research in India (DoZooRI).

    Science.gov (United States)

    Chatterjee, Pranab; Bhaumik, Soumyadeep; Chauhan, Abhimanyu Singh; Kakkar, Manish

    2017-12-10

    Zoonotic and emerging infectious diseases (EIDs) represent a public health threat that has been acknowledged only recently, although they have been on the rise for the past several decades. On average, every year since the Second World War, one pathogen has emerged or re-emerged on a global scale. Low/middle-income countries such as India bear a significant burden of zoonotic and EIDs. We propose that the creation of a database of published, peer-reviewed research will open up avenues for evidence-based policymaking for targeted prevention and control of zoonoses. A large-scale systematic mapping of the published peer-reviewed research conducted in India will be undertaken. All published research will be included in the database, without any quality screening, to broaden the scope of included studies. Structured search strategies will be developed for priority zoonotic diseases (leptospirosis, rabies, anthrax, brucellosis, cysticercosis, salmonellosis, bovine tuberculosis, Japanese encephalitis and rickettsial infections), and multiple databases will be searched for studies conducted in India. The database will be managed and hosted on a cloud-based platform called Rayyan. Individual studies will be tagged based on key preidentified parameters (disease, study design, study type, location, randomisation status and interventions, host involvement and others, as applicable). The database will incorporate already published studies, obviating the need for additional ethical clearances. The database will be made available online, and in collaboration with multisectoral teams, domains of enquiry will be identified and subsequent research questions will be raised. The database will be queried for these, and the resulting evidence will be analysed and published in peer-reviewed journals. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise
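The tagging-and-query step described in the protocol can be illustrated with a small sketch. The records, tag names and values below are invented examples of the preidentified parameters, not entries from the actual database:

```python
# Hypothetical study records tagged with preidentified parameters
# (disease, study design, location); none of these are real entries.
studies = [
    {"title": "Study A", "disease": "rabies", "design": "cross-sectional",
     "state": "Kerala"},
    {"title": "Study B", "disease": "leptospirosis", "design": "cohort",
     "state": "Gujarat"},
    {"title": "Study C", "disease": "rabies", "design": "case-control",
     "state": "Punjab"},
]

def query(records, **tags):
    """Return the studies whose tags match every given key=value pair."""
    return [r for r in records
            if all(r.get(k) == v for k, v in tags.items())]

hits = query(studies, disease="rabies")
print([r["title"] for r in hits])  # -> ['Study A', 'Study C']
```

Once every record carries the same tag set, a domain of enquiry ("all rabies case-control studies", say) reduces to one such query.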

  2. ATLAS database application enhancements using Oracle 11g

    CERN Document Server

    Dimitrov, G; The ATLAS collaboration; Blaszczyk, M; Sorokoletov, R

    2012-01-01

    The ATLAS experiment at the LHC relies on databases for detector online data-taking; storage and retrieval of configurations, calibrations and alignments; post data-taking analysis; file management over the grid; job submission and management; and condition data replication to remote sites. The Oracle Relational Database Management System (RDBMS) has addressed the ATLAS database requirements to a great extent for many years. Ten database clusters are currently deployed for the needs of the different applications, divided into production, integration and standby databases. The data volume, complexity and demands from the users are increasing steadily with time. Nowadays more than 20 TB of data are stored in the ATLAS production Oracle databases at CERN (not including the index overhead), but the most impressive number is the 260 hosted database schemas (in the most common case each schema is related to a dedicated client application with its own requirements). At the beginning of 2012 all ATLAS databases at CERN have...

  3. Automation System Goals for the Creation and Operation of the Tool

    Science.gov (United States)

    Khisamutdinov, R. M.; Khisamutdinov, M. R.

    2014-12-01

    Complex automation of the processes for creating and operating the tool, consistent linking of hierarchical levels into a single system for data collection, data processing and operations management, and integration with TEAMCENTRE PLM and SAP/R3 ERP significantly improve the quality and efficiency of production preparation.

  4. Knowledge management in a waste based biorefinery in the QbD paradigm.

    Science.gov (United States)

    Rathore, Anurag S; Chopda, Viki R; Gomes, James

    2016-09-01

    Shifting the resource base from fossil feedstock to renewable raw materials for the production of chemical products has opened up an area of novel applications for industrial biotechnology-based process tools. This review aims to provide a concise and focused discussion of recent advances in knowledge management that facilitate efficient and optimal operation of a biorefinery. The application of quality by design (QbD) and process analytical technology (PAT) as tools for knowledge creation and management at different levels is highlighted. The roles of process integration, government policies, knowledge exchange through collaboration, and the use of databases and computational tools are also touched upon. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. National Levee Database: monitoring, vulnerability assessment and management in Italy

    Science.gov (United States)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    Italian levees and historical breach failures to be exploited in the framework of an operational procedure addressed to the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. By its structure, INLED is a dynamic geospatial database with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database is aimed at providing the available information about: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al. 2014); vi) management, control and maintenance; vii) flood hazard maps developed by assuming the levee system undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, Central Italy.
    References:
    Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6(2), 149-162.
    Camici S., Barbetta S. & Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press.
    Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994.
    H. R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report. London, 2003.
    Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19(4), 717-731.

  6. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    Science.gov (United States)

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

    With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.

  7. The Monitoring Erosion of Agricultural Land and spatial database of erosion events

    Science.gov (United States)

    Kapicka, Jiri; Zizala, Daniel

    2013-04-01

    In 2011, the Monitoring Erosion of Agricultural Land programme originated in the Czech Republic as a joint project of the State Land Office (SLO) and the Research Institute for Soil and Water Conservation (RISWC). The aim of the project is to collect and keep records of information about erosion events on agricultural land and to evaluate them. The main idea is the creation of a spatial database that will be a source of data and information for evaluating and modelling the erosion process, for proposing preventive measures, and for measures to reduce the negative impacts of erosion events. The subject of monitoring is the manifestations of water erosion, wind erosion and slope deformation that damage agricultural land. A website, available at http://me.vumop.cz, is used as a tool for keeping and browsing information about monitored events. SLO employees carry out the record keeping. RISWC is the specialist institute in the programme: it maintains the spatial database, runs the website, manages the record keeping of events, analyses the causes of events, and performs statistical evaluations of recorded events and proposed measures. Records are inserted into the database using the user interface of the website, which includes a map server as a component. The website is based on the PostgreSQL database technology with the PostGIS extension and MapServer UMN. Each record in the database is spatially localized by a drawing and contains descriptive information about the character of the event (date, situation description etc.); information about land cover and the crops grown is also recorded. Part of the database is photodocumentation taken during field reconnaissance, which is performed within two days of notification of an event. Another part of the database is precipitation information from accessible precipitation gauges. The website allows simple spatial analyses such as area calculation, slope calculation, percentage representation of GAEC etc. The database structure was designed
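The "simple spatial analyses" the website offers can be illustrated in isolation. The sketch below is a hypothetical stand-in for what a PostGIS area query (e.g. `ST_Area`) would compute on a stored event drawing: the shoelace formula applied to a polygon in projected coordinates. The function and the sample plot are invented for the example:

```python
def polygon_area(vertices):
    """Planar area of a simple polygon via the shoelace formula.

    vertices: list of (x, y) tuples in projected coordinates (e.g. metres);
    the ring may be open or closed -- the formula wraps around.
    """
    n = len(vertices)
    acc = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        acc += x1 * y2 - x2 * y1
    return abs(acc) / 2.0

# A 100 m x 50 m rectangular erosion plot drawn as four vertices.
plot = [(0, 0), (100, 0), (100, 50), (0, 50)]
print(polygon_area(plot))  # -> 5000.0
```

In the real system this arithmetic lives inside the spatial database, so the area of an event drawing is available directly in SQL rather than in application code.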

  8. Impacts of the creation, expansion and management of English wetlands on mosquito presence and abundance - developing strategies for future disease mitigation.

    Science.gov (United States)

    Medlock, Jolyon M; Vaux, Alexander G C

    2015-03-03

    The incidence of mosquito-borne diseases is increasing in Europe, partly due to the incursion of a number of invasive species known to be vectors of dengue and chikungunya viruses, but also due to the involvement of native species in the transmission of West Nile virus and malaria. For some of these pathogens, there is a risk of the re-emergence of vector-borne diseases that were once widespread in Europe but declined partly due to large-scale land-drainage projects. Some mosquito species exploit container habitats as breeding sites in urban areas, an adaptation to human-made micro-habitats resulting from increased urbanisation. However, many species thrive in natural wetland ecosystems. Owing to the impacts of climate change there is an urgent need for environmental adaptation, such as the creation of new wetlands to mitigate coastal and inland flooding. In some cases, these initiatives can be coupled with environmental change strategies to protect a range of endangered flora and fauna species by enhancing and extending wetland landscapes, which may be driven by European legislation, particularly in urban areas. This paper reviews field studies conducted in England to assess the impact of newly created wetlands on mosquito colonisation in a) coastal, b) urban and c) arable reversion habitats. It also considers the impact of wetland management on mosquito populations and explores the implications of various water and vegetation management options for the range of British mosquito species. Understanding the impact of wetland creation and management strategies on mosquito prevalence and the potential risk of increasing the levels of nuisance or disease vector species will be crucial in informing health and well-being risk assessments, guiding targeted control, and anticipating the social effects of extreme weather and climate change. Although new wetlands will certainly extend aquatic habitats for mosquitoes, not all species will become a major nuisance or a vector.

  9. BIM Guidelines Inform Facilities Management Databases: A Case Study over Time

    Directory of Open Access Journals (Sweden)

    Karen Kensek

    2015-08-01

    Full Text Available A building information model (BIM) contains data that can be accessed and exported for other uses during the lifetime of the building, especially for facilities management (FM) and operations. Working under the guidance of well-designed BIM guidelines to ensure completeness and compatibility with FM software, architects and contractors can deliver an information-rich data model that is valuable to the client. Large owners such as universities often provide these detailed guidelines and deliverable requirements to their building teams. Investigation of the University of Southern California (USC) Facilities Management Service's (FMS) website showed a detailed plan including standards, file names, parameter lists, and other requirements for BIM data, specifically designated for facilities management use, as deliverables on new construction projects. Three critical details were also unearthed in the reading of these documents: Revit was the default BIM software; COBie was adapted to help meet facilities management goals; and EcoDomus provided a display of the collected data viewed through Navisworks. Published accounts about the Cinema Arts Complex developed with and under these guidelines reported positive results. Further examination with new projects underway reveals the rapidly changing relational database landscape evident in the new USC "Project Record Revit Requirement Execution Plan (PRxP)".

  10. Strategic Risk Management and Corporate Value Creation

    DEFF Research Database (Denmark)

    Andersen, Torben Juul; Roggi, Oliviero

    Major corporate failures, periodic recessions, regional debt crises and volatile markets have intensified the focus on corporate risk management as the means to deal better with turbulent business conditions. Hence, the ability to respond effectively to the often dramatic environmental changes is considered an important source of competitive advantage. However, surprisingly little research has analyzed if the presumed advantages of effective risk management lead to superior performance or assessed important antecedents of effective risk management capabilities. Here we present a comprehensive study of risk management effectiveness and the relationship to corporate performance based on panel data for more than 3,400 firms accounting for over 33,500 annual observations during the turbulent period 1991-2010. Determining effective risk management as the ability to reduce earnings and cash flow...

  11. Creationism in Europe

    DEFF Research Database (Denmark)

    Blancke, Stefaan; Hjermitslev, Hans Henrik; Braeckman, Johan

    2013-01-01

    which material is missing from the literature (the “gaps”) and signal which gaps we think should first be filled. Third, on the basis of a forthcoming international historical study, we outline the possible factors that affect the popularity of creationism in Europe (the “prospects”). We also sketch how a sustained study of European creationism can contribute to other research domains such as the study of cultural evolution and the relation between science and religion.

  12. A Co-Creation Blended KM Model for Cultivating Critical-Thinking Skills

    Science.gov (United States)

    Yeh, Yu-chu

    2012-01-01

    Both critical thinking (CT) and knowledge management (KM) skills are necessary elements for a university student's success. Therefore, this study developed a co-creation blended KM model to cultivate university students' CT skills and to explore the underlying mechanisms for achieving success. Thirty-one university students participated in this…

  13. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    Science.gov (United States)

    Johnson, Paul W.

    2008-01-01

    ePORT (Electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program's/project's risk management processes. This presentation briefly covers standard risk management procedures, then thoroughly covers NASA's risk management tool, ePORT. ePORT is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. By covering the risk management paradigm thoroughly and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  14. Radioactive waste management profiles. A compilation of data from the Net Enabled Waste Management Database (NEWMDB). No. 6, November 2004 (last updated 2004.12.16)

    International Nuclear Information System (INIS)

    2005-03-01

    This Radioactive Waste Management Profiles report is a compilation of data collected by the Net Enabled Waste Management Database (NEWMDB) from March to July 2004. The report contains information on national radioactive waste management programmes, plans and activities, relevant laws and regulations, policies and radioactive waste inventories. It provides or references details of the scope of NEWMDB data collections, and it explains the formats of the individual NEWMDB report pages.

  15. Migration from relational to NoSQL database

    Science.gov (United States)

    Ghotiya, Sunita; Mandal, Juhi; Kandasamy, Saravanakumar

    2017-11-01

    Data generated by real-time applications, social networking sites and sensor devices is huge in volume and largely unstructured, which makes it difficult for relational database management systems to handle. Data is a precious component of any application and needs to be analysed after being arranged in some structure. Relational databases can only deal with structured data, so there is a need for NoSQL database management systems that can also deal with semi-structured data. Relational databases provide the easiest way to manage data, but as the use of NoSQL increases it is becoming necessary to migrate data from relational to NoSQL databases. Various frameworks have been proposed previously that provide mechanisms for migrating data stored in SQL warehouses, as well as middle-layer solutions that allow data to be stored in NoSQL databases to handle data which is not structured. This paper provides a literature review of some of the recent approaches proposed by various researchers to migrate data from relational to NoSQL databases. Some researchers proposed mechanisms for the co-existence of NoSQL and relational databases. The paper summarises mechanisms that can be used to map data stored in relational databases to NoSQL databases, along with various techniques for data transformation and middle-layer solutions.
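As a rough illustration of the row-to-document mapping such migration mechanisms perform, the sketch below denormalises a one-to-many relation into embedded JSON documents. It is a minimal stand-in only: an in-memory SQLite source takes the place of a production RDBMS, plain JSON dictionaries take the place of a NoSQL store, and all table and field names are invented:

```python
import json
import sqlite3

# Build a tiny relational source: customers with a one-to-many orders table.
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         item TEXT, qty INTEGER);
    INSERT INTO customer VALUES (1, 'Ada'), (2, 'Lin');
    INSERT INTO orders VALUES (10, 1, 'disk', 2), (11, 1, 'ram', 1),
                              (12, 2, 'cpu', 4);
""")

def migrate(conn):
    """Map each customer row to one JSON document, embedding its orders
    (denormalisation is the usual relational-to-document transformation)."""
    docs = {}
    for cid, name in conn.execute("SELECT id, name FROM customer"):
        orders = [{"item": i, "qty": q} for i, q in conn.execute(
            "SELECT item, qty FROM orders WHERE customer_id = ?", (cid,))]
        docs[cid] = {"_id": cid, "name": name, "orders": orders}
    return docs

documents = migrate(src)
print(json.dumps(documents[1]))
```

The join that a relational query would perform at read time is paid once during migration, which is exactly the trade-off the surveyed frameworks negotiate.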

  16. Object-Oriented Database for Managing Building Modeling Components and Metadata: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Long, N.; Fleming, K.; Brackney, L.

    2011-12-01

    Building simulation enables users to explore and evaluate multiple building designs. When tools for optimization, parametrics, and uncertainty analysis are combined with analysis engines, the sheer number of discrete simulation datasets makes it difficult to keep track of the inputs. The integrity of the input data is critical to designers, engineers, and researchers for code compliance, validation, and building commissioning long after the simulations are finished. This paper discusses an application that stores inputs needed for building energy modeling in a searchable, indexable, flexible, and scalable database to help address the problem of managing simulation input data.
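A minimal sketch of the idea, under stated assumptions: the paper's application is not SQLite and the schema below is invented, but the pattern of storing each simulation's full input set as an indexed, searchable record can be shown in a few lines:

```python
import json
import sqlite3

# Hypothetical searchable store for building-simulation inputs; the schema
# and field names are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE sim_input (
    id INTEGER PRIMARY KEY,
    model_name TEXT,
    params TEXT          -- JSON blob of the full input set
)""")
db.execute("CREATE INDEX idx_model ON sim_input (model_name)")

def store(model_name, params):
    db.execute("INSERT INTO sim_input (model_name, params) VALUES (?, ?)",
               (model_name, json.dumps(params, sort_keys=True)))

store("office_a", {"wall_r": 3.5, "glazing": "double"})
store("office_b", {"wall_r": 2.0, "glazing": "single"})

# Retrieve the exact inputs behind a finished run, e.g. for code compliance.
row = db.execute("SELECT params FROM sim_input WHERE model_name = ?",
                 ("office_a",)).fetchone()
print(json.loads(row[0])["wall_r"])  # -> 3.5
```

Keeping the serialized input set next to an indexed key is what makes a run reproducible and auditable long after the simulation itself has finished.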

  17. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    International Nuclear Information System (INIS)

    Vega, J.; Murari, A.; Ratta, G.A.; Gonzalez, S.; Dormido-Canto, S.

    2010-01-01

    Nowadays, processing all the information in a fusion database is a much more important issue than acquiring more data. Although fusion devices typically produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is because these databases are almost always generated manually, which is a very time-consuming and unreliable activity. The development of automatic methods to create specialized databases ensures, first, the reduction of human effort to identify and locate physical events; second, the standardization of criteria (reducing the vulnerability to human errors); and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (which can appear in a random and/or infrequent way) in waveforms and video movies. Results are shown for the JET database.
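The classification step can be sketched with a deliberately simple statistical learner. The stand-alone nearest-centroid example below, over two invented waveform features (mean level and peak-to-peak range), is illustrative only and uses synthetic values, not JET signals:

```python
# Nearest-centroid classification of discharges from two simple waveform
# features; labels, features and training waveforms are synthetic stand-ins.
def features(waveform):
    mean = sum(waveform) / len(waveform)
    return (mean, max(waveform) - min(waveform))

def train(labelled):
    """labelled: {label: [waveform, ...]} -> per-label feature centroid."""
    centroids = {}
    for label, waves in labelled.items():
        feats = [features(w) for w in waves]
        centroids[label] = tuple(sum(f[i] for f in feats) / len(feats)
                                 for i in range(2))
    return centroids

def classify(waveform, centroids):
    f = features(waveform)
    return min(centroids,
               key=lambda lb: sum((f[i] - centroids[lb][i]) ** 2
                                  for i in range(2)))

training = {
    "quiet": [[1.0, 1.1, 0.9, 1.0], [1.2, 1.0, 1.1, 0.9]],
    "event": [[1.0, 5.0, 0.5, 4.0], [0.8, 6.0, 1.0, 5.0]],
}
centroids = train(training)
print(classify([1.1, 4.8, 0.7, 4.2], centroids))  # -> 'event'
```

Running such a rule over every archived discharge is what turns a manual, few-tens-of-shots database into an automatically populated one.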

  18. Integrated Storage and Management of Vector and Raster Data Based on Oracle Database

    Directory of Open Access Journals (Sweden)

    WU Zheng

    2017-05-01

    Full Text Available At present, there are many problems in the storage and management of multi-source heterogeneous spatial data, such as difficult data transfer, the lack of unified storage and low efficiency. By combining relational database and spatial data engine technology, this paper proposes an approach for the integrated storage and management of vector and raster data on the basis of Oracle. The approach first establishes an integrated storage model for vector and raster data and optimizes the retrieval mechanism, then designs a framework for seamless data transfer, and finally realizes the unified storage and efficient management of multi-source heterogeneous data. Comparison of experimental results with the leading comparable software ArcSDE proves that the proposed approach has higher data transfer performance and better query retrieval efficiency.
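The notion of an integrated storage model can be sketched as a single catalogue table holding both vector geometries and raster tiles, discriminated by a type column. This is an assumption-laden miniature (SQLite and invented column names in place of the paper's Oracle-plus-spatial-engine design), showing only the unified-storage idea:

```python
import sqlite3

# One catalogue table for both data kinds: vector rows carry a WKT geometry,
# raster rows carry a binary tile. Schema and names are invented.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE spatial_object (
    id INTEGER PRIMARY KEY,
    layer_type TEXT CHECK (layer_type IN ('vector', 'raster')),
    srid INTEGER,
    geom_wkt TEXT,        -- populated for vector rows
    raster_tile BLOB      -- populated for raster rows
)""")

db.execute("INSERT INTO spatial_object VALUES (1, 'vector', 4326, "
           "'POINT(12.5 41.9)', NULL)")
db.execute("INSERT INTO spatial_object VALUES (2, 'raster', 4326, NULL, ?)",
           (b"\x00\x01\x02\x03",))

# A single query surface over both kinds of spatial data.
for lt, in db.execute("SELECT layer_type FROM spatial_object ORDER BY id"):
    print(lt)
```

The point of the unified model is exactly this single query surface: one retrieval mechanism and one transfer path regardless of whether the object is vector or raster.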

  19. DEVELOPING MULTITHREADED DATABASE APPLICATION USING JAVA TOOLS AND ORACLE DATABASE MANAGEMENT SYSTEM IN INTRANET ENVIRONMENT

    OpenAIRE

    Raied Salman

    2015-01-01

    In many business organizations, database applications are designed and implemented using various DBMSs and programming languages. These applications are used to maintain the organizations' databases. An organization's departments can be located at different sites and connected by an intranet environment. In such an environment, the maintenance of database records becomes an assignment of complexity which needs to be resolved. In this paper an intranet application is designed an...
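A sketch of the multithreaded access pattern such an application relies on, transposed from Java/Oracle to Python/SQLite so it is self-contained (the schema and department names are invented): each thread opens its own connection, mirroring the per-client connections departments would hold in a shared intranet DBMS:

```python
import sqlite3
import tempfile
import threading

# Shared database file standing in for the central intranet DBMS.
dbfile = tempfile.NamedTemporaryFile(suffix=".db", delete=False).name
with sqlite3.connect(dbfile) as conn:
    conn.execute("CREATE TABLE record (dept TEXT, payload TEXT)")

def worker(dept):
    # One connection per thread: SQLite connections must not be shared
    # across threads by default, much like per-client DBMS sessions.
    local = sqlite3.connect(dbfile, timeout=30)
    for i in range(10):
        with local:  # one transaction per insert
            local.execute("INSERT INTO record VALUES (?, ?)",
                          (dept, f"row-{i}"))
    local.close()

threads = [threading.Thread(target=worker, args=(d,))
           for d in ("sales", "hr", "it")]
for t in threads:
    t.start()
for t in threads:
    t.join()

with sqlite3.connect(dbfile) as conn:
    count = conn.execute("SELECT COUNT(*) FROM record").fetchone()[0]
print(count)  # -> 30
```

The database engine serialises the concurrent writes, so all thirty department records arrive intact without application-level locking.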

  20. Creationism in Europe

    DEFF Research Database (Denmark)

    For decades, the creationist movement was primarily situated in the United States. Then, in the 1970s, American creationists found their ideas welcomed abroad, first in Australia and New Zealand, then in Korea, India, South Africa, Brazil, and elsewhere—including Europe, where creationism plays... It will be of interest to students and scholars in the history and philosophy of science, religious studies, and evolutionary theory, as well as policy makers and educators concerned about the spread of creationism in our time.

  1. All new custom path photo book creation

    Science.gov (United States)

    Wang, Wiley; Muzzolini, Russ

    2012-03-01

    In this paper, we present an all-new custom path that gives consumers full control over their photos and the format of their books, while providing guidance to make creation fast and easy. Users can choose to fully automate the initial creation and then customize every page. The system manages many design themes along with numerous design elements, such as layouts, backgrounds, embellishments and pattern bands. Users can utilize photos from multiple sources including their computers, Shutterfly accounts, Shutterfly Share sites and Facebook. They can also use a photo as a background, and add, move and resize photos and text, putting what they want where they want instead of being confined to templates. The new path allows users to add embellishments anywhere in the book, and the high-performance platform can support up to 1,000 photos per book and up to 25 pictures per page. The path offers either Smart Autofill or Storyboard features, allowing customers to populate their books with photos so they can add captions and customize the pages.

  2. Database and Expert Systems Applications

    DEFF Research Database (Denmark)

    Viborg Andersen, Kim; Debenham, John; Wagner, Roland

    This book constitutes the refereed proceedings of the 16th International Conference on Database and Expert Systems Applications, DEXA 2005, held in Copenhagen, Denmark, in August 2005. The 92 revised full papers presented together with 2 invited papers were carefully reviewed and selected from 390 submissions. The papers are organized in topical sections on workflow automation, database queries, data classification and recommendation systems, information retrieval in multimedia databases, Web applications, implementational aspects of databases, multimedia databases, XML processing, security, XML schemata, query evaluation, semantic processing, information retrieval, temporal and spatial databases, querying XML, organisational aspects of databases, natural language processing, ontologies, Web data extraction, semantic Web, data stream management, data extraction, distributed database systems...

  3. Managing Functional Power

    DEFF Research Database (Denmark)

    Rosenstand, Claus Andreas Foss; Laursen, Per Kyed

    2013-01-01

    How does one manage functional power relations between leading functions in vision driven digital media creation, from idea to master during the creation cycle? Functional power is informal, and it is understood as roles, e.g. project manager, that provide opportunities to contribute to the product quality. The area of interest is the vision driven digital media industry in general; however, the point of departure is the game industry due to its aesthetic complexity. The article's contribution to the area is a power graph, which shows the functional power of the leading functions according to a general digital media creation cycle. This is used to point out potential power conflicts and their consequences. It is concluded that there is normally more conflict potential in vision driven digital media creation than in digital media creation in general or in software development. The method...

  4. An online spatial database of Australian Indigenous Biocultural Knowledge for contemporary natural and cultural resource management.

    Science.gov (United States)

    Pert, Petina L; Ens, Emilie J; Locke, John; Clarke, Philip A; Packer, Joanne M; Turpin, Gerry

    2015-11-15

    With growing international calls for the enhanced involvement of Indigenous peoples and their biocultural knowledge in managing conservation and the sustainable use of the physical environment, it is timely to review the available literature and develop cross-cultural approaches to the management of biocultural resources. Online spatial databases are becoming common tools for educating land managers about Indigenous Biocultural Knowledge (IBK), specifically to raise a broad awareness of issues, identify knowledge gaps and opportunities, and promote collaboration. Here we describe a novel approach to the application of internet and spatial analysis tools that provides an overview of publicly available documented Australian IBK (AIBK) and outline the processes used to develop the online resource. By funding an AIBK working group, the Australian Centre for Ecological Analysis and Synthesis (ACEAS) provided a unique opportunity to bring together cross-cultural, cross-disciplinary and trans-organizational contributors who developed these resources. Without such an intentionally collaborative process, this unique tool would not have been developed. The tool developed through this process is derived from a spatial and temporal literature review, case studies and a compilation of methods, as well as other relevant AIBK papers. The online resource illustrates the depth and breadth of documented IBK and identifies opportunities for further work, partnerships and investment for the benefit of not only Indigenous Australians, but all Australians. The database currently includes links to over 1500 publicly available IBK documents, of which 568 are geo-referenced and were mapped. It is anticipated that as awareness of the online resource grows, more documents will be provided through the website to build the database. It is envisaged that this will become a well-used tool, integral to future natural and cultural resource management and maintenance. Copyright © 2015. Published

  5. Database/Operating System Co-Design

    OpenAIRE

    Giceva, Jana

    2016-01-01

    We want to investigate how to improve the information flow between a database and an operating system, aiming for better scheduling and smarter resource management. We are interested in identifying the potential optimizations that can be achieved with a better interaction between a database engine and the underlying operating system, especially by allowing the application to get more control over scheduling and memory management decisions. Therefore, we explored some of the issues that arise ...

  6. Representing clinical communication knowledge through database management system integration.

    Science.gov (United States)

    Khairat, Saif; Craven, Catherine; Gong, Yang

    2012-01-01

    Clinical communication failures are considered the leading cause of medical errors [1]. The complexity of the clinical culture and the significant variance in training and education levels pose a challenge to enhancing communication within the clinical team. In order to improve communication, a comprehensive understanding of the overall communication process in health care is required. In an attempt to further understand clinical communication, we conducted a thorough literature review of methodologies to identify the strengths and limitations of previous approaches [2]. Our research proposes a new data collection method to study clinical communication activities among Intensive Care Unit (ICU) clinical teams, with a primary focus on the attending physician. In this paper, we present the first ICU communication instrument and introduce the use of a database management system to aid in discovering patterns and associations within our ICU communications data repository.
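How a database management system can help surface patterns in such a repository can be sketched with a simple aggregation query. The schema and sample events below are hypothetical illustrations, not the actual ICU instrument or its data:

```python
import sqlite3

# Hypothetical communication-event log: who spoke to whom, over what channel.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE comm_event (
    sender TEXT, receiver TEXT, channel TEXT)""")
events = [
    ("attending", "resident", "verbal"),
    ("attending", "nurse", "verbal"),
    ("attending", "resident", "page"),
    ("resident", "nurse", "verbal"),
]
db.executemany("INSERT INTO comm_event VALUES (?, ?, ?)", events)

# Frequency of each sender-receiver pair, most common first: the kind of
# pattern a GROUP BY makes visible without custom analysis code.
for sender, receiver, n in db.execute("""
        SELECT sender, receiver, COUNT(*) AS n
        FROM comm_event
        GROUP BY sender, receiver
        ORDER BY n DESC, sender, receiver"""):
    print(sender, receiver, n)
```

Storing events in a DBMS means such questions become one-line queries, which is the advantage the paper's repository approach aims at.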

  7. Knowledge base technology for CT-DIMS: Report 1. [CT-DIMS (Cutting Tool - Database and Information Management System)]

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, E.E.

    1993-05-01

    This report discusses progress on the Cutting Tool-Database and Information Management System (CT-DIMS) project being conducted by the University of Illinois Urbana-Champaign (UIUC) under contract to the Department of Energy. This project was initiated in October 1991 by UIUC. The Knowledge-Based Engineering Systems Research Laboratory (KBESRL) at UIUC is developing knowledge base technology and prototype software for the presentation and manipulation of the cutting tool databases at Allied-Signal Inc., Kansas City Division (KCD). The graphical tool selection capability being developed for CT-DIMS in the Intelligent Design Environment for Engineering Automation (IDEEA) will provide a concurrent environment for simultaneous access to tool databases, tool standard libraries, and cutting tool knowledge.

  8. THE KNOWLEDGE MANAGEMENT FOR BEST PRACTICES SHARING IN A DATABASE AT THE TRIBUNAL REGIONAL FEDERAL DA PRIMEIRA REGIÃO

    Directory of Open Access Journals (Sweden)

    Márcia Mazo Santos de Miranda

    2010-08-01

    Full Text Available A quick, effective and powerful alternative for knowledge management is the systematic sharing of best practices. This study identified recommendations in the literature for structuring a best practices database and summarized the benefits of its deployment at the Tribunal Regional Federal da Primeira Região (TRF - 1ª Região). A quantitative survey was then carried out, with questionnaires distributed to federal judges of the TRF - 1ª Região; the questionnaire was divided into 4 parts: magistrate profile, flow of knowledge/information, internal environment, and organizational facilitators. As a result, we identified the need for a best practices database in the Institution for the identification, transfer and sharing of organizational knowledge. The conclusion presents recommendations for the development of the database and highlights its importance for knowledge management in an organization.

  9. Managing expectations: assessment of chemistry databases generated by automated extraction of chemical structures from patents.

    Science.gov (United States)

    Senger, Stefan; Bartek, Luca; Papadatos, George; Gaulton, Anna

    2015-12-01

    First public disclosure of new chemical entities often takes place in patents, which makes them an important source of information. However, with an ever-increasing number of patent applications, manual processing and curation on such a large scale becomes ever more challenging. An alternative approach better suited to this large corpus of documents is the automated extraction of chemical structures. A number of patent chemistry databases generated using the latter approach are now available, but little is known that can help to manage expectations when using them. This study aims to address this by comparing two such freely available sources, SureChEMBL and IBM SIIP (IBM Strategic Intellectual Property Insight Platform), with manually curated commercial databases. When looking at the percentage of chemical structures successfully extracted from a set of patents, using SciFinder as our reference, 59 and 51 % were also found in SureChEMBL and IBM SIIP, respectively. When performing this comparison with compounds as the starting point, i.e. establishing whether for a list of compounds the databases provide the links between chemical structures and the patents they appear in, we obtained similar results: SureChEMBL and IBM SIIP found 62 and 59 %, respectively, of the compound-patent pairs obtained from Reaxys. In our comparison of automatically generated vs. manually curated patent chemistry databases, the former successfully provided approximately 60 % of the links between chemical structures and patents. It needs to be stressed that only a very limited number of patents and compound-patent pairs were used for our comparison. Nevertheless, our results will hopefully help to manage the expectations of users of patent chemistry databases of this type, provide a useful framework for further studies like ours, and guide future development of the workflows used for the automated extraction of chemical structures from patents. The challenges we have encountered

  10. SIMILARITIES BETWEEN THE KNOWLEDGE CREATION AND CONVERSION MODEL AND THE COMPETING VALUES FRAMEWORK: AN INTEGRATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    PAULO COSTA

    2016-12-01

    Full Text Available Contemporaneously, and with the successive paradigmatic revolutions inherent to management since the XVII century, we are witnessing a new era marked by a structural rupture in the way organizations are perceived. Market globalization, cemented by quick technological evolutions and associated with economic, cultural, political and social transformations, characterizes a reality where uncertainty is the only certainty for organizations and managers. Knowledge management has been interpreted by managers and academics as a viable alternative in a logic of creation and conservation of sustainable competitive advantages. However, there are several barriers to the implementation and development of knowledge management programs in organizations, with organizational culture being one of the most preponderant. In this sense, in this article we analyze and compare the Knowledge Creation and Conversion Model proposed by Nonaka and Takeuchi (1995) and Quinn and Rohrbaugh's Competing Values Framework (1983), since both have convergent conceptual lines that can assist managers in different sectors to guide their organizations in a perspective of productivity, quality and market competitiveness.

  11. Microsoft Enterprise Consortium: A Resource for Teaching Data Warehouse, Business Intelligence and Database Management Systems

    Science.gov (United States)

    Kreie, Jennifer; Hashemi, Shohreh

    2012-01-01

    Data is a vital resource for businesses; therefore, it is important for businesses to manage and use their data effectively. Because of this, businesses value college graduates with an understanding of and hands-on experience working with databases, data warehouses and data analysis theories and tools. Faculty in many business disciplines try to…

  12. Electronic database of arterial aneurysms

    Directory of Open Access Journals (Sweden)

    Fabiano Luiz Erzinger

    2014-12-01

    Full Text Available Background: The creation of an electronic database facilitates the storage of information and streamlines the exchange of data, making the exchange of knowledge easier for future research. Objective: To construct an electronic database containing comprehensive and up-to-date clinical and surgical data on the most common arterial aneurysms, to help advance scientific research. Methods: The most important specialist textbooks and articles found in journals and on internet databases were reviewed in order to define the basic structure of the protocol. Data were computerized using the SINPE© system for integrated electronic protocols and tested in a pilot study. Results: The data entered into the system were first used to create a Master protocol, organized into a structure of top-level directories covering a large proportion of the content on vascular diseases as follows: patient history; physical examination; supplementary tests and examinations; diagnosis; treatment; and clinical course. By selecting items from the Master protocol, Specific protocols were then created for the 22 arterial sites most often involved by aneurysms. The program provides a method for collection of data on patients including clinical characteristics (patient history and physical examination), supplementary tests and examinations, treatments received and follow-up care after treatment. Any information of interest on these patients that is contained in the protocol can then be used to query the database and select data for studies. Conclusions: It proved possible to construct a database of clinical and surgical data on the arterial aneurysms of greatest interest and, by adapting the data to specific software, the database was integrated into the SINPE© system, thereby providing a standardized method for collection of data on these patients and tools for retrieving this information in an organized manner for use in scientific studies.

  13. The total value equation: a suggested framework for understanding value creation in diagnostic radiology.

    Science.gov (United States)

    Heller, Richard E

    2014-01-01

    As a result of macroeconomic forces necessitating fundamental changes in health care delivery systems, value has become a popular term in the medical industry. Much has been written recently about the idea of value as it relates to health care services in general and the practice of radiology in particular. Of course, cost, value, and cost-effectiveness are not new topics of conversation in radiology. Not only is value one of the most frequently used and complex words in management, entire classes in business school are taught around the concept of understanding and maximizing value. But what is value, and when speaking of value creation strategies, what is it exactly that is meant? For the leader of a radiology department, either private or academic, value creation is a core function. This article provides a deeper examination of what value is, what drives value creation, and how practices and departments can evaluate their own value creation efficiencies. An equation, referred to as the Total Value Equation, is presented as a framework to assess value creation activities and strategies. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  14. Solid Waste Projection Model: Database User's Guide

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1993-10-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for using Version 1.4 of the SWPM database: system requirements and preparation, entering and maintaining data, and performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established.

  15. Quantum creation of an inflationary Universe

    International Nuclear Information System (INIS)

    Linde, A.D.

    1984-01-01

    The problem of quantum creation of the Universe is discussed. It is shown that the process of quantum creation of the Universe in a wide class of elementary particle theories leads with high probability to the creation of an exponentially expanding (inflationary) Universe. The size of the Universe after expansion should exceed l ≈ 10^28 cm.

  16. Resource Survey Relational Database Management System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Mississippi Laboratories employ both enterprise and localized data collection systems for recording data. The databases utilized by these applications range from...

  17. Heterogeneous distributed databases: A case study

    Science.gov (United States)

    Stewart, Tracy R.; Mukkamala, Ravi

    1991-01-01

    Alternatives are reviewed for accessing distributed heterogeneous databases and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX 11/780 and has been implemented using VAX DBMS, a CODASYL-based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy. Different customer bases are supported by each database. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.

  18. Value Creation in International Business

    DEFF Research Database (Denmark)

    The edited collection brings into focus the meanings, interpretations and the process of value creation in international business. Exploring value creation in the context of emerging and developed economies, Volume 2 takes the perspective of small and medium sized enterprises and examines various… It is a pioneering two-volume work intended to provoke theoretical and empirical development in International Business research, and is intended as a bridge between concepts derived from general business firm-level research agendas, such as value creation and business model, and internationalization…

  19. Particle creation in inhomogeneous spacetimes

    International Nuclear Information System (INIS)

    Frieman, J.A.

    1989-01-01

    We study the creation of particles by inhomogeneous perturbations of spatially flat Friedmann-Robertson-Walker cosmologies. For massless scalar fields, the pair-creation probability can be expressed in terms of geometric quantities (curvature invariants). The results suggest that inhomogeneities on scales up to the particle horizon will be damped out near the Planck time. Perturbations on scales larger than the horizon are explicitly shown to yield no created pairs. The results generalize to inhomogeneous spacetimes several earlier studies of pair creation in homogeneous anisotropic cosmologies

  20. Skyrmion creation and annihilation by spin waves

    International Nuclear Information System (INIS)

    Liu, Yizhou; Yin, Gen; Lake, Roger K.; Zang, Jiadong; Shi, Jing

    2015-01-01

    Single skyrmion creation and annihilation by spin waves in a crossbar geometry are theoretically analyzed. A critical spin-wave frequency is required both for the creation and the annihilation of a skyrmion. The minimum frequencies for creation and annihilation are similar, but the optimum frequency for creation is below the critical frequency for skyrmion annihilation. If a skyrmion already exists in the cross bar region, a spin wave below the critical frequency causes the skyrmion to circulate within the central region. A heat assisted creation process reduces the spin-wave frequency and amplitude required for creating a skyrmion. The effective field resulting from the Dzyaloshinskii-Moriya interaction and the emergent field of the skyrmion acting on the spin wave drive the creation and annihilation processes

  1. The Application of SECI Model as a Framework of Knowledge Creation in Virtual Learning: Case Study of IUST Virtual Classes

    Science.gov (United States)

    Hosseini, Seyede Mehrnoush

    2011-01-01

    The research aims to define SECI model of knowledge creation (socialization, externalization, combination, and internalization) as a framework of Virtual class management which can lead to better online teaching-learning mechanisms as well as knowledge creation. It has used qualitative research methodology including researcher's close observation…

  2. Automated knowledge base development from CAD/CAE databases

    Science.gov (United States)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.

  3. Value Co-creation Behaviour

    DEFF Research Database (Denmark)

    Laud, Gaurangi; Karpen, Ingo Oswald

    2017-01-01

    Purpose: The purpose of this paper is to identify antecedents and consequences of customers' value co-creation behaviour (VCB). VCB as a means to facilitate value realisation processes is gaining importance in service research and practice. Encouraging such enactments can be challenging, but can also offer competitive advantages. Design/methodology/approach: We empirically investigate a conceptual model by converging three contemporary concepts of co-creation research – embeddedness, VCB and value-in-context – and examining the interdependencies between them. Data were collected in an online forum of a leading… The study highlights the significance of the nature of the customer's social constellations in developing contexts where value outcomes are actualised. Understanding the factors that shape VCB offers insights for firms to recognise how and where value propositions can be deployed to drive ongoing co-creation processes…

  4. OCA Oracle Database 11g database administration I : a real-world certification guide

    CERN Document Server

    Ries, Steve

    2013-01-01

    Developed as a practical book, "Oracle Database 11g Administration I Certification Guide" will show you all you need to know to effectively excel at being an Oracle DBA, for both examinations and the real world. This book is for anyone who needs the essential skills to become an Oracle DBA, pass the Oracle Database Administration I exam, and use those skills in the real world to manage secure, high-performance, and highly available Oracle databases.

  5. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    Science.gov (United States)

    2013-01-01

    Background With the advance of post-genomic technologies, the need for tools to manage large-scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly across several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing heterogeneous data within the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are fine-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under the CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665
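    The hierarchical, per-project role model described above can be sketched in a few lines. This is a minimal illustration, not Djeen's actual implementation: the role names, permission sets and project names are all hypothetical, and the sketch assumes a role granted on a parent project is inherited by its children.

```python
# Sketch of hierarchical, per-project role permissions in the spirit of the
# Djeen description. Role names, the permission table and the project
# hierarchy are hypothetical, not taken from the Djeen source.
PERMS = {
    "viewer": {"read"},
    "editor": {"read", "annotate"},
    "admin":  {"read", "annotate", "manage"},
}
PARENT = {"exp-42": "proteomics", "proteomics": "root"}   # child -> parent project
ROLES = {("alice", "proteomics"): "editor", ("bob", "exp-42"): "viewer"}

def allowed(user, project, action):
    # Walk up the project hierarchy until some role grants the action.
    while project is not None:
        role = ROLES.get((user, project))
        if role and action in PERMS[role]:
            return True
        project = PARENT.get(project)     # None at the root ends the walk
    return False

print(allowed("alice", "exp-42", "annotate"))  # → True, inherited from 'proteomics'
print(allowed("bob", "exp-42", "annotate"))    # → False, a viewer may only read
```

    Keeping the check as a walk up the hierarchy means a role needs to be assigned only once, at the highest project where it should apply.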

  6. Study of relational nuclear databases and online services

    International Nuclear Information System (INIS)

    Fan Tieshuan; Guo Zhiyu; Liu Wenlong; Ye Weiguo; Feng Yuqing; Song Xiangxiang; Huang Gang; Hong Yingjue; Liu Tinjin; Chen Jinxiang; Tang Guoyou; Shi Zhaoming; Liu Chi; Chen Jiaer; Huang Xiaolong

    2004-01-01

    A relational nuclear database management and web-based services software system has been developed. Its objective is to allow users to access numerical and graphical representations of nuclear data and to easily reconstruct nuclear data in original standardized formats from the relational databases. It presents 9 relational nuclear libraries: 5 ENDF-format neutron reaction databases (BROND, CENDL, ENDF, JEF and JENDL), the ENSDF database, the EXFOR database, the IAEA Photonuclear Data Library and the charged-particle reaction data from the FENDL database. The computer programs providing support for database management and data retrieval are based on the Linux implementation of PHP and the MySQL software, and are platform-independent. The first version of this software was officially released in September 2001.
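    The core relational idea here — reaction data stored as rows keyed by library, nuclide and reaction, queryable across libraries — can be sketched briefly. The table and column names below are hypothetical, sqlite3 stands in for the MySQL back end the record describes, and the numeric values are illustrative only:

```python
import sqlite3

# Minimal sketch: nuclear reaction data as relational rows that can be
# queried per library. Schema and values are illustrative, not from the
# actual system; sqlite3 substitutes for MySQL to keep the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE reactions (
        library         TEXT,    -- e.g. 'ENDF', 'JENDL'
        target          TEXT,    -- nuclide, e.g. 'U-235'
        mt              INTEGER, -- ENDF reaction identifier, e.g. 18 = fission
        energy_ev       REAL,
        cross_section_b REAL
    )""")
rows = [
    ("ENDF",  "U-235", 18,  0.0253, 584.9),
    ("JENDL", "U-235", 18,  0.0253, 585.1),
    ("ENDF",  "U-238", 102, 0.0253, 2.68),
]
conn.executemany("INSERT INTO reactions VALUES (?, ?, ?, ?, ?)", rows)

# Retrieve one reaction across libraries, as a user of the web service might.
cur = conn.execute(
    "SELECT library, cross_section_b FROM reactions "
    "WHERE target = ? AND mt = ? ORDER BY library",
    ("U-235", 18))
print(cur.fetchall())  # → [('ENDF', 584.9), ('JENDL', 585.1)]
```

    The same rows can then be re-serialized into the original standardized format on demand, which is the reconstruction capability the record emphasizes.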

  7. One approach to design of speech emotion database

    Science.gov (United States)

    Uhrin, Dominik; Chmelikova, Zdenka; Tovarek, Jaromir; Partila, Pavol; Voznak, Miroslav

    2016-05-01

    This article describes a system for evaluating the credibility of recordings with emotional character. The sound recordings form a Czech-language database for training and testing speech emotion recognition systems. These systems are designed to detect human emotions in the voice. The emotional state of a person is useful information for the security forces and the emergency call service. People in action (soldiers, police officers and firefighters) are often exposed to stress. Information about their emotional state (in their voice) will help the dispatcher adapt control commands for the intervention procedure. Call agents of the emergency call service must recognize the mental state of the caller to adjust the mood of the conversation. In this case, the evaluation of the psychological state is the key factor for a successful intervention. A quality database of sound recordings is essential for the creation of the mentioned systems. There are quality databases such as the Berlin Database of Emotional Speech or Humaine. Actors created these databases in an audio studio, which means that the recordings contain simulated emotions, not real ones. Our research aims at creating a database of Czech emotional recordings of real human speech. Collecting sound samples for the database is only one of the tasks. Another, no less important, is to evaluate the significance of the recordings from the perspective of emotional states. The design of a methodology for evaluating the credibility of emotional recordings is described in this article. The results describe the advantages and applicability of the developed method.

  8. Advanced Query Formulation in Deductive Databases.

    Science.gov (United States)

    Niemi, Timo; Jarvelin, Kalervo

    1992-01-01

    Discusses deductive databases and database management systems (DBMS) and introduces a framework for advanced query formulation for end users. Recursive processing is described, a sample extensional database is presented, query types are explained, and criteria for advanced query formulation from the end user's viewpoint are examined. (31…
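    The recursive processing that distinguishes deductive databases from plain relational ones can be shown with the textbook example: deriving an `ancestor` relation from a `parent` relation by naive fixpoint iteration. This sketch is generic Datalog-style evaluation, not the framework of the paper, and the facts are invented for illustration:

```python
# A deductive database derives new facts from rules. The classic recursive
# example: ancestor(X, Y) <- parent(X, Y) and
# ancestor(X, Y) <- parent(X, Z), ancestor(Z, Y),
# evaluated by naive fixpoint iteration. Names and data are illustrative.
parent = {("ann", "bob"), ("bob", "cal"), ("cal", "dot")}

def ancestors(parent_facts):
    derived = set(parent_facts)          # base case: every parent is an ancestor
    while True:
        new = {(x, y)
               for (x, z) in parent_facts
               for (z2, y) in derived
               if z == z2} - derived     # apply the recursive rule once
        if not new:                      # fixpoint: nothing new is derivable
            return derived
        derived |= new

facts = ancestors(parent)
print(("ann", "dot") in facts)  # → True, derived through two recursive steps
```

    A query such as "list all ancestors of dot" is then a simple filter over the derived facts — the kind of query a non-recursive relational query language cannot express directly.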

  9. Designing Learning for Co-Creation

    DEFF Research Database (Denmark)

    Gnaur, Dorina; Larsen-Nielsen, Marie

    2017-01-01

    In Designing learning for co-creation – conceptual and practical considerations, Dorina Gnaur and Inger Marie Larsen-Nielsen explore the practical educational point of view. The question they pose is: how can higher and further education (HE) educate for co-creation, that is, provide educational frameworks that respond to the societal demand for co-creation, particularly within the public welfare sector? First, they focus on which organisational and individual requirements an HE learning design should take into account in order to support the diffusion of co-creation competences. Then they argue for the need to integrate these considerations in the learning design and demonstrate a practical application in the form of a didactical design. They call this a hybrid learning design, in that it takes advantage of technological developments to mediate co-creative learning in multiple learning…

  10. Experience of MAPS in monitoring of personnel movement with on-line database management system

    International Nuclear Information System (INIS)

    Rajendran, T.S.; Anand, S.D.

    1992-01-01

    As a part of the physical protection system, an access control system has been installed in Madras Atomic Power Station (MAPS) to monitor and regulate the movement of persons within MAPS. The present system in its original form was meant only for security monitoring. A PC-based database management system was added to it to computerize the availability of the work force for actual work. (author). 2 annexures

  11. Implementing a Value Creation Model in a Startup

    OpenAIRE

    Guillaume Marceau

    2014-01-01

    In this article, we propose a value creation model based on the value chain principle in corporate management. We particularly endeavor to show the impact of an appropriate allocation of a company's resources on its profitability, by distinguishing, on the one hand, the activities that are directly profitable and, on the other, those which have a support function. This distinction is applied to the study of a computer engineering services company, in terms of internal balance and pote...

  12. Public Value Creation Enabled by Healthcare IS Projects – a resource-based-view

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Svejvig, Per; Laursen, Markus

    Creation of value from IT projects is a recurring theme that has diffused into healthcare information systems (HIS). By applying a resource-based view to findings from a study on the optimisation project of an integrated health information system (HIS), we develop a framework of the capabilities needed in a public HIS setting to create value. The framework consists of Professional, Organisational, Patient-Perceived and Employee-Perceived value dimensions. HIS is partly overlooked in the public management literature, and the aspect of emergence and (personal as well as organisational) learning plays an important role in the creation of value in HIS projects.

  13. Database and Geographic Information System (GIS for the Via Francigena: a New Way to Read Sigeric’s itinerary

    Directory of Open Access Journals (Sweden)

    Alessio Innocenti

    2017-12-01

    Full Text Available In order to define the path of a medieval road, it is essential to use different kinds of sources, such as written texts, the archaeological and material remains of the road, the study of the geomorphological context, and toponymy. Modern technologies can help us examine and use all these sources: first of all, the creation of a database makes it possible to manage all the data we have about a road; secondly, the database can be loaded into GIS software in order to answer historical, archaeological and topographical questions. This methodology can be applied to the “Via Francigena” case: starting from Sigeric’s itinerary, which is the main source on the road, it is possible to create a database containing all the data about the submansiones mentioned in the text. Furthermore, loading the database into GIS software gives the possibility to study the road along its entire length, helping us to understand the relationships between the Via Francigena, the other itineraries and the ancient roads. At the same time, this enables us to study the route in a specific region, and it could also be an opportunity to comprehend the evolution of the historical landscapes, focusing both on the track of the road and on the territory that the road has conditioned, according to the concept of “street areas”.

  14. Efficiency improvements of offline metrology job creation

    Science.gov (United States)

    Zuniga, Victor J.; Carlson, Alan; Podlesny, John C.; Knutrud, Paul C.

    1999-06-01

    Progress of the first lot of a new design through the production line is watched very closely. All performance metrics, cycle time, in-line measurement results and final electrical performance are critical. Rapid movement of this lot through the line has serious time-to-market implications. Having this material waiting at a metrology operation for an engineer to create a measurement job plan wastes valuable turnaround time. Further, efficient use of a metrology system is compromised by the time required to create and maintain these measurement job plans. Thus, having a method to develop metrology job plans prior to the actual running of the material through the manufacturing area can significantly improve both cycle time and overall equipment efficiency. Motorola and Schlumberger have worked together to develop and test such a system. The Remote Job Generator (RJG) creates job plans for new devices in a manufacturing process from an NT host or workstation, offline. This increases the system time available for making production measurements, decreases turnaround time on job plan creation and editing, and improves consistency across job plans. Most importantly, this allows job plans for new devices to be available before the first wafers of the device arrive at the tool for measurement. The software also includes a database manager which allows updates of existing job plans to incorporate measurement changes required by process changes or measurement optimization. This paper will review the results of productivity enhancements through increased metrology utilization and decreased cycle time associated with the use of RJG. Finally, improvements in process control through better control of job plans across different devices and layers will be discussed.

  15. OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM

    Science.gov (United States)

    Ackerson, T.

    1994-01-01

    The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser

  16. Relational databases for SSC design and control

    International Nuclear Information System (INIS)

    Barr, E.; Peggs, S.; Saltmarsh, C.

    1989-01-01

    Most people agree that a database is A Good Thing, but there is much confusion in the jargon used, and in what jobs a database management system and its peripheral software can and cannot do. During the life cycle of an enormous project like the SSC, from conceptual and theoretical design, through research and development, to construction, commissioning and operation, an enormous amount of data will be generated. Some of these data, originating in the early parts of the project, will be needed during commissioning or operation, many years in the future. Two of these pressing data management needs, from the magnet research and industrialization programs and from the lattice design, have prompted work on understanding and adapting commercial database practices for scientific projects. Modern relational database management systems (rDBMSs) cope naturally with a large proportion of the requirements of data structures such as the SSC database structure built for superconducting cable supplies, uses, and properties. This application is similar to the commercial applications for which these database systems were developed. The SSC application has further requirements not immediately satisfied by the commercial systems. These derive from the diversity of the data structures to be managed, the changing emphases and uses during the project lifetime, and the large amount of scientific data processing to be expected. 4 refs., 5 figs

  17. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    Science.gov (United States)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Each frame transform is updated by its owner: FM updates the site and saved frames of the surface tree, while several clients, including ARM and RSM (Remote Sensing Mast), update the rover frames that they own. As the rover drives to a new area, a new site frame with an incremented site index can be created. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of the frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
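The transform-chaining idea in the abstract above can be sketched in a few lines. All names and offsets below are invented for illustration, and translation-only transforms stand in for the full 6-DOF poses the flight software maintains:

```python
# Minimal frame-tree sketch: each frame stores a translation to its parent.
# Chaining translations up to the shared root lets a client query the offset
# between any two frames in the tree.

FRAME_TREE = {
    # frame: (parent, offset of frame origin in parent coordinates (x, y))
    "site":   (None,    (0.0, 0.0)),
    "rover":  ("site",  (4.0, 2.0)),
    "rsm":    ("rover", (0.5, 0.0)),
    "target": ("site",  (7.0, 3.0)),
}

def offset_in_root(frame):
    """Accumulate translations from `frame` up to the tree root."""
    x, y = 0.0, 0.0
    while frame is not None:
        parent, (dx, dy) = FRAME_TREE[frame]
        x, y = x + dx, y + dy
        frame = parent
    return x, y

def transform(frm, to):
    """Offset of `frm` expressed relative to `to` (both in the same tree)."""
    fx, fy = offset_in_root(frm)
    tx, ty = offset_in_root(to)
    return fx - tx, fy - ty

print(transform("target", "rsm"))  # target position as seen from the mast frame
```

A centralized query function like this is what spares clients from composing transform chains by hand in commands.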

  18. Enabling Semantic Queries Against the Spatial Database

    Directory of Open Access Journals (Sweden)

    PENG, X.

    2012-02-01

    Full Text Available The spatial database built upon the object-relational database management system (ORDBMS) has the merits of a clear data model, good operability and high query efficiency, which is why it has been widely used in spatial data organization and management. However, it cannot express the semantic relationships among geospatial objects, making it difficult for query results to meet users' requirements well. Therefore, this paper represents an attempt to combine Semantic Web technology with the spatial database so as to make up for the traditional database's disadvantages. In this way, on the one hand, users can take advantage of the ORDBMS to store and manage spatial data; on the other hand, if the spatial database is published in the form of Semantic Web data, users can describe a query more concisely, with a cognitive pattern similar to that of daily life. As a consequence, this methodology makes the benefits of both the Semantic Web and the object-relational database (ORDB) available. The paper systematically discusses the semantically enriched spatial database's architecture, key technologies and implementation. Subsequently, we demonstrate the function of spatial semantic queries via a practical prototype system. The query results indicate that the method used in this study is feasible.
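The semantic-enrichment idea can be illustrated with a toy in-memory triple store. The entities and relations below are invented; the paper's prototype publishes an ORDBMS as Semantic Web data rather than using Python:

```python
# Toy illustration of semantic enrichment: geospatial objects described as
# (subject, predicate, object) triples, queried by pattern matching.

triples = {
    ("RiverA", "flowsThrough", "CityB"),
    ("CityB", "locatedIn", "ProvinceC"),
    ("RiverA", "type", "River"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# "Which features flow through CityB?" -- a relationship a plain geometry
# table cannot express without extra join tables.
print(query(p="flowsThrough", o="CityB"))
```

The point of the sketch is the query shape: the user states a relationship, not a table join over geometry columns.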

  19. Portuguese food composition database quality management system.

    Science.gov (United States)

    Oliveira, L M; Castanheira, I P; Dantas, M A; Porto, A A; Calhau, M A

    2010-11-01

    The harmonisation of food composition databases (FCDB) has been a recognised need among users, producers and stakeholders of food composition data (FCD). To reach harmonisation of FCDBs among the national compiler partners, the European Food Information Resource (EuroFIR) Network of Excellence set up a series of guidelines and quality requirements, together with recommendations to implement quality management systems (QMS) in FCDBs. The Portuguese National Institute of Health (INSA) is the national FCDB compiler in Portugal and is also a EuroFIR partner. INSA's QMS complies with ISO/IEC (International Organization for Standardisation/International Electrotechnical Commission) 17025 requirements. The purpose of this work is to report on the strategy used and progress made for extending INSA's QMS to the Portuguese FCDB in alignment with EuroFIR guidelines. A stepwise approach was used to extend INSA's QMS to the Portuguese FCDB. The approach included selection of reference standards and guides and the collection of relevant quality documents directly or indirectly related to the compilation process; selection of the adequate quality requirements; assessment of adequacy and level of requirement implementation in the current INSA's QMS; implementation of the selected requirements; and EuroFIR's preassessment 'pilot' auditing. The strategy used to design and implement the extension of INSA's QMS to the Portuguese FCDB is reported in this paper. The QMS elements have been established by consensus. ISO/IEC 17025 management requirements (except 4.5) and 5.2 technical requirements, as well as all EuroFIR requirements (including technical guidelines, FCD compilation flowchart and standard operating procedures), have been selected for implementation. The results indicate that the quality management requirements of ISO/IEC 17025 in place in INSA fit the needs for document control, audits, contract review, non-conformity work and corrective actions, and users' (customers

  20. Facilitating Value Co-Creation

    DEFF Research Database (Denmark)

    Veith, Anne; Assaf, Albert; Josiassen, Alexander

    2013-01-01

    will also lead to high rewards. According to postmodern consumerism theory, consumers are intrinsically motivated to participate (Arnould et al., 2006; Borghini & Caru, 2008; Etgar, 2008; Fisher & Smith, 2011), but may also be extrinsically motivated by, for instance, appraisal and 'autonomy' (Etgar, 2008......). Therefore, for instance, being part of the process is a key incentive for consumers. Postmodern consumers' search for unique experiences calls for individualization, personalization, etc. Although Prahalad & Ramaswamy (2004), Karpen et al. (2008), and Karpen et al. (2011) have presented S-D Logic...... as a middle range theory, it is still difficult for organizations to operationalize their co-creation efforts. This paper argues that postmodern consumerism can be used to guide the operationalization of the co-creation process by identifying the key facilitators of co-creation for the postmodern consumer...

  1. Ludic Educational Game Creation Tool

    DEFF Research Database (Denmark)

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos

    2015-01-01

    This paper presents initial findings and ongoing work on the game creation tool, a core component of the IOLAOS (IOLAOS in ancient Greece was a divine hero famed for helping with some of Heracles’s labors) platform, a general open authorable framework for educational and training games. The game...... creation tool features a web editor, where the game narrative can be manipulated according to specific needs. Moreover, this tool is applied to create an educational game following a reference scenario, namely teaching schoolers road safety. A ludic approach is used both in game creation and play....... Helping children stay safe and preventing serious injury on the roads is crucial. In this context, this work presents an augmented version of the IOLAOS architecture including an enhanced game creation tool and a new multimodality module. In addition, it presents a case study of creating educational games...

  2. LHCb: Managing Large Data Productions in LHCb

    CERN Multimedia

    Tsaregorodtsev, A

    2009-01-01

    LHC experiments are producing very large volumes of data, either accumulated from the detectors or generated via Monte-Carlo modeling. The data should be processed as quickly as possible to provide users with the input for their analysis. Processing multiple hundreds of terabytes of data necessitates generating, submitting and tracking a huge number of grid jobs running all over the Computing Grid. Manipulation of these large and complex workloads is impossible without powerful production management tools. In LHCb, the DIRAC Production Management System (PMS) is used to accomplish this task. It enables production managers and end-users to deal with all kinds of data generation, processing and storage. Application workflow tools allow jobs to be defined as complex sequences of elementary application steps expressed as Directed Acyclic Graphs. Specialized databases and a number of dedicated software agents ensure automated, data-driven job creation and submission. The productions are accomplished by thorough ...
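The DAG-based workflow definition mentioned above can be sketched with Python's standard-library topological sorter. The step names are invented; DIRAC's actual workflow language is far richer:

```python
# A production workflow expressed as a Directed Acyclic Graph: each step maps
# to the set of steps it depends on. A topological order gives a valid job
# submission sequence respecting all data dependencies.
from graphlib import TopologicalSorter

workflow = {
    "simulate":    set(),
    "digitize":    {"simulate"},
    "reconstruct": {"digitize"},
    "merge":       {"reconstruct"},
    "upload":      {"merge"},
}

order = list(TopologicalSorter(workflow).static_order())
print(order)  # a submission order in which every dependency comes first
```

In a real production system, each node would carry an application configuration and the edges would drive data-driven job creation by the software agents.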

  3. Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    OpenAIRE

    Errol A. Blake

    2007-01-01

    Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper will show that Traditional Database Security, which has focused primarily on creating user accounts and managing user privileges to database objects, is not enough to protect data confidentiality, integrity, and availability. This paper is a compilation of different journals, articles and classroom discussions ...

  4. Security Management in a Multimedia System

    Science.gov (United States)

    Rednic, Emanuil; Toma, Andrei

    2009-01-01

    In database security, the issue of providing a level of security for multimedia information is receiving more and more attention. For the moment, the security of multimedia information is handled through the security of the database itself, in the same way for all classic and multimedia records. So what is the reason for the creation of a security…

  5. Taking stock of project value creation: A structured literature review with future directions for research and practice

    DEFF Research Database (Denmark)

    Laursen, Markus; Svejvig, Per

    2016-01-01

    This paper aims to take stock of what we know about project value creation and to present future directions for research and practice. We performed an explorative and unstructured literature review, which was subsequently paired with a structured literature review. We join several research areas...... by adopting the project value creation perspective on literature relating to benefits, value, performance, and success in projects. Our review includes 111 contributions analyzed through both an inductive and deductive approach. We find that relevant literature dates back to the early 1980s, and the still...... developing value-centric view has been the subject of many publications in recent years. We contribute to research on project value creation through four directions for future research: rejuvenating value management through combining value, benefits, and costs; supplementing value creation with value capture...

  6. OTI Activity Database

    Data.gov (United States)

    US Agency for International Development — OTI's worldwide activity database is a simple and effective information system that serves as a program management, tracking, and reporting tool. In each country,...

  7. Creation of radiation defects in KCl crystals

    International Nuclear Information System (INIS)

    Lushchik, A.Ch.; Pung, L.A.; Khaldre, Yu.Yu.; Kolk, Yu.V.

    1981-01-01

    Optical and EPR methods were used to study the creation of anion and cation Frenkel defects in KCl crystals irradiated by X-ray and VUV-radiation. The decay of excitons with the creation of charged Frenkel defects (α and I centres) was detected and investigated at 4.2 K. The decay of excitons as well as the recombination of electrons with self-trapped holes leads to the creation of neutral Frenkel defects (F and H centres). The creation of Cl3- and Vsub(F) centres (a cation vacancy is a component of these centres) by X-irradiation at 80 K proves the possibility of cation defect creation in KCl

  8. Private and Efficient Query Processing on Outsourced Genomic Databases.

    Science.gov (United States)

    Ghasemi, Reza; Al Aziz, Md Momin; Mohammed, Noman; Dehkordi, Massoud Hadian; Jiang, Xiaoqian

    2017-09-01

    Applications of genomic studies are spreading rapidly in many domains of science and technology, such as healthcare, biomedical research, direct-to-consumer services, and legal and forensic applications. However, a number of obstacles make it hard to access and process a big genomic database for these applications. First, sequencing a genomic sequence is a time-consuming and expensive process. Second, it requires large-scale computation and storage systems to process genomic sequences. Third, genomic databases are often owned by different organizations, and thus not available for public usage. The cloud computing paradigm can be leveraged to facilitate the creation and sharing of big genomic databases for these applications. Genomic data owners can outsource their databases to a centralized cloud server to ease access to their databases. However, data owners are reluctant to adopt this model, as it requires outsourcing the data to an untrusted cloud service provider that may cause data breaches. In this paper, we propose a privacy-preserving model for outsourcing genomic data to a cloud. The proposed model enables query processing while providing privacy protection for genomic databases. Privacy of the individuals is guaranteed by permuting and adding fake genomic records to the database. These techniques allow the cloud to evaluate count and top-k queries securely and efficiently. Experimental results demonstrate that a count and a top-k query over 40 Single Nucleotide Polymorphisms (SNPs) in a database of 20 000 records take around 100 and 150 s, respectively.
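The permute-and-pad idea behind the count queries can be sketched as follows. The record layout, allele values, and fake-record counts are invented for illustration; the actual protocol also handles top-k queries and stronger adversary models:

```python
# Toy version of the fake-record technique: the owner permutes the database,
# injects fake records, and remembers how many fakes match each property.
# The cloud answers counts over the padded data; the owner subtracts the
# fake contribution to recover the true count.
import random

real = [{"snp1": "A"} for _ in range(12)] + [{"snp1": "G"} for _ in range(8)]
fakes = [{"snp1": "A"} for _ in range(5)]   # composition kept secret by owner
outsourced = real + fakes
random.shuffle(outsourced)                  # permutation hides record order

def cloud_count(db, snp, allele):
    """Untrusted-server side: a plain count over the padded database."""
    return sum(1 for rec in db if rec[snp] == allele)

padded = cloud_count(outsourced, "snp1", "A")
true_count = padded - 5                     # owner-side correction
print(true_count)
```

The cloud only ever sees the padded, shuffled data, yet the owner recovers exact counts at negligible cost.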

  9. Optical measurements of paintings and the creation of an artwork database for authenticity.

    Directory of Open Access Journals (Sweden)

    Seonhee Hwang

    Full Text Available Paintings have high cultural and commercial value and thus need to be preserved. Many techniques have been applied to analyze the properties of paintings, including X-ray analysis and optical coherence tomography (OCT) methods, and to protect paintings from forgeries. In this paper, we propose a simple and accurate optical analysis system to protect them from counterfeiting, comprising fiber optics reflectance spectroscopy (FORS) and line laser-based topographic analysis. The system is designed to fully cover the whole area of a painting, regardless of its size, for accurate analysis. For additional assessment, a line laser-based high-resolution OCT was utilized. Forgeries were created by experts from three different styles of genuine paintings for the experiments. After measuring the surface properties of the paintings, we observed that the genuine works and the forgeries have distinctive characteristics. The forgeries could be distinguished with up to 76.5% accuracy from RGB spectra obtained by FORS, and with 100% accuracy by topographic analysis. Repeated runs confirmed the reliability of the system. We verified that the measurement system is worthwhile for the conservation of valuable paintings. To store the surface information of the paintings at micron scale, we created a numerical database. Consequently, we secured databases of three different famous Korean paintings for accurate authentication.

  10. DataSpread: Unifying Databases and Spreadsheets.

    Science.gov (United States)

    Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya

    2015-08-01

    Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases.
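The cell-to-tuple mapping that such a system must reason about can be shown minimally. DataSpread itself pairs an Excel front-end with PostgreSQL; sqlite3 and the toy schema below are stand-ins for illustration:

```python
# Minimal sketch of the spreadsheet-over-database idea: cells stored as
# (row, col, value) tuples in a relational table, so arbitrary SQL can
# operate on them while a front-end still addresses them like a sheet.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE cells (row INTEGER, col TEXT, value REAL)")
db.executemany("INSERT INTO cells VALUES (?, ?, ?)",
               [(1, "A", 10.0), (1, "B", 20.0), (2, "A", 30.0)])

# SQL over the cell store: the relational equivalent of a sheet's =SUM(A:A).
(total,) = db.execute(
    "SELECT SUM(value) FROM cells WHERE col = 'A'").fetchone()
print(total)
```

Storing cells as tuples is what lets the back-end scale and join against external tables, at the cost of having to reconcile cell addressing with relational schemas.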

  11. Developing genomic knowledge bases and databases to support clinical management: current perspectives.

    Science.gov (United States)

    Huser, Vojtech; Sincan, Murat; Cimino, James J

    2014-01-01

    Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward.

  12. Database and applications security integrating information security and data management

    CERN Document Server

    Thuraisingham, Bhavani

    2005-01-01

    This is the first book to provide an in-depth coverage of all the developments, issues and challenges in secure databases and applications. It provides directions for data and application security, including securing emerging applications such as bioinformatics, stream information processing and peer-to-peer computing. Divided into eight sections, each of which focuses on a key concept of secure databases and applications, this book deals with all aspects of technology, including secure relational databases, inference problems, secure object databases, secure distributed databases and emerging

  13. Understanding indigenous knowledge: Bridging the knowledge gap through a knowledge creation model for agricultural development

    Directory of Open Access Journals (Sweden)

    Edda T. Lwoga

    2010-12-01

    Full Text Available This article addresses the management of agricultural indigenous knowledge (IK in developing countries, with a specific focus on Tanzania. It provides background details on IK and its importance for agricultural development. It introduces various knowledge management (KM concepts and discusses their application in managing IK in the developing world by placing Nonaka’s knowledge creation theory (Nonaka 1991; Nonaka & Takeuchi 1995; Nonaka, Toyama & Konno 2000 in the context of the local communities. Data from focus groups were used to triangulate with data from interviews in order to validate, confirm and corroborate quantitative results with qualitative findings. The study findings showed that knowledge creation theory can be used to manage IK in the local communities, however, adequate and appropriate resources need to be allocated for capturing and preserving IK before it disappears altogether. For sustainable agricultural development, the communities have to be placed within a knowledge-creating setting that continuously creates, distributes and shares knowledge within and beyond the communities’ boundaries and integrates it with new agricultural technologies, innovations and knowledge.

  14. Report on the present situation of the FY 1998 technical literature database; 1998 nendo gijutsu bunken database nado genjo chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    To study databases which may contribute to future scientific technology information distribution, a survey/analysis was conducted of the present status of the service supply side. The survey of database trends investigated the relations between DB producers and distributors. It showed an increase in DB producers and an expansion of internet distribution and services, while the U.S.-centered structure remained unchanged. Further, it was recognized that DB services in the internet age now face a period of change, as seen in existing producers' responses to the internet, the move of primary information sources online, the creation of new online services, etc. Owing to the impact of the internet, the following are predicted for future DB services: a slump for producers without distinctive strengths and for gateway-type distributors, the appearance of new types of DB service, etc. (NEDO)

  15. A Database Integrity Pattern Language

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2004-08-01

    Full Text Available Patterns and pattern languages are ways to capture experience and make it re-usable for others, and to describe best practices and good designs. Patterns are solutions to recurrent problems. This paper addresses database integrity problems from a pattern perspective. Even if the number of vendors of database management systems is quite high, the number of available solutions to integrity problems is limited; they all learned from past experience, applying the same solutions over and over again. The solutions applied in database management systems (DBMS) to avoid integrity threats can be formalized as a pattern language. Constraints, transactions, locks, etc., are recurrent solutions to integrity threats and therefore they should be treated accordingly, as patterns.
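Two of the patterns named above, constraints and transactions, can be demonstrated with any DBMS; here is a minimal sqlite3 sketch with an invented schema and values:

```python
# Integrity patterns in miniature: a CHECK constraint rejects invalid rows,
# and a transaction rolls back partial work, keeping the data consistent.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, "
           "balance REAL CHECK (balance >= 0))")
db.execute("INSERT INTO account VALUES (1, 100.0)")
db.commit()

# Constraint pattern: the DBMS itself refuses an integrity violation.
try:
    db.execute("INSERT INTO account VALUES (2, -50.0)")
except sqlite3.IntegrityError:
    print("negative balance rejected")

# Transaction pattern: all-or-nothing updates.
try:
    with db:  # commits on success, rolls back on exception
        db.execute("UPDATE account SET balance = balance - 200 WHERE id = 1")
except sqlite3.IntegrityError:
    pass  # the whole update was rolled back

(balance,) = db.execute("SELECT balance FROM account WHERE id = 1").fetchone()
print(balance)  # unchanged: the overdraft never took effect
```

The same two mechanisms recur in every relational DBMS, which is exactly what makes them patterns rather than product features.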

  16. Analysis of Cloud-Based Database Systems

    Science.gov (United States)

    2015-06-01

    deploying the VM, we installed SQL Server 2014 relational database management software (RDBMS) and restored a copy of the PYTHON database onto the server ... management views within SQL Server, we retrieved lists of the most commonly executed queries, the percentage of reads versus writes, as well as ... Monitor. This gave us data regarding resource utilization and queueing. The second tool we used was the SQL Server Profiler provided by Microsoft

  17. [Establishment of a regional pelvic trauma database in Hunan Province].

    Science.gov (United States)

    Cheng, Liang; Zhu, Yong; Long, Haitao; Yang, Junxiao; Sun, Buhua; Li, Kanghua

    2017-04-28

    To establish a database for pelvic trauma in Hunan Province, and to start the work of a multicenter pelvic trauma registry.
 Methods: To establish the database, literature relevant to pelvic trauma was screened, experience from established trauma databases in China and abroad was drawn upon, and the actual conditions of pelvic trauma rescue in Hunan Province were considered. The database for pelvic trauma was built on PostgreSQL and the Java 1.6 programming language.
 Results: The complex procedure of pelvic trauma rescue was described structurally. The contents of the database include general patient information, injury condition, prehospital rescue, condition on admission, treatment in hospital, status on discharge, diagnosis, classification, complications, trauma scoring and therapeutic effect. The database can be accessed through the internet via browser/server. Its functions include patient information management, data export, history query, progress report, video-image management and personal information management.
 Conclusion: A whole-life-cycle pelvic trauma database has been successfully established for the first time in China. It is scientific, functional, practical, and user-friendly.

  18. A cosmological model with particle creation

    International Nuclear Information System (INIS)

    Chatterjee, Sujit

    2001-01-01

    A higher dimensional cosmological model is proposed where an expanding universe evolves from the vacuum fluctuation and matter creation takes place out of the gravitational energy. Choosing a particular form of the matter creation function N(t) as an initial condition, it can be shown that, starting from an inflationary era, the cosmos enters a higher dimensional Friedmann-like phase after a time scale at which matter creation stops

  19. Multi-pair states in electron–positron pair creation

    Directory of Open Access Journals (Sweden)

    Anton Wöllert

    2016-09-01

    Full Text Available Ultra strong electromagnetic fields can lead to spontaneous creation of single or multiple electron–positron pairs. A quantum field theoretical treatment of the pair creation process combined with numerical methods provides a description of the fermionic quantum field state, from which all observables of the multiple electron–positron pairs can be inferred. This allows to study the complex multi-particle dynamics of electron–positron pair creation in-depth, including multi-pair statistics as well as momentum distributions and spin. To illustrate the potential benefit of this approach, it is applied to the intermediate regime of pair creation between nonperturbative Schwinger pair creation and perturbative multiphoton pair creation where the creation of multi-pair states becomes nonnegligible but cascades do not yet set in. Furthermore, it is demonstrated how spin and helicity of the created electrons and positrons are affected by the polarization of the counterpropagating laser fields, which induce the creation of electron–positron pairs.

  20. Multi-pair states in electron–positron pair creation

    Energy Technology Data Exchange (ETDEWEB)

    Wöllert, Anton, E-mail: woellert@mpi-hd.mpg.de; Bauke, Heiko, E-mail: heiko.bauke@mpi-hd.mpg.de; Keitel, Christoph H.

    2016-09-10

    Ultra strong electromagnetic fields can lead to spontaneous creation of single or multiple electron–positron pairs. A quantum field theoretical treatment of the pair creation process combined with numerical methods provides a description of the fermionic quantum field state, from which all observables of the multiple electron–positron pairs can be inferred. This allows to study the complex multi-particle dynamics of electron–positron pair creation in-depth, including multi-pair statistics as well as momentum distributions and spin. To illustrate the potential benefit of this approach, it is applied to the intermediate regime of pair creation between nonperturbative Schwinger pair creation and perturbative multiphoton pair creation where the creation of multi-pair states becomes nonnegligible but cascades do not yet set in. Furthermore, it is demonstrated how spin and helicity of the created electrons and positrons are affected by the polarization of the counterpropagating laser fields, which induce the creation of electron–positron pairs.

  1. Multi-pair states in electron–positron pair creation

    International Nuclear Information System (INIS)

    Wöllert, Anton; Bauke, Heiko; Keitel, Christoph H.

    2016-01-01

    Ultra strong electromagnetic fields can lead to spontaneous creation of single or multiple electron–positron pairs. A quantum field theoretical treatment of the pair creation process combined with numerical methods provides a description of the fermionic quantum field state, from which all observables of the multiple electron–positron pairs can be inferred. This allows to study the complex multi-particle dynamics of electron–positron pair creation in-depth, including multi-pair statistics as well as momentum distributions and spin. To illustrate the potential benefit of this approach, it is applied to the intermediate regime of pair creation between nonperturbative Schwinger pair creation and perturbative multiphoton pair creation where the creation of multi-pair states becomes nonnegligible but cascades do not yet set in. Furthermore, it is demonstrated how spin and helicity of the created electrons and positrons are affected by the polarization of the counterpropagating laser fields, which induce the creation of electron–positron pairs.

  2. Data management with a landslide inventory of the Franconian Alb (Germany) using a spatial database and GIS tools

    Science.gov (United States)

    Bemm, Stefan; Sandmeier, Christine; Wilde, Martina; Jaeger, Daniel; Schwindt, Daniel; Terhorst, Birgit

    2014-05-01

    The area of the Swabian-Franconian cuesta landscape (Southern Germany) is highly prone to landslides. This became apparent in the late spring of 2013, when numerous landslides occurred as a consequence of heavy and long-lasting rainfall. The specific climatic situation caused extensive damage, with serious impacts on settlements and infrastructure. Knowledge of the spatial distribution, processes and characteristics of landslides is important to evaluate the potential risk posed by mass movements in these areas. In the frame of two projects, about 400 landslides were mapped and detailed data sets were compiled between 2011 and 2014 at the Franconian Alb. The studies are related to the project "Slope stability and hazard zones in the northern Bavarian cuesta" (DFG, German Research Foundation) as well as to the LfU (The Bavarian Environment Agency) within the project "Georisks and climate change - hazard indication map Jura". The central goal of the present study is to create a spatial database for landslides. The database should contain all fundamental parameters to characterize the mass movements, and should provide secure data storage and data management as well as statistical evaluations. The spatial database was created with PostgreSQL, an object-relational database management system, and PostGIS, a spatial database extender for PostgreSQL, which makes it possible to store spatial and geographic objects and to connect to several GIS applications, such as GRASS GIS, SAGA GIS, QGIS and GDAL, a geospatial library (Obe et al. 2011). Database access for querying, importing, and exporting spatial and non-spatial data is ensured through GUI or non-GUI connections. The database allows the use of procedural languages for writing advanced functions in the R, Python or Perl programming languages. It is possible to work directly with the entire (spatial) data holdings of the database in R. The inventory of the database includes (amongst others
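The storage-and-query pattern of such an inventory can be sketched with a stand-in relational store. The study uses PostgreSQL/PostGIS with real geometries; sqlite3, plain coordinate columns, and the field names below are illustrative only:

```python
# Minimal stand-in for a landslide inventory: attributes plus coordinates
# in a relational table, queryable by any GIS client or analysis script.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE landslide (
    id INTEGER PRIMARY KEY, lon REAL, lat REAL,
    type TEXT, mapped_year INTEGER)""")
db.executemany("INSERT INTO landslide VALUES (?, ?, ?, ?, ?)", [
    (1, 11.02, 49.51, "rotational", 2012),
    (2, 11.10, 49.47, "translational", 2013),
    (3, 11.05, 49.53, "rotational", 2013),
])

# An attribute query of the kind a GIS front-end or R session would issue.
(count,) = db.execute("SELECT COUNT(*) FROM landslide "
                      "WHERE type = 'rotational'").fetchone()
print(count)
```

With PostGIS, the coordinate columns would become a geometry column, and spatial predicates (distance, containment) would join the plain attribute filters shown here.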

  3. Data management and data analysis techniques in pharmacoepidemiological studies using a pre-planned multi-database approach: a systematic literature review.

    Science.gov (United States)

    Bazelier, Marloes T; Eriksson, Irene; de Vries, Frank; Schmidt, Marjanka K; Raitanen, Jani; Haukka, Jari; Starup-Linde, Jakob; De Bruin, Marie L; Andersen, Morten

    2015-09-01

    To identify pharmacoepidemiological multi-database studies and to describe the data management and data analysis techniques used for combining data. Systematic literature searches were conducted in PubMed and Embase, complemented by a manual literature search. We included pharmacoepidemiological multi-database studies published from 2007 onwards that combined data for a pre-planned common analysis or quantitative synthesis. Information was retrieved about study characteristics, methods used for individual-level analyses and meta-analyses, data management, and motivations for performing the study. We found 3083 articles through the systematic searches and an additional 176 through the manual search. After full-text screening of 75 articles, 22 were selected for final inclusion. The number of databases used per study ranged from 2 to 17 (median = 4.0). Most studies used a cohort design (82%) rather than a case-control design (18%). Logistic regression was most often used for individual-level analyses (41%), followed by Cox regression (23%) and Poisson regression (14%). As the meta-analysis method, a majority of the studies combined individual patient data (73%). Six studies performed an aggregate meta-analysis (27%), while a semi-aggregate approach was applied in three studies (14%). Information on central programming or heterogeneity assessment was missing in approximately half of the publications. Most studies were motivated by improving power (86%). Pharmacoepidemiological multi-database studies are a well-powered strategy to address safety issues and have increased in popularity. To interpret the results of these studies correctly, it is important to report systematically on database management and analysis techniques, including central programming and heterogeneity testing. © 2015 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd.
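    The aggregate meta-analysis mentioned above is typically a fixed-effect inverse-variance pooling of per-database estimates, and Cochran's Q is a common heterogeneity statistic. A minimal sketch, with invented numbers, assuming each database contributes a log effect size and its standard error:

```python
import math

# Hypothetical per-database estimates: (log effect size, standard error)
estimates = [
    (0.18, 0.10),  # database A
    (0.25, 0.15),  # database B
    (0.05, 0.08),  # database C
]

# Fixed-effect inverse-variance pooling: weight each estimate by 1/SE^2
weights = [1.0 / se ** 2 for _, se in estimates]
pooled = sum(w * b for (b, _), w in zip(estimates, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# Cochran's Q: a simple test statistic for heterogeneity across databases
q_stat = sum(w * (b - pooled) ** 2 for (b, _), w in zip(estimates, weights))

print(round(pooled, 4), round(pooled_se, 4), round(q_stat, 3))
```

A semi-aggregate approach would instead pool summary tables of individual-level data; the weighting logic stays the same.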

  4. Particle creation during vacuum decay

    International Nuclear Information System (INIS)

    Rubakov, V.A.

    1984-01-01

    The Hamiltonian approach is developed for the problem of particle creation during the tunneling process leading to the decay of the false vacuum in quantum field theory. It is shown that, to the lowest order in (h/2π), particle creation is described by the Euclidean Schroedinger equation in the external field of a bounce. A technique for solving this equation is developed in analogy with the Bogoliubov transformation technique in the theory of particle creation in the presence of classical background fields. The technique is illustrated by two examples, namely particle creation during homogeneous vacuum decay and during the tunneling process leading to the materialization of a thin-wall bubble of the new vacuum inside the metastable one. The curious phenomenon of intensive particle annihilation during vacuum decay is discussed and explicitly illustrated within the former example. The non-unitary extension of the Bogoliubov u, v transformations is described in the appendix. (orig.)
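    For orientation, the standard single-mode form of the Bogoliubov u, v transformation referred to above is (this is textbook material, not a reproduction of the paper's own equations):

```latex
b = u\,a + v\,a^{\dagger},\qquad
b^{\dagger} = u^{*}\,a^{\dagger} + v^{*}\,a,\qquad
|u|^{2} - |v|^{2} = 1,
```

    where $a, a^{\dagger}$ annihilate and create in-particles and $b, b^{\dagger}$ out-particles. The condition $|u|^{2}-|v|^{2}=1$ preserves the canonical commutation relation, and the mean number of out-quanta in the in-vacuum is $\langle 0_{\mathrm{in}}|\,b^{\dagger}b\,|0_{\mathrm{in}}\rangle = |v|^{2}$; the paper's appendix describes a non-unitary extension of this transformation.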

  5. An Innovative Approach to the Integrated Management System Development: SIMPRO-IMS Web Based Environment

    Directory of Open Access Journals (Sweden)

    Kristina Zgodavova

    2012-12-01

    Full Text Available The purpose of this paper is to contribute to learning, knowledge creation and knowledge transfer for building organizational innovability by integrating management systems in the SIMPRO-IMS web-based environment. The paper consists of an interpretation of role-play simulation, a description of the role-play simulation process, the methodology, the employment of role-play simulation outcomes, and a discussion of the knowledge thus obtained. Initially, the model of the SIMPRO-Q education environment was developed and tested over a period of 15 years in several industrial organizations as well as service organizations such as a Higher Education Institution (HEI) and a Healthcare Organization (HCO). The newest version, SIMPRO-IMS, has recently been developed to support the need for integration of management systems and information archiving. With this latest development, the SIMPRO-IMS web-based environment, the processes of five ISO systems are integrated for parallel development, implementation, auditing, maintenance and leadership. SIMPRO-IMS provides management with the apparatus necessary to realize a systematic and verifiable approach to the creation and control of IMS documentation. At the same time, it contributes to the preservation of organizational memory in response to the growing challenges of globalization and digitalization. The research is limited by the complexity of a real system and the possibilities for empirical verification of results. The results achieved can only be verified when people really overcome resistance to change, which can be assessed thoughtfully only after some period of time. Another limitation is the measurability of the real enhancement achieved in the quality, safety and environmental performance of production, and in the business continuity and social responsibility of an organization. Development and progress in the methodology of the SIMPRO-IMS web-based environment is encoded in upgrading the SIMPRO database by processes of the environmental management

  6. Percutaneous Mesocaval Shunt Creation in a Patient with Chronic Portal and Superior Mesenteric Vein Thrombosis

    International Nuclear Information System (INIS)

    Bercu, Zachary L.; Sheth, Sachin B.; Noor, Amir; Lookstein, Robert A.; Fischman, Aaron M.; Nowakowski, F. Scott; Kim, Edward; Patel, Rahul S.

    2015-01-01

    The creation of a transjugular intrahepatic portosystemic shunt (TIPS) is a critical procedure for the treatment of recurrent variceal bleeding and refractory ascites in the setting of portal hypertension. Chronic portal vein thrombosis remains a relative contraindication to conventional TIPS and options are limited in this scenario. Presented is a novel technique for management of refractory ascites in a patient with hepatitis C cirrhosis and chronic portal and superior mesenteric vein thrombosis secondary to schistosomiasis and lupus anticoagulant utilizing fluoroscopically guided percutaneous mesocaval shunt creation.

  7. Percutaneous Mesocaval Shunt Creation in a Patient with Chronic Portal and Superior Mesenteric Vein Thrombosis

    Energy Technology Data Exchange (ETDEWEB)

    Bercu, Zachary L., E-mail: zachary.bercu@mountsinai.org; Sheth, Sachin B., E-mail: sachinsheth@gmail.com [Icahn School of Medicine at Mount Sinai, Division of Interventional Radiology (United States); Noor, Amir, E-mail: amir.noor@gmail.com [The George Washington University School of Medicine and Health Sciences (United States); Lookstein, Robert A., E-mail: robert.lookstein@mountsinai.org; Fischman, Aaron M., E-mail: aaron.fischman@mountsinai.org; Nowakowski, F. Scott, E-mail: scott.nowakowski@mountsinai.org; Kim, Edward, E-mail: edward.kim@mountsinai.org; Patel, Rahul S., E-mail: rahul.patel@mountsinai.org [Icahn School of Medicine at Mount Sinai, Division of Interventional Radiology (United States)

    2015-10-15

    The creation of a transjugular intrahepatic portosystemic shunt (TIPS) is a critical procedure for the treatment of recurrent variceal bleeding and refractory ascites in the setting of portal hypertension. Chronic portal vein thrombosis remains a relative contraindication to conventional TIPS and options are limited in this scenario. Presented is a novel technique for management of refractory ascites in a patient with hepatitis C cirrhosis and chronic portal and superior mesenteric vein thrombosis secondary to schistosomiasis and lupus anticoagulant utilizing fluoroscopically guided percutaneous mesocaval shunt creation.

  8. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    Science.gov (United States)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for the management and processing of such datasets: binary large objects (BLOBs) in database systems versus files in the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as for bandwidth and response-time performance. This requires partitioning larger files into sets of smaller files, and is accompanied by the concomitant requirement of managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, such an implementation creates storage overheads and constraints as well as software licensing dependencies. Another approach is to store the files in an external filesystem, with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available
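    The two storage strategies compared above can be sketched side by side. This illustrative snippet uses SQLite rather than a shared-nothing parallel DBMS or HDFS, but it shows the structural difference: storing a data chunk as a BLOB inside the database versus storing only a pointer (path) to an external file; all names are hypothetical.

```python
import os
import sqlite3
import tempfile

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tiles_blob (tile_id INTEGER PRIMARY KEY, data BLOB)")
conn.execute("CREATE TABLE tiles_ptr  (tile_id INTEGER PRIMARY KEY, path TEXT)")

chunk = bytes(range(256))  # stand-in for one partitioned sub-file of a large dataset

# Strategy 1: the DBMS manages the bytes itself (BLOB column)
conn.execute("INSERT INTO tiles_blob VALUES (?, ?)", (1, chunk))

# Strategy 2: the filesystem (UNIX, parallel FS, or HDFS) holds the bytes;
# the database keeps only a pointer to them
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "tile_1.bin")
with open(path, "wb") as f:
    f.write(chunk)
conn.execute("INSERT INTO tiles_ptr VALUES (?, ?)", (1, path))

# Both routes recover identical bytes; they differ in who manages the storage.
blob_bytes = conn.execute("SELECT data FROM tiles_blob WHERE tile_id=1").fetchone()[0]
ptr_path = conn.execute("SELECT path FROM tiles_ptr WHERE tile_id=1").fetchone()[0]
with open(ptr_path, "rb") as f:
    file_bytes = f.read()
print(blob_bytes == chunk == file_bytes)
```

The trade-off in the talk sits on top of this choice: UDFs run where the BLOBs live, while the pointer approach delegates parallel processing to MapReduce jobs over the files.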

  9. Job Creation and Job Types

    DEFF Research Database (Denmark)

    Kuhn, Johan Moritz; Malchow-Møller, Nikolaj; Sørensen, Anders

    2016-01-01

    We extend earlier analyses of the job creation of start-ups versus established firms by considering the educational content of the jobs created and destroyed. We define education-specific measures of job creation and job destruction at the firm level, and we use these measures to construct a meas...

  10. Job Creation and Job Types

    DEFF Research Database (Denmark)

    Kuhn, Johan M.; Malchow-Møller, Nikolaj; Sørensen, Anders

    We extend earlier analyses of the job creation of start-ups vs. established firms by taking into consideration the educational content of the jobs created and destroyed. We define educationspecific measures of job creation and job destruction at the firm level, and we use these to construct a mea...

  11. Money Creation in a Random Matching Model

    OpenAIRE

    Alexei Deviatov

    2006-01-01

    I study money creation in versions of the Trejos-Wright (1995) and Shi (1995) models with indivisible money and individual holdings bounded at two units. I work with the same class of policies as in Deviatov and Wallace (2001), who study money creation in that model. However, I consider an alternative notion of implementability: the ex ante pairwise core. I compute a set of numerical examples to determine whether money creation is beneficial. I find beneficial effects of money creation if indiv...

  12. DESIGN AND CONSTRUCTION OF A FOREST SPATIAL DATABASE: AN APPLICATION

    Directory of Open Access Journals (Sweden)

    Turan Sönmez

    2006-11-01

    Full Text Available The General Directorate of Forests (GDF) has not yet created the spatial forest database needed to manage forests and catch up with the developed countries in forestry. The lack of a spatial forest database results in redundant collection of spatial data and communication problems among forestry organizations; it also causes Turkish forestry to lag behind the informatics era. To solve these problems, GDF should establish a spatial forest database supported by a Geographic Information System (GIS). Designing a GIS-supported spatial database that provides accurate, timely and current data/information for decision makers and operators in forestry, and developing a sample interface program to apply and monitor classical forest management plans, are paramount in the contemporary forest management planning process. This research is composed of three major stages: (i) a prototype spatial database design considering the requirements of the three hierarchical organizations of GDF (regional directorate of forests, forest enterprise, and territorial division); (ii) a user interface program developed to apply and monitor classical management plans based on the designed database; (iii) the implementation of the designed database and its user interface in the Artvin Central Planning Unit.

  13. The state of the art of medical imaging technology: from creation to archive and back.

    Science.gov (United States)

    Gao, Xiaohong W; Qian, Yu; Hui, Rui

    2011-01-01

    Medical imaging has lent itself well to modern medicine and has revolutionized the medical industry over the last 30 years. Stemming from the discovery of X-rays by Nobel laureate Wilhelm Roentgen, radiology was born, leading to the creation of large quantities of digital images, as opposed to film-based media. While this rich supply of images provides immeasurable information that would otherwise not be possible to obtain, medical images pose great challenges: they must be archived safe from corruption, loss and misuse; be retrievable from databases of huge sizes with varying forms of metadata; and remain reusable when new tools for data mining and new media for data storage become available. This paper provides a summative account of the creation of medical imaging tomography, the development of image archiving systems and the innovation from existing acquired image data pools. The focus of this paper is on content-based image retrieval (CBIR), in particular for 3D images, which is exemplified by our online e-learning system, MIRAGE, home to a repository of medical images from a variety of domains and of different dimensions. In terms of novelties, facilities for CBIR of 3D images, coupled with fully automatic image annotation, have been developed and implemented in the system, resonating with future versatile, flexible and sustainable medical image databases that can reap new innovations.
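    At its core, content-based image retrieval matches a query image's feature vector against those of the archived images. The toy sketch below is an illustrative reduction, not the MIRAGE implementation: the feature vectors and image names are invented, and real CBIR systems use far richer descriptors (texture, shape, volumetric features for 3D images) than the short vectors shown here.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical feature vectors (e.g. intensity histograms) for archived images
archive = {
    "ct_head_001": [0.9, 0.1, 0.0, 0.3],
    "mri_knee_007": [0.1, 0.8, 0.5, 0.0],
    "ct_head_042": [0.8, 0.2, 0.1, 0.4],
}

query = [0.85, 0.15, 0.05, 0.35]  # features extracted from the query image

# Rank archived images by decreasing similarity to the query
ranked = sorted(archive,
                key=lambda k: cosine_similarity(query, archive[k]),
                reverse=True)
print(ranked)
```

The retrieval step is the same regardless of descriptor: extract features once at archive time, then answer queries by nearest-neighbour search over the feature index.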

  14. Exploiting relational database technology in a GIS

    Science.gov (United States)

    Batty, Peter

    1992-05-01

    All systems for managing data face common problems such as backup, recovery, auditing, security, data integrity, and concurrent update. Other challenges include the ability to share data easily between applications and to distribute data across several computers, while continuing to manage the problems already mentioned. Geographic information systems are no exception and need to tackle all these issues. Standard relational database-management systems (RDBMSs) provide many features to help solve them. This paper describes how the IBM geoManager product approaches these issues by storing all its geographic data in a standard RDBMS in order to take advantage of such features. Areas in which standard RDBMS functions need to be extended are highlighted, and the way in which geoManager does this is explained. The performance implications of storing all data in the relational database are discussed. An important distinction, which needs to be made when considering the applicability of relational database technology to GIS, is drawn between the storage and management of geographic data and the manipulation and analysis of geographic data.
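    One of the RDBMS features the paper leans on, transactional integrity under concurrent update, can be shown in a few lines. The example below uses Python's built-in SQLite for portability (the original work used a full RDBMS), and the table is hypothetical: an edit session fails partway through, and rollback restores the last consistent state.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parcels (id INTEGER PRIMARY KEY, owner TEXT)")
conn.execute("INSERT INTO parcels VALUES (1, 'Municipality')")
conn.commit()

try:
    # An editing session modifies a record...
    conn.execute("UPDATE parcels SET owner = 'Developer Ltd' WHERE id = 1")
    # ...but an error occurs before the session can commit
    raise RuntimeError("digitising error detected mid-edit")
except RuntimeError:
    conn.rollback()  # the RDBMS discards the uncommitted change

owner = conn.execute("SELECT owner FROM parcels WHERE id = 1").fetchone()[0]
print(owner)
```

A GIS that stores its geographic data in the RDBMS inherits this behaviour for free, which is exactly the argument made above for backup, recovery and concurrent update.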

  15. Investigation the impact of outsourcing on competitive advantages' creation by considering Porter's model; Case study: Zamyad Company

    Directory of Open Access Journals (Sweden)

    Ahmad Reza Kasrai

    2012-08-01

    Full Text Available Competitive advantage is an important factor in boosting companies' success and has received increasing emphasis in the management and strategic marketing literature in recent years. There are many different ideas about the factors effective in the creation of competitive advantages. The fast rate of change in business is also forcing CEOs to utilize strategies that have the best impact on current organizational circumstances and on the future direction of the organization's business. Outsourcing is one of the strategies most widely utilized by CEOs in different organizations; many managers believe that outsourcing is the only way to preserve organizational balance in the 21st century. Based on Porter's competitive advantage model, there are three strategies that lead a company to competitive advantage: cost leadership, differentiation and segmentation. In this article, we investigate the effects of outsourcing on the creation of competitive advantages through Porter's model in an automotive factory in Iran. We designed a questionnaire to gather the necessary information about the role of outsourcing in the creation of the different strategies as competitive advantages from the managers' point of view. We analyzed the questionnaires and implemented a goodness-of-fit test to determine the distribution of the data and the appropriate statistical method. Preliminary results showed that nonparametric statistical methods could be utilized for testing our hypotheses. We used a Wilcoxon test to consider the null hypothesis and a Friedman test to estimate the ranking of means. Our findings verify an undeniable effect of outsourcing on the creation of competitive advantage, and the ranking list is presented.

  16. Correspondence of Concept Hierarchies in Semantic Web Based upon Global Instances and its Application to Facility Management Database

    Science.gov (United States)

    Takahashi, Hiroki; Nishi, Yuusuke; Gion, Tomohiro; Minami, Shinichi; Fukunaga, Tatsuya; Ogata, Jiro; Yoshie, Osamu

    The Semantic Web is a technology which determines the relevance of data over the Web using metadata and which enables advanced search of global information. It is now desirable to develop and apply this technology to many situations in facility management. In facility management, vocabulary should be unified in order to share facility databases, for example for generating optimal maintenance schedules. Under such circumstances, ontology databases are usually used to describe the composition or hierarchy of facility parts. However, the vocabularies used in these databases are not unified even between factories of the same company, and this situation causes communication hazards between them. Moreover, concepts involved in different hierarchies cannot be matched to each other. Some methods exist for matching concepts across different hierarchies, but they have defects because they attend only to the target hierarchies themselves and the number of instances. We propose an improved method for matching concepts between different concept hierarchies, which uses other hierarchies from all over the Web and the distance between instances to identify their relations. Our method can work even if the sets of instances belonging to the concepts are not identical.
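    The basic idea of matching concepts by their instances can be sketched with set overlap. This is an illustrative reduction, not the authors' algorithm: it scores candidate concept pairs from two hypothetical facility hierarchies by the Jaccard similarity of their instance sets, whereas the method above additionally draws on hierarchies across the Web and on instance distances.

```python
def jaccard(a, b):
    """Jaccard similarity of two instance sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical instance sets for concepts from two facility hierarchies
hierarchy_a = {
    "Pump": {"P-101", "P-102", "P-203"},
    "Valve": {"V-11", "V-12"},
}
hierarchy_b = {
    "CentrifugalPump": {"P-101", "P-102"},
    "GateValve": {"V-11", "V-12", "V-13"},
}

# For each concept in hierarchy A, pick the best-matching concept in hierarchy B
matches = {
    concept_a: max(hierarchy_b,
                   key=lambda concept_b: jaccard(instances_a, hierarchy_b[concept_b]))
    for concept_a, instances_a in hierarchy_a.items()
}
print(matches)
```

Note the matched pairs share instances without sharing identical instance sets, which is precisely the situation the proposed method is designed to handle.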

  17. Components of Co-creation

    DEFF Research Database (Denmark)

    Tanev, Stoyan

    2009-01-01

    , such an approach misses the advantages of an empirically driven quantitative approach that benefits from larger size samples and is more appropriate for theory building through the development and testing of hypotheses. It is important, therefore, to seek the development of a research methodology that combines...... the benefits of both qualitative and quantitative research approaches for studying the nature of value co-creation. The article provides a first attempt to identify the main research steps of such a methodology. It provides some preliminary results on the key components of value co-creation between firms...... the inner logic of the value co-creation phenomenon as well as the nature of the results reported in this article. The specific nature of the results was found to be suitable for the application of small-N techniques such as the Qualitative Comparative Analysis (QCA) technique which combines the advantages...

  18. Engineering-Geological Data Model - The First Step to Build National Polish Standard for Multilevel Information Management

    Science.gov (United States)

    Ryżyński, Grzegorz; Nałęcz, Tomasz

    2016-10-01

    The efficient management of geological data in Poland is necessary to support multilevel decision processes of government and local authorities in spatial planning, mineral resources and groundwater supply, and the rational use of the subsurface. The vast amount of geological information gathered in the digital archives and databases of the Polish Geological Survey (PGS) is a basic resource for multi-scale national subsurface management. Data integration is the key factor enabling the development of GIS and web tools for decision makers; however, the main barrier to efficient geological information management is the heterogeneity of data in the resources of the Polish Geological Survey. The engineering-geological database is the first PGS thematic domain addressed in the overall data integration plan. The solutions developed in this area will facilitate the creation of procedures and standards for multilevel data management in PGS. Twenty years of experience in delivering digital engineering-geological mapping at 1:10 000 scale, together with the acquisition and digitisation of archival geotechnical reports, allowed the gathering of a database of more than 300 thousand engineering-geological boreholes as well as a set of 10 thematic spatial layers (including a foundation conditions map, depth to the first groundwater level, bedrock level, and geohazards). Historically, a desktop approach was the source form of engineering-geological data storage, resulting in multiple non-correlated datasets. The need for a domain data model therefore emerged, and an object-oriented modelling (UML) scheme has been developed. The aim of this development was to merge all datasets on one centralised Oracle server and to prepare a unified spatial data structure for efficient web presentation and application development. The presented approach will be a milestone toward the creation of a Polish national standard for engineering-geological information management. The paper presents the approach and methodology

  19. Exploring value creation from corporate-foresight activities

    DEFF Research Database (Denmark)

    Rohrbeck, René

    2012-01-01

    This paper looks at value creation from corporate futures research. Through a literature review, potential value creation is identified. This serves as guidance for an empirical investigation in which value creation is observed and linked to methods and practices. Using data from 20 case studies......, three examples of value creation are discussed in detail. In addition, cross-case analysis allowed me to identify four success criteria for corporate foresight activities: (1) foresighters committed to creating value, (2) participation of internal stakeholders, (3) analysis that follows a systemic logic...

  20. Applying Stochastic Metaheuristics to the Problem of Data Management in a Multi-Tenant Database Cluster

    Directory of Open Access Journals (Sweden)

    E. A. Boytsov

    2014-01-01

    Full Text Available A multi-tenant database cluster is a concept for the data-storage subsystem of cloud applications with a multi-tenant architecture. The cluster is a set of relational database servers with a single entry point, combined into one unit by a cluster controller. The system is intended for applications developed according to the Software as a Service (SaaS) paradigm and allows tenants to be placed on database servers in a way that provides their isolation, data backup and the most effective use of the available computational power. One of the most important problems with such a system is the effective distribution of data across servers, which affects the load on individual cluster nodes and fault tolerance. This paper considers a data-management approach based on the use of a load-balancing quality measure function. This function is used during the initial placement of new tenants and also during placement optimization steps. Standard metaheuristic optimization schemes such as simulated annealing and tabu search are used to find better tenant placements.
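    A minimal sketch of the simulated-annealing idea applied to tenant placement follows. The cost function (load of the most heavily loaded server), the tenant loads and the annealing schedule are all invented for illustration; the paper's actual quality measure function is more elaborate.

```python
import math
import random

SERVERS = 3
tenants = [9, 7, 6, 5, 4, 3, 2, 2]  # hypothetical per-tenant load units

def cost(placement):
    """Load-balancing quality: the load of the most heavily loaded server."""
    loads = [0] * SERVERS
    for load, server in zip(tenants, placement):
        loads[server] += load
    return max(loads)

random.seed(42)  # reproducible run
placement = [random.randrange(SERVERS) for _ in tenants]
best = placement[:]
initial_cost = cost(placement)

temperature = 10.0
while temperature > 0.01:
    # Propose moving one random tenant to a random server
    i = random.randrange(len(tenants))
    candidate = placement[:]
    candidate[i] = random.randrange(SERVERS)
    delta = cost(candidate) - cost(placement)
    # Always accept improvements; accept worsenings with Boltzmann probability
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        placement = candidate
        if cost(placement) < cost(best):
            best = placement[:]
    temperature *= 0.95

print("initial:", initial_cost, "best found:", cost(best))
```

Tabu search would replace the temperature-driven acceptance rule with a short-term memory of forbidden moves; the placement representation and cost function stay the same.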